---
license: apache-2.0
---
![Multiformer inference frame](6c7c-ff13_frame45_multitask.png)

# Multiformer

Multiformer is a lightweight multi-task vision transformer architecture designed to deliver strong perception capabilities at low computational cost.

- [Publication](https://natecibik.medium.com/multiformer-51b81df826b7) 
- [Training Report](https://api.wandb.ai/links/indezera/fynqkt8r)
- [GitHub](https://github.com/FoamoftheSea/shift-experiments)
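As a multi-task model, Multiformer optimizes several objectives at once. A common way to reduce them to a single training loss is a weighted sum of the per-task losses, as in this minimal sketch (the task names and weights are illustrative, not taken from the repository):

```python
# Illustrative multi-task loss aggregation. The task names and weights
# below are hypothetical, not the actual Multiformer configuration.
def combine_task_losses(task_losses, task_weights):
    """Reduce per-task losses to one scalar via a weighted sum."""
    return sum(task_weights[name] * loss for name, loss in task_losses.items())

losses = {"semseg": 0.8, "depth": 0.5, "det": 1.2}
weights = {"semseg": 1.0, "depth": 0.5, "det": 1.0}
total = combine_task_losses(losses, weights)  # 0.8 + 0.25 + 1.2 = 2.25
```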

This model uses a custom branch of the transformers library, which can be installed by following the instructions below.

Training and evaluation use a custom MultitaskTrainer class that handles the model's nested task losses and logs them to wandb.
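Logging nested losses generally requires flattening them into scalar keys first. The sketch below shows the general shape of that step; the loss structure and key names are hypothetical, not the actual Multiformer losses:

```python
# Flatten a nested loss dictionary into flat "task/subloss" keys, the
# form a logger like wandb expects. The structure here is hypothetical,
# not taken from the Multiformer repository.
def flatten_losses(losses, prefix=""):
    flat = {}
    for key, value in losses.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            # Recurse into nested dicts, extending the key path.
            flat.update(flatten_losses(value, prefix=f"{name}/"))
        else:
            flat[name] = value
    return flat

nested = {"det": {"cls": 0.4, "box": 0.2}, "semseg": 0.8}
flat = flatten_losses(nested)
# {'det/cls': 0.4, 'det/box': 0.2, 'semseg': 0.8}
```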

Both a [training/eval](https://github.com/FoamoftheSea/shift-experiments/blob/main/scripts/model_train_eval/train_multiformer.py) script and an [inference](https://github.com/FoamoftheSea/shift-experiments/blob/main/scripts/inference/multiformer_inference.py) script are available in the project repository.

## Setup Instructions

1. Open a terminal, navigate to the directory where you want the repository to live, and run
```shell
git clone https://github.com/FoamoftheSea/shift-experiments.git
```
2. Follow the setup instructions for your operating system in the [README](https://github.com/FoamoftheSea/shift-experiments/blob/main/README.md).

## Quick Model Import

You should now be able to run the following code to load a Multiformer-M0 with pretrained weights:

```python
from transformers import AutoModel

multiformer = AutoModel.from_pretrained("FoamoftheSea/multiformer-m0")
```