FoamoftheSea committed on
Commit d1cbd55 · 1 Parent(s): cf972e9

Update README.md

Files changed (1)
  1. README.md +33 -0
README.md CHANGED
@@ -1,3 +1,36 @@
---
license: apache-2.0
---
![Multiformer inference frame](6c7c-ff13_frame45_multitask.png)

# Multiformer

Multiformer is a multi-task vision transformer designed to provide strong perception capabilities with a nimble and lightweight architecture.

- [Publication](https://natecibik.medium.com/multiformer-51b81df826b7)
- [Training Report](https://api.wandb.ai/links/indezera/fynqkt8r)
- [GitHub](https://github.com/FoamoftheSea/shift-experiments)

This model uses a custom branch of the transformers library, which can be installed easily using the instructions below.

Training and evaluation use a custom MultitaskTrainer class that handles complex nested losses and logs them to wandb.

Both a [training/eval](https://github.com/FoamoftheSea/shift-experiments/blob/main/scripts/model_train_eval/train_multiformer.py) script and an [inference](https://github.com/FoamoftheSea/shift-experiments/blob/main/scripts/inference/multiformer_inference.py) script are available in the project repository.

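The MultitaskTrainer itself lives in the project repository linked above; the snippet below is only a rough sketch of the pattern just described, assuming a model that returns its per-task losses as a nested dict. The class name, the `losses` output key, and the loss structure are all illustrative, not the actual implementation.

```python
# Illustrative sketch only -- NOT the repository's MultitaskTrainer.
# Assumes the model returns a (possibly nested) dict of per-task losses under "losses".
from transformers import Trainer


class MultitaskTrainerSketch(Trainer):
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        outputs = model(**inputs)
        nested_losses = outputs["losses"]  # e.g. {"semseg": ..., "det": {"cls": ..., "bbox": ...}}

        # Flatten the nested dict so each loss component can be logged individually.
        flat_losses = {}

        def _flatten(prefix, losses):
            for name, value in losses.items():
                if isinstance(value, dict):
                    _flatten(f"{prefix}{name}/", value)
                else:
                    flat_losses[f"{prefix}{name}"] = value

        _flatten("loss/", nested_losses)

        total_loss = sum(flat_losses.values())
        # Trainer.log forwards metrics to the configured reporters (e.g. wandb).
        self.log({name: value.detach().item() for name, value in flat_losses.items()})
        return (total_loss, outputs) if return_outputs else total_loss
```

The real class is the one used by the training/eval script linked above.
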
## Setup Instructions

1. Open a terminal and navigate to your root folder, then run:
   ```shell
   git clone https://github.com/FoamoftheSea/shift-experiments.git
   ```
2. Follow the setup instructions for your operating system found in the [README](https://github.com/FoamoftheSea/shift-experiments/blob/main/README.md).

## Quick Load Multiformer-M1

You should now be able to run the following code to load a Multiformer-M1 with pretrained weights:

```python
from transformers import AutoModel

# Load the pretrained Multiformer-M1 weights from the Hugging Face Hub
multiformer = AutoModel.from_pretrained("FoamoftheSea/multiformer-m1")
```
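
As a quick sanity check after loading, the model can be called on a dummy batch. Note that the `pixel_values` keyword and the image size below are illustrative assumptions rather than documented values; for real inputs, see the inference script linked above.

```python
import torch

# Illustrative only: the input key and resolution are assumptions, not documented values.
dummy_images = torch.randn(1, 3, 800, 1280)  # (batch, channels, height, width)

multiformer.eval()
with torch.no_grad():
    outputs = multiformer(pixel_values=dummy_images)

print(type(outputs))
```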