timm / Image Classification · PyTorch · Safetensors

Commit 337f217 (parent: b152001) by rwightman

Update model config and README

Files changed (1): README.md (+5 -3)
README.md CHANGED
@@ -36,8 +36,8 @@ Recipe details:
 - **Model Type:** Image classification / feature backbone
 - **Model Stats:**
   - Params (M): 60.4
-  - GMACs: 16.3
-  - Activations (M): 27.8
+  - GMACs: 15.3
+  - Activations (M): 17.9
   - Image size: 256 x 256
 - **Papers:**
   - Vision Transformers Need Registers: https://arxiv.org/abs/2309.16588
@@ -137,7 +137,9 @@ output = model.forward_head(output, pre_logits=True)
 | model | top1 | top5 | param_count | img_size |
 | -------------------------------------------------- | ------ | ------ | ----------- | -------- |
 | [vit_mediumd_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_mediumd_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 86.202 | 97.874 | 64.11 | 256 |
-| [vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 85.418 | 97.48 | 60.4 | 256 |
+| [vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 85.418 | 97.480 | 60.4 | 256 |
+| [vit_medium_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_medium_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 84.930 | 97.386 | 38.88 | 256 |
+| [vit_little_patch16_reg1_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_little_patch16_reg1_gap_256.sbb_in12k_ft_in1k) | 83.774 | 96.972 | 22.52 | 256 |
 | [vit_mediumd_patch16_rope_reg1_gap_256.sbb_in1k](https://huggingface.co/timm/vit_mediumd_patch16_rope_reg1_gap_256.sbb_in1k) | 84.322 | 96.812 | 63.95 | 256 |
 | [vit_betwixt_patch16_rope_reg4_gap_256.sbb_in1k](https://huggingface.co/timm/vit_betwixt_patch16_rope_reg4_gap_256.sbb_in1k) | 83.906 | 96.684 | 60.23 | 256 |
 | [vit_base_patch16_rope_reg1_gap_256.sbb_in1k](https://huggingface.co/timm/vit_base_patch16_rope_reg1_gap_256.sbb_in1k) | 83.866 | 96.67 | 86.43 | 256 |
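The substantive change in this commit is the corrected compute stats (GMACs from 16.3 to 15.3, activations from 27.8 M to 17.9 M). Below is a minimal sketch of how such figures could be re-derived, assuming timm and fvcore are installed and using the vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k model named in the table (its 60.4 M params match the stats block above). The model card does not state which profiler produced its numbers, and fvcore uses its own MAC/activation conventions, so expect results close to, but not necessarily identical with, the listed values.

```python
# Sketch only: re-derive the "Model Stats" figures with fvcore's counters.
# Assumptions: timm >= 1.0 (this model name is available) and fvcore installed;
# the actual tool used for the README's numbers is not specified in the card.
import torch
import timm
from fvcore.nn import ActivationCountAnalysis, FlopCountAnalysis

model = timm.create_model(
    "vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k",
    pretrained=False,  # weights are not needed just to count params/MACs
).eval()

params_m = sum(p.numel() for p in model.parameters()) / 1e6
print(f"Params (M): {params_m:.1f}")  # README lists 60.4

x = torch.randn(1, 3, 256, 256)  # image size from the stats block
with torch.no_grad():
    gmacs = FlopCountAnalysis(model, x).total() / 1e9
    acts_m = ActivationCountAnalysis(model, x).total() / 1e6
print(f"GMACs (approx.): {gmacs:.1f}")            # README now lists 15.3
print(f"Activations (M, approx.): {acts_m:.1f}")  # README now lists 17.9
```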