---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: birds_transform_full
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: validation
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7303427419354839
---

# birds_transform_full

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset. It achieves the following results on the evaluation set:

- Accuracy: 0.7303
- Loss: 1.4588
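
The base checkpoint name encodes the input geometry: ViT-Base with 16×16-pixel patches over 224×224 images. A quick sanity check of the token sequence length the encoder sees:

```python
# Geometry implied by "vit-base-patch16-224": 224x224 input, 16x16 patches.
image_size = 224
patch_size = 16
patches_per_side = image_size // patch_size   # 14 patches along each axis
num_patches = patches_per_side ** 2           # 196 patch tokens per image
seq_len = num_patches + 1                     # plus the [CLS] token -> 197
print(seq_len)  # 197
```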

## Model description

More information needed

## Intended uses & limitations

More information needed
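
Usage is not documented yet. As a hedged sketch: a checkpoint like this is normally queried through the `transformers` image-classification pipeline (the Hub repo id in the comment below is an assumption, not confirmed by this card), and the classifier's raw logits are reduced to label probabilities with a softmax. The post-processing step as a small pure function:

```python
import math

def top_k(logits, labels, k=3):
    """Softmax over raw classifier logits, return the k most likely labels."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # subtract max for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(labels, probs), key=lambda t: t[1], reverse=True)
    return ranked[:k]

# With the transformers library this corresponds to (not run here;
# the repo id is a guess based on the checkpoint name):
#   from transformers import pipeline
#   clf = pipeline("image-classification",
#                  model="arslanafzal/bird_transformer_100_0.73")
#   clf("bird.jpg", top_k=3)
```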

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
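
With a linear scheduler and 1984 optimizer steps per epoch (from the results table), the learning rate decays from 2e-05 toward 0 over the 198,400 total steps. A minimal sketch of that schedule, assuming zero warmup steps since none are listed:

```python
def linear_lr(step, base_lr=2e-05, total_steps=198_400, warmup_steps=0):
    """Linear schedule in the style of transformers'
    get_linear_schedule_with_warmup: ramp up over warmup_steps,
    then decay linearly to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))        # full base rate at the start
print(linear_lr(99_200))   # half the base rate at epoch 50
print(linear_lr(198_400))  # decayed to 0.0 at epoch 100
```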

### Training results

| Training Loss | Epoch | Step   | Accuracy | Validation Loss |
|:-------------:|:-----:|:------:|:--------:|:---------------:|
| 5.6427        | 1.0   | 1984   | 0.4519   | 5.2504          |
| 4.6563        | 2.0   | 3968   | 0.5068   | 4.2749          |
| 3.6656        | 3.0   | 5952   | 0.5454   | 3.3311          |
| 2.7653        | 4.0   | 7936   | 0.5748   | 2.5181          |
| 2.0465        | 5.0   | 9920   | 0.6300   | 1.9205          |
| 1.5876        | 6.0   | 11904  | 0.6593   | 1.5696          |
| 1.3174        | 7.0   | 13888  | 0.6870   | 1.3831          |
| 1.1279        | 8.0   | 15872  | 0.7064   | 1.2516          |
| 1.0051        | 9.0   | 17856  | 0.7067   | 1.1999          |
| 0.9318        | 10.0  | 19840  | 0.7077   | 1.1631          |
| 0.8294        | 11.0  | 21824  | 0.7089   | 1.1444          |
| 0.7976        | 12.0  | 23808  | 0.7175   | 1.1156          |
| 0.7084        | 13.0  | 25792  | 0.7218   | 1.1209          |
| 0.6752        | 14.0  | 27776  | 0.7198   | 1.1032          |
| 0.6641        | 15.0  | 29760  | 0.7198   | 1.1192          |
| 0.6083        | 16.0  | 31744  | 0.7268   | 1.1044          |
| 0.5703        | 17.0  | 33728  | 0.7248   | 1.1287          |
| 0.5376        | 18.0  | 35712  | 0.7286   | 1.1115          |
| 0.5073        | 19.0  | 37696  | 0.7218   | 1.1429          |
| 0.5072        | 20.0  | 39680  | 0.7208   | 1.1519          |
| 0.4945        | 21.0  | 41664  | 0.7228   | 1.1636          |
| 0.4651        | 22.0  | 43648  | 0.7213   | 1.1771          |
| 0.4408        | 23.0  | 45632  | 0.7233   | 1.1650          |
| 0.4222        | 24.0  | 47616  | 0.7157   | 1.1841          |
| 0.409         | 25.0  | 49600  | 0.7145   | 1.2150          |
| 0.403         | 26.0  | 51584  | 0.7152   | 1.2203          |
| 0.3813        | 27.0  | 53568  | 0.7238   | 1.2064          |
| 0.3756        | 28.0  | 55552  | 0.7177   | 1.2526          |
| 0.365         | 29.0  | 57536  | 0.7208   | 1.2670          |
| 0.3729        | 30.0  | 59520  | 0.7180   | 1.2659          |
| 0.36          | 31.0  | 61504  | 0.7127   | 1.2545          |
| 0.3596        | 32.0  | 63488  | 0.7182   | 1.2728          |
| 0.3606        | 33.0  | 65472  | 0.7180   | 1.2886          |
| 0.325         | 34.0  | 67456  | 0.7157   | 1.2929          |
| 0.329         | 35.0  | 69440  | 0.7205   | 1.3074          |
| 0.3431        | 36.0  | 71424  | 0.7185   | 1.3122          |
| 0.3206        | 37.0  | 73408  | 0.7233   | 1.2993          |
| 0.3137        | 38.0  | 75392  | 0.7220   | 1.3206          |
| 0.3265        | 39.0  | 77376  | 0.7180   | 1.3246          |
| 0.3332        | 40.0  | 79360  | 0.7240   | 1.3163          |
| 0.3193        | 41.0  | 81344  | 0.7288   | 1.3259          |
| 0.3242        | 42.0  | 83328  | 0.7215   | 1.3320          |
| 0.2976        | 43.0  | 85312  | 0.7213   | 1.3283          |
| 0.3191        | 44.0  | 87296  | 0.7195   | 1.3453          |
| 0.3067        | 45.0  | 89280  | 0.7243   | 1.3550          |
| 0.2994        | 46.0  | 91264  | 0.7240   | 1.3324          |
| 0.3072        | 47.0  | 93248  | 0.7263   | 1.3412          |
| 0.2932        | 48.0  | 95232  | 0.7245   | 1.3345          |
| 0.2919        | 49.0  | 97216  | 0.7266   | 1.3759          |
| 0.2922        | 50.0  | 99200  | 0.7225   | 1.3873          |
| 0.304         | 51.0  | 101184 | 0.7235   | 1.3631          |
| 0.2898        | 52.0  | 103168 | 0.7205   | 1.3819          |
| 0.2773        | 53.0  | 105152 | 0.7251   | 1.3827          |
| 0.2756        | 54.0  | 107136 | 0.7228   | 1.3770          |
| 0.2789        | 55.0  | 109120 | 0.7248   | 1.3822          |
| 0.261         | 56.0  | 111104 | 0.7263   | 1.3878          |
| 0.2593        | 57.0  | 113088 | 0.7240   | 1.3955          |
| 0.2801        | 58.0  | 115072 | 0.7256   | 1.3659          |
| 0.2632        | 59.0  | 117056 | 0.7301   | 1.3719          |
| 0.2811        | 60.0  | 119040 | 0.7321   | 1.3775          |
| 0.2267        | 61.0  | 121024 | 0.7256   | 1.3689          |
| 0.2676        | 62.0  | 123008 | 0.7245   | 1.4069          |
| 0.2523        | 63.0  | 124992 | 0.7230   | 1.4166          |
| 0.2622        | 64.0  | 126976 | 0.7296   | 1.4018          |
| 0.2467        | 65.0  | 128960 | 0.7256   | 1.4287          |
| 0.2504        | 66.0  | 130944 | 0.7314   | 1.4019          |
| 0.2468        | 67.0  | 132928 | 0.7303   | 1.4058          |
| 0.2098        | 68.0  | 134912 | 0.7308   | 1.4093          |
| 0.2382        | 69.0  | 136896 | 0.7293   | 1.4206          |
| 0.2304        | 70.0  | 138880 | 0.7301   | 1.4078          |
| 0.251         | 71.0  | 140864 | 0.7251   | 1.4275          |
| 0.237         | 72.0  | 142848 | 0.7288   | 1.4283          |
| 0.2485        | 73.0  | 144832 | 0.7281   | 1.4338          |
| 0.2229        | 74.0  | 146816 | 0.7253   | 1.4386          |
| 0.2472        | 75.0  | 148800 | 0.7210   | 1.4440          |
| 0.2149        | 76.0  | 150784 | 0.7230   | 1.4319          |
| 0.2337        | 77.0  | 152768 | 0.7261   | 1.4422          |
| 0.2063        | 78.0  | 154752 | 0.7268   | 1.4456          |
| 0.216         | 79.0  | 156736 | 0.7218   | 1.4426          |
| 0.2249        | 80.0  | 158720 | 0.7198   | 1.4533          |
| 0.2148        | 81.0  | 160704 | 0.7230   | 1.4480          |
| 0.2321        | 82.0  | 162688 | 0.7273   | 1.4416          |
| 0.2306        | 83.0  | 164672 | 0.7286   | 1.4392          |
| 0.213         | 84.0  | 166656 | 0.7263   | 1.4609          |
| 0.2202        | 85.0  | 168640 | 0.7266   | 1.4590          |
| 0.206         | 86.0  | 170624 | 0.7245   | 1.4638          |
| 0.1987        | 87.0  | 172608 | 0.7251   | 1.4626          |
| 0.2181        | 88.0  | 174592 | 0.7261   | 1.4615          |
| 0.2076        | 89.0  | 176576 | 0.7253   | 1.4665          |
| 0.1999        | 90.0  | 178560 | 0.7251   | 1.4569          |
| 0.2287        | 91.0  | 180544 | 0.7266   | 1.4591          |
| 0.1985        | 92.0  | 182528 | 0.7263   | 1.4508          |
| 0.2166        | 93.0  | 184512 | 0.7266   | 1.4621          |
| 0.1943        | 94.0  | 186496 | 0.7276   | 1.4649          |
| 0.2189        | 95.0  | 188480 | 0.7293   | 1.4555          |
| 0.1911        | 96.0  | 190464 | 0.7306   | 1.4565          |
| 0.1954        | 97.0  | 192448 | 0.7271   | 1.4624          |
| 0.2053        | 98.0  | 194432 | 0.7286   | 1.4603          |
| 0.2067        | 99.0  | 196416 | 0.7306   | 1.4589          |
| 0.1917        | 100.0 | 198400 | 0.7303   | 1.4588          |
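
Note that the reported 0.7303 is simply the last-epoch value: validation accuracy actually peaked at 0.7321 at epoch 60, and validation loss bottomed out around epoch 14 (1.1032) before climbing steadily to 1.4588, while training loss kept falling — a classic sign of late-run overfitting. Selecting the best checkpoint by validation accuracy is a one-liner, sketched here over a few rows sampled from the table above:

```python
# (epoch, validation accuracy) pairs sampled from the results table above
history = [
    (14, 0.7198),
    (18, 0.7286),
    (41, 0.7288),
    (60, 0.7321),
    (66, 0.7314),
    (100, 0.7303),
]

# Pick the epoch with the highest validation accuracy.
best_epoch, best_acc = max(history, key=lambda row: row[1])
print(best_epoch, best_acc)  # 60 0.7321
```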

### Framework versions

- Transformers 4.34.0
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.14.1