---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: image_classification
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.50625
---

# image_classification

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 1.3344
- Accuracy: 0.5062
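
The card doesn't include a usage example, so here is a minimal inference sketch using the standard `transformers` image-classification pipeline. The checkpoint path is hypothetical: substitute the Hub repo id this model was pushed to, or the local output directory produced by training.

```python
from transformers import pipeline

# Hypothetical checkpoint location -- replace with the actual Hub repo id
# or the local output directory produced by the Trainer.
classifier = pipeline("image-classification", model="path/to/image_classification")

# Accepts a local file path, a URL, or a PIL.Image.
predictions = classifier("example.jpg")
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```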

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
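
A sketch of how these hyperparameters map onto `TrainingArguments`/`Trainer`, assuming the usual `generated_from_trainer` setup; the data directory, split names, and preprocessing are illustrative, not taken from the original training script. The Adam betas and epsilon listed above are the `transformers` defaults, so they need no explicit arguments.

```python
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    Trainer,
    TrainingArguments,
    ViTForImageClassification,
    ViTImageProcessor,
)

# Illustrative data loading: "imagefolder" derives labels from subdirectory names.
dataset = load_dataset("imagefolder", data_dir="path/to/images")  # hypothetical path
labels = dataset["train"].features["label"].names

processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")

def preprocess(batch):
    # Resize/normalize PIL images into the pixel_values tensor ViT expects.
    batch["pixel_values"] = processor(batch["image"], return_tensors="pt")["pixel_values"]
    return batch

dataset = dataset.map(preprocess, batched=True, remove_columns=["image"])

model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=len(labels),
)

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, references = eval_pred
    return accuracy.compute(predictions=np.argmax(logits, axis=-1), references=references)

# Mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="image_classification",
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the results table reports metrics once per epoch
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],  # hypothetical split; the card doesn't document the eval split
    compute_metrics=compute_metrics,
)
trainer.train()
```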

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 10   | 2.0716          | 0.1187   |
| No log        | 2.0   | 20   | 2.0629          | 0.1375   |
| No log        | 3.0   | 30   | 2.0521          | 0.1562   |
| No log        | 4.0   | 40   | 2.0437          | 0.2125   |
| No log        | 5.0   | 50   | 2.0276          | 0.25     |
| No log        | 6.0   | 60   | 2.0066          | 0.3063   |
| No log        | 7.0   | 70   | 1.9779          | 0.3      |
| No log        | 8.0   | 80   | 1.9538          | 0.3063   |
| No log        | 9.0   | 90   | 1.9229          | 0.325    |
| No log        | 10.0  | 100  | 1.8739          | 0.3563   |
| No log        | 11.0  | 110  | 1.8404          | 0.3375   |
| No log        | 12.0  | 120  | 1.7943          | 0.3688   |
| No log        | 13.0  | 130  | 1.7616          | 0.35     |
| No log        | 14.0  | 140  | 1.7186          | 0.3937   |
| No log        | 15.0  | 150  | 1.6926          | 0.4062   |
| No log        | 16.0  | 160  | 1.6778          | 0.4062   |
| No log        | 17.0  | 170  | 1.6579          | 0.4062   |
| No log        | 18.0  | 180  | 1.6462          | 0.4      |
| No log        | 19.0  | 190  | 1.6143          | 0.4188   |
| No log        | 20.0  | 200  | 1.5932          | 0.4313   |
| No log        | 21.0  | 210  | 1.5833          | 0.4625   |
| No log        | 22.0  | 220  | 1.5726          | 0.4437   |
| No log        | 23.0  | 230  | 1.5545          | 0.4188   |
| No log        | 24.0  | 240  | 1.5220          | 0.4688   |
| No log        | 25.0  | 250  | 1.5237          | 0.4188   |
| No log        | 26.0  | 260  | 1.5175          | 0.4375   |
| No log        | 27.0  | 270  | 1.5008          | 0.4      |
| No log        | 28.0  | 280  | 1.5100          | 0.4875   |
| No log        | 29.0  | 290  | 1.4730          | 0.4938   |
| No log        | 30.0  | 300  | 1.4842          | 0.5125   |
| No log        | 31.0  | 310  | 1.4967          | 0.45     |
| No log        | 32.0  | 320  | 1.4584          | 0.4562   |
| No log        | 33.0  | 330  | 1.4458          | 0.4813   |
| No log        | 34.0  | 340  | 1.4850          | 0.475    |
| No log        | 35.0  | 350  | 1.4558          | 0.4688   |
| No log        | 36.0  | 360  | 1.4438          | 0.5      |
| No log        | 37.0  | 370  | 1.4290          | 0.475    |
| No log        | 38.0  | 380  | 1.4347          | 0.4938   |
| No log        | 39.0  | 390  | 1.4283          | 0.4437   |
| No log        | 40.0  | 400  | 1.4149          | 0.4813   |
| No log        | 41.0  | 410  | 1.3983          | 0.4813   |
| No log        | 42.0  | 420  | 1.4079          | 0.45     |
| No log        | 43.0  | 430  | 1.3984          | 0.45     |
| No log        | 44.0  | 440  | 1.3866          | 0.5      |
| No log        | 45.0  | 450  | 1.3809          | 0.4875   |
| No log        | 46.0  | 460  | 1.3858          | 0.4813   |
| No log        | 47.0  | 470  | 1.3981          | 0.4875   |
| No log        | 48.0  | 480  | 1.3822          | 0.4813   |
| No log        | 49.0  | 490  | 1.3728          | 0.4437   |
| 1.4038        | 50.0  | 500  | 1.3828          | 0.45     |
| 1.4038        | 51.0  | 510  | 1.3842          | 0.4813   |
| 1.4038        | 52.0  | 520  | 1.3460          | 0.4688   |
| 1.4038        | 53.0  | 530  | 1.3513          | 0.4938   |
| 1.4038        | 54.0  | 540  | 1.3645          | 0.4875   |
| 1.4038        | 55.0  | 550  | 1.3273          | 0.5062   |
| 1.4038        | 56.0  | 560  | 1.3470          | 0.525    |
| 1.4038        | 57.0  | 570  | 1.4006          | 0.45     |
| 1.4038        | 58.0  | 580  | 1.3259          | 0.5312   |
| 1.4038        | 59.0  | 590  | 1.3030          | 0.5062   |
| 1.4038        | 60.0  | 600  | 1.3526          | 0.5125   |
| 1.4038        | 61.0  | 610  | 1.3665          | 0.4625   |
| 1.4038        | 62.0  | 620  | 1.3689          | 0.4813   |
| 1.4038        | 63.0  | 630  | 1.3139          | 0.4813   |
| 1.4038        | 64.0  | 640  | 1.3618          | 0.4875   |
| 1.4038        | 65.0  | 650  | 1.3596          | 0.4938   |
| 1.4038        | 66.0  | 660  | 1.3360          | 0.4813   |
| 1.4038        | 67.0  | 670  | 1.3201          | 0.5062   |
| 1.4038        | 68.0  | 680  | 1.3615          | 0.5      |
| 1.4038        | 69.0  | 690  | 1.3335          | 0.5062   |
| 1.4038        | 70.0  | 700  | 1.2843          | 0.5687   |
| 1.4038        | 71.0  | 710  | 1.3697          | 0.4813   |
| 1.4038        | 72.0  | 720  | 1.2891          | 0.5188   |
| 1.4038        | 73.0  | 730  | 1.3355          | 0.5      |
| 1.4038        | 74.0  | 740  | 1.3400          | 0.4813   |
| 1.4038        | 75.0  | 750  | 1.3140          | 0.4938   |
| 1.4038        | 76.0  | 760  | 1.3492          | 0.4688   |
| 1.4038        | 77.0  | 770  | 1.2946          | 0.5188   |
| 1.4038        | 78.0  | 780  | 1.3635          | 0.45     |
| 1.4038        | 79.0  | 790  | 1.3224          | 0.5      |
| 1.4038        | 80.0  | 800  | 1.3092          | 0.525    |
| 1.4038        | 81.0  | 810  | 1.3298          | 0.475    |
| 1.4038        | 82.0  | 820  | 1.3626          | 0.4562   |
| 1.4038        | 83.0  | 830  | 1.3028          | 0.5375   |
| 1.4038        | 84.0  | 840  | 1.3025          | 0.5375   |
| 1.4038        | 85.0  | 850  | 1.3433          | 0.5188   |
| 1.4038        | 86.0  | 860  | 1.2508          | 0.5437   |
| 1.4038        | 87.0  | 870  | 1.3074          | 0.5062   |
| 1.4038        | 88.0  | 880  | 1.3227          | 0.4875   |
| 1.4038        | 89.0  | 890  | 1.3069          | 0.5188   |
| 1.4038        | 90.0  | 900  | 1.3278          | 0.4875   |
| 1.4038        | 91.0  | 910  | 1.3475          | 0.4875   |
| 1.4038        | 92.0  | 920  | 1.3310          | 0.4875   |
| 1.4038        | 93.0  | 930  | 1.3015          | 0.5062   |
| 1.4038        | 94.0  | 940  | 1.3635          | 0.4875   |
| 1.4038        | 95.0  | 950  | 1.3610          | 0.475    |
| 1.4038        | 96.0  | 960  | 1.2927          | 0.525    |
| 1.4038        | 97.0  | 970  | 1.3346          | 0.475    |
| 1.4038        | 98.0  | 980  | 1.3628          | 0.4625   |
| 1.4038        | 99.0  | 990  | 1.3301          | 0.4813   |
| 0.8016        | 100.0 | 1000 | 1.3301          | 0.475    |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3