---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - image_folder
metrics:
  - accuracy
  - precision
  - f1
model-index:
  - name: emotion_classification
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: image_folder
          type: image_folder
          config: FastJobs--Visual_Emotional_Analysis
          split: train
          args: FastJobs--Visual_Emotional_Analysis
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.64375
          - name: Precision
            type: precision
            value: 0.6639732142857142
          - name: F1
            type: f1
            value: 0.640682001352849
---

# emotion_classification

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the image_folder dataset. It achieves the following results on the evaluation set (a usage sketch follows these results):

- Loss: 1.0750
- Accuracy: 0.6438
- Precision: 0.6640
- F1: 0.6407
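
For quick experimentation, the checkpoint can be loaded with the 🤗 Transformers image-classification pipeline. The snippet below is a minimal sketch only; the repository id `dennisjooo/emotion_classification` and the image path are assumptions based on this card's name, not confirmed details.

```python
from transformers import pipeline

# Minimal inference sketch -- the repository id below is assumed from this
# card's name and may differ from the actual Hub location of the checkpoint.
classifier = pipeline(
    task="image-classification",
    model="dennisjooo/emotion_classification",  # assumed repo id
)

# The image path is illustrative; any local image path, URL, or PIL image works.
predictions = classifier("example_face.jpg", top_k=3)
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.3f}")
```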

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` appears after this list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 50
- num_epochs: 200
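
A rough reconstruction of this configuration with 🤗 Transformers is sketched below. It is an assumption based on the values listed above, not the author's actual training script; the Adam betas and epsilon are the optimizer defaults and need no explicit arguments.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters expressed as TrainingArguments.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Transformers defaults.
training_args = TrainingArguments(
    output_dir="emotion_classification",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=50,
    num_train_epochs=200,
    evaluation_strategy="epoch",  # assumed: the results below report metrics once per epoch
    logging_strategy="epoch",
)
```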

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|
| 2.0755        | 1.0   | 10   | 2.0787          | 0.1437   | 0.1529    | 0.1414 |
| 2.0711        | 2.0   | 20   | 2.0698          | 0.1875   | 0.1926    | 0.1832 |
| 2.0533        | 3.0   | 30   | 2.0520          | 0.2      | 0.2127    | 0.1961 |
| 2.0225        | 4.0   | 40   | 2.0173          | 0.225    | 0.2228    | 0.2054 |
| 1.9569        | 5.0   | 50   | 1.9289          | 0.2812   | 0.3345    | 0.2544 |
| 1.8501        | 6.0   | 60   | 1.7792          | 0.3688   | 0.4904    | 0.3225 |
| 1.7072        | 7.0   | 70   | 1.6236          | 0.4313   | 0.4131    | 0.3883 |
| 1.6065        | 8.0   | 80   | 1.5276          | 0.45     | 0.4533    | 0.3920 |
| 1.539         | 9.0   | 90   | 1.4747          | 0.4938   | 0.4748    | 0.4563 |
| 1.5086        | 10.0  | 100  | 1.4393          | 0.4938   | 0.4557    | 0.4466 |
| 1.4479        | 11.0  | 110  | 1.3893          | 0.5188   | 0.4563    | 0.4696 |
| 1.3683        | 12.0  | 120  | 1.3534          | 0.5437   | 0.5081    | 0.5149 |
| 1.3288        | 13.0  | 130  | 1.3392          | 0.5563   | 0.5569    | 0.5323 |
| 1.2514        | 14.0  | 140  | 1.2723          | 0.5625   | 0.5467    | 0.5246 |
| 1.2116        | 15.0  | 150  | 1.2526          | 0.5875   | 0.5554    | 0.5601 |
| 1.1824        | 16.0  | 160  | 1.2047          | 0.5938   | 0.6100    | 0.5697 |
| 1.1323        | 17.0  | 170  | 1.1950          | 0.5813   | 0.5331    | 0.5472 |
| 1.0782        | 18.0  | 180  | 1.1802          | 0.5875   | 0.5911    | 0.5807 |
| 1.0304        | 19.0  | 190  | 1.1534          | 0.6125   | 0.6133    | 0.6012 |
| 0.982         | 20.0  | 200  | 1.1302          | 0.6      | 0.5923    | 0.5806 |
| 0.9309        | 21.0  | 210  | 1.1849          | 0.5938   | 0.6157    | 0.5723 |
| 0.9205        | 22.0  | 220  | 1.1483          | 0.6      | 0.6137    | 0.5882 |
| 0.8275        | 23.0  | 230  | 1.1332          | 0.5938   | 0.6192    | 0.5894 |
| 0.8472        | 24.0  | 240  | 1.1195          | 0.625    | 0.6444    | 0.6242 |
| 0.7974        | 25.0  | 250  | 1.1444          | 0.6062   | 0.6277    | 0.6035 |
| 0.7532        | 26.0  | 260  | 1.1312          | 0.5875   | 0.6036    | 0.5832 |
| 0.7596        | 27.0  | 270  | 1.1217          | 0.6062   | 0.6412    | 0.6098 |
| 0.6818        | 28.0  | 280  | 1.1736          | 0.5625   | 0.6180    | 0.5473 |
| 0.6484        | 29.0  | 290  | 1.1630          | 0.5563   | 0.5887    | 0.5367 |
| 0.6578        | 30.0  | 300  | 1.0750          | 0.6438   | 0.6640    | 0.6407 |
| 0.6235        | 31.0  | 310  | 1.0676          | 0.6438   | 0.6556    | 0.6422 |
| 0.5966        | 32.0  | 320  | 1.0531          | 0.6438   | 0.6421    | 0.6385 |
| 0.5819        | 33.0  | 330  | 1.1244          | 0.6188   | 0.6315    | 0.6176 |
| 0.5585        | 34.0  | 340  | 1.1466          | 0.5813   | 0.6136    | 0.5790 |
| 0.5696        | 35.0  | 350  | 1.0703          | 0.6438   | 0.6614    | 0.6481 |
| 0.5476        | 36.0  | 360  | 1.1136          | 0.6438   | 0.6764    | 0.6466 |
| 0.475         | 37.0  | 370  | 1.1122          | 0.6375   | 0.6612    | 0.6340 |
| 0.5381        | 38.0  | 380  | 1.1547          | 0.6188   | 0.6570    | 0.6122 |
| 0.5161        | 39.0  | 390  | 1.2268          | 0.5875   | 0.6161    | 0.5704 |
| 0.4528        | 40.0  | 400  | 1.1065          | 0.6188   | 0.6314    | 0.6122 |
| 0.401         | 41.0  | 410  | 1.1209          | 0.6438   | 0.6550    | 0.6440 |
| 0.4067        | 42.0  | 420  | 1.1440          | 0.6312   | 0.6345    | 0.6251 |
| 0.3831        | 43.0  | 430  | 1.1972          | 0.6188   | 0.6480    | 0.6075 |
| 0.4073        | 44.0  | 440  | 1.2422          | 0.6062   | 0.6644    | 0.6028 |
| 0.371         | 45.0  | 450  | 1.2152          | 0.5875   | 0.6087    | 0.5848 |
| 0.396         | 46.0  | 460  | 1.1972          | 0.6125   | 0.6306    | 0.6106 |
| 0.3322        | 47.0  | 470  | 1.2979          | 0.5813   | 0.6158    | 0.5811 |
| 0.3691        | 48.0  | 480  | 1.1657          | 0.625    | 0.6371    | 0.6162 |
| 0.3219        | 49.0  | 490  | 1.1786          | 0.6      | 0.6417    | 0.5997 |
| 0.3371        | 50.0  | 500  | 1.2126          | 0.6188   | 0.6396    | 0.6149 |
| 0.3781        | 51.0  | 510  | 1.2246          | 0.6      | 0.6244    | 0.5972 |
| 0.3629        | 52.0  | 520  | 1.1820          | 0.6188   | 0.6437    | 0.6122 |
| 0.3025        | 53.0  | 530  | 1.1795          | 0.6062   | 0.6326    | 0.6063 |
| 0.309         | 54.0  | 540  | 1.1647          | 0.625    | 0.6510    | 0.6252 |
| 0.2999        | 55.0  | 550  | 1.2023          | 0.6375   | 0.6449    | 0.6373 |

### Framework versions

- Transformers 4.33.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3