---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - image_folder
metrics:
  - accuracy
  - precision
  - f1
model-index:
  - name: emotion_classification
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: image_folder
          type: image_folder
          config: FastJobs--Visual_Emotional_Analysis
          split: train
          args: FastJobs--Visual_Emotional_Analysis
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.6625
          - name: Precision
            type: precision
            value: 0.6857332900074835
          - name: F1
            type: f1
            value: 0.6658368805611075
---

# emotion_classification

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the FastJobs/Visual_Emotional_Analysis dataset, loaded with the `image_folder` loader. It achieves the following results on the evaluation set:

- Loss: 1.1720
- Accuracy: 0.6625
- Precision: 0.6857
- F1: 0.6658
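The card does not show the trainer's `compute_metrics` setup, so macro averaging over the emotion classes is an assumption, though it is consistent with precision and F1 differing from accuracy. A minimal sketch of how such scores are computed, using toy labels rather than this model's actual predictions:

```python
# Sketch of (assumed) macro-averaged precision/F1 as reported above.
# The labels below are toy values, not this model's real predictions.
from sklearn.metrics import accuracy_score, precision_score, f1_score

y_true = [0, 0, 1, 1, 2, 2]  # ground-truth emotion class ids
y_pred = [0, 1, 1, 1, 2, 0]  # hypothetical model predictions

acc = accuracy_score(y_true, y_pred)
prec = precision_score(y_true, y_pred, average="macro")
f1 = f1_score(y_true, y_pred, average="macro")
print(f"accuracy={acc:.4f} precision={prec:.4f} f1={f1:.4f}")
```

Macro averaging computes each metric per class and averages the per-class values, so rare classes weigh as much as common ones.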

## Model description

The base checkpoint is a Vision Transformer (ViT-Base, 16x16 patches, 224x224 input) pre-trained on ImageNet-21k, fine-tuned here to classify the emotion expressed in an image. No further details are documented.

## Intended uses & limitations

More information needed
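For illustration, inference with the transformers image-classification pipeline might look like the sketch below. The Hub repo id `dennisjooo/emotion_classification` is an assumption inferred from the card header; substitute the actual checkpoint path or id.

```python
# Hypothetical usage sketch; the repo id below is an assumption.
from transformers import pipeline
from PIL import Image
import numpy as np

classifier = pipeline("image-classification",
                      model="dennisjooo/emotion_classification")

# Any RGB image works; a blank 224x224 placeholder is used here.
image = Image.fromarray(np.zeros((224, 224, 3), dtype=np.uint8))
preds = classifier(image)
print(preds[0]["label"], preds[0]["score"])
```

The pipeline returns a list of `{"label", "score"}` dicts sorted by score, so `preds[0]` is the predicted emotion.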

## Training and evaluation data

The model was trained and evaluated on the FastJobs/Visual_Emotional_Analysis dataset, loaded as an `image_folder` dataset (see the metadata above). Exact split sizes are not documented, though the logged accuracies are consistent with an evaluation split of 160 images, and 10 training steps per epoch at batch size 64 imply roughly 600-640 training images.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 150
- num_epochs: 300
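The learning-rate factor applied to the 5e-05 base rate can be sketched in plain Python, mirroring the schedule that transformers maps `cosine_with_restarts` to (`get_cosine_with_hard_restarts_schedule_with_warmup`). The `num_cycles=1` value is an assumption (the library default, not stated in this card), and the 3000-step horizon follows from 300 epochs at the 10 optimizer steps per epoch visible in the results table:

```python
import math

def lr_factor(step, warmup_steps=150, total_steps=3000, num_cycles=1):
    """Multiplier on the base learning rate (5e-05) at a given step.

    Mirrors transformers' cosine_with_restarts schedule: linear warmup,
    then cosine decay that restarts num_cycles times. num_cycles=1 is an
    assumed default; the card does not state it.
    """
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))

print(lr_factor(0), lr_factor(75), lr_factor(150), lr_factor(1500))
```

The factor climbs linearly from 0 to 1 over the 150 warmup steps, then decays along a cosine toward 0 at the end of training.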

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:--:|
| 2.0805 | 1.0 | 10 | 2.0844 | 0.1688 | 0.1731 | 0.1670 |
| 2.0876 | 2.0 | 20 | 2.0807 | 0.1938 | 0.1814 | 0.1843 |
| 2.0786 | 3.0 | 30 | 2.0741 | 0.1812 | 0.1658 | 0.1702 |
| 2.0653 | 4.0 | 40 | 2.0663 | 0.2062 | 0.1832 | 0.1893 |
| 2.0586 | 5.0 | 50 | 2.0547 | 0.2062 | 0.1817 | 0.1911 |
| 2.0347 | 6.0 | 60 | 2.0343 | 0.2375 | 0.2074 | 0.2187 |
| 2.009 | 7.0 | 70 | 2.0039 | 0.2875 | 0.4007 | 0.2750 |
| 1.9672 | 8.0 | 80 | 1.9560 | 0.3187 | 0.3615 | 0.3006 |
| 1.9015 | 9.0 | 90 | 1.8650 | 0.3688 | 0.4229 | 0.3577 |
| 1.812 | 10.0 | 100 | 1.7339 | 0.4375 | 0.3925 | 0.4045 |
| 1.6993 | 11.0 | 110 | 1.6196 | 0.4688 | 0.4093 | 0.4267 |
| 1.6037 | 12.0 | 120 | 1.5466 | 0.475 | 0.4808 | 0.4413 |
| 1.5332 | 13.0 | 130 | 1.4791 | 0.525 | 0.4749 | 0.4922 |
| 1.4649 | 14.0 | 140 | 1.4201 | 0.525 | 0.4860 | 0.4948 |
| 1.4142 | 15.0 | 150 | 1.3659 | 0.55 | 0.5231 | 0.5178 |
| 1.3826 | 16.0 | 160 | 1.3001 | 0.575 | 0.5346 | 0.5434 |
| 1.3048 | 17.0 | 170 | 1.2689 | 0.5813 | 0.5381 | 0.5535 |
| 1.2519 | 18.0 | 180 | 1.2334 | 0.575 | 0.5816 | 0.5580 |
| 1.2043 | 19.0 | 190 | 1.2186 | 0.55 | 0.5739 | 0.5424 |
| 1.1575 | 20.0 | 200 | 1.1711 | 0.5687 | 0.5421 | 0.5371 |
| 1.0957 | 21.0 | 210 | 1.1674 | 0.5938 | 0.5764 | 0.5645 |
| 1.0719 | 22.0 | 220 | 1.1473 | 0.5875 | 0.5899 | 0.5721 |
| 0.9894 | 23.0 | 230 | 1.1303 | 0.6125 | 0.6507 | 0.6124 |
| 0.9698 | 24.0 | 240 | 1.1010 | 0.6188 | 0.6323 | 0.6142 |
| 0.9081 | 25.0 | 250 | 1.1038 | 0.5938 | 0.6074 | 0.5923 |
| 0.8739 | 26.0 | 260 | 1.1383 | 0.5563 | 0.5874 | 0.5447 |
| 0.8815 | 27.0 | 270 | 1.1483 | 0.6 | 0.6524 | 0.5894 |
| 0.8426 | 28.0 | 280 | 1.1212 | 0.5813 | 0.6356 | 0.5703 |
| 0.7614 | 29.0 | 290 | 1.1002 | 0.6188 | 0.6724 | 0.6089 |
| 0.7937 | 30.0 | 300 | 1.0272 | 0.6188 | 0.6515 | 0.6135 |
| 0.7379 | 31.0 | 310 | 1.0184 | 0.6062 | 0.6120 | 0.6035 |
| 0.6994 | 32.0 | 320 | 1.0163 | 0.5875 | 0.5966 | 0.5772 |
| 0.684 | 33.0 | 330 | 1.0420 | 0.6312 | 0.6627 | 0.6327 |
| 0.605 | 34.0 | 340 | 1.0555 | 0.6312 | 0.6822 | 0.6353 |
| 0.5851 | 35.0 | 350 | 1.0991 | 0.625 | 0.6941 | 0.6269 |
| 0.6186 | 36.0 | 360 | 1.1196 | 0.6188 | 0.6916 | 0.6077 |
| 0.5349 | 37.0 | 370 | 1.0707 | 0.6062 | 0.6123 | 0.5978 |
| 0.5549 | 38.0 | 380 | 1.0161 | 0.6375 | 0.6498 | 0.6308 |
| 0.577 | 39.0 | 390 | 1.1375 | 0.5813 | 0.6449 | 0.5770 |
| 0.5151 | 40.0 | 400 | 1.0479 | 0.65 | 0.6691 | 0.6421 |
| 0.4898 | 41.0 | 410 | 1.0835 | 0.6125 | 0.6378 | 0.6106 |
| 0.4619 | 42.0 | 420 | 1.0262 | 0.6375 | 0.6596 | 0.6418 |
| 0.4142 | 43.0 | 430 | 1.1238 | 0.6188 | 0.6422 | 0.6143 |
| 0.4695 | 44.0 | 440 | 1.0765 | 0.65 | 0.6664 | 0.6424 |
| 0.4195 | 45.0 | 450 | 1.0646 | 0.6375 | 0.6622 | 0.6357 |
| 0.4144 | 46.0 | 460 | 1.1255 | 0.6 | 0.6308 | 0.6023 |
| 0.3552 | 47.0 | 470 | 1.0580 | 0.6562 | 0.6639 | 0.6574 |
| 0.3887 | 48.0 | 480 | 1.0673 | 0.6438 | 0.6560 | 0.6421 |
| 0.348 | 49.0 | 490 | 1.1828 | 0.6062 | 0.6503 | 0.6041 |
| 0.3284 | 50.0 | 500 | 1.1613 | 0.5625 | 0.5756 | 0.5585 |
| 0.4082 | 51.0 | 510 | 1.1582 | 0.6188 | 0.6458 | 0.6154 |
| 0.3929 | 52.0 | 520 | 1.1444 | 0.6188 | 0.6438 | 0.6117 |
| 0.337 | 53.0 | 530 | 1.1073 | 0.6375 | 0.6497 | 0.6348 |
| 0.3525 | 54.0 | 540 | 1.1750 | 0.6062 | 0.6331 | 0.6079 |
| 0.3336 | 55.0 | 550 | 1.1841 | 0.6188 | 0.6435 | 0.6116 |
| 0.2946 | 56.0 | 560 | 1.2258 | 0.5875 | 0.6250 | 0.5820 |
| 0.332 | 57.0 | 570 | 1.1952 | 0.5938 | 0.6526 | 0.6018 |
| 0.3013 | 58.0 | 580 | 1.1858 | 0.6438 | 0.6671 | 0.6465 |
| 0.3035 | 59.0 | 590 | 1.1823 | 0.625 | 0.6326 | 0.6238 |
| 0.3071 | 60.0 | 600 | 1.1567 | 0.6062 | 0.6348 | 0.6035 |
| 0.2783 | 61.0 | 610 | 1.1536 | 0.6188 | 0.6360 | 0.6178 |
| 0.2901 | 62.0 | 620 | 1.1183 | 0.6312 | 0.6412 | 0.6300 |
| 0.3046 | 63.0 | 630 | 1.1705 | 0.6 | 0.6209 | 0.6026 |
| 0.3066 | 64.0 | 640 | 1.1717 | 0.6375 | 0.6501 | 0.6328 |
| 0.2978 | 65.0 | 650 | 1.1669 | 0.6375 | 0.6539 | 0.6332 |
| 0.2967 | 66.0 | 660 | 1.2839 | 0.6188 | 0.6552 | 0.6097 |
| 0.3624 | 67.0 | 670 | 1.2095 | 0.625 | 0.6622 | 0.6170 |
| 0.2683 | 68.0 | 680 | 1.2292 | 0.6125 | 0.6504 | 0.6159 |
| 0.2862 | 69.0 | 690 | 1.2228 | 0.6125 | 0.6252 | 0.6061 |
| 0.252 | 70.0 | 700 | 1.4087 | 0.575 | 0.6327 | 0.5738 |
| 0.2968 | 71.0 | 710 | 1.1559 | 0.6562 | 0.6769 | 0.6585 |
| 0.247 | 72.0 | 720 | 1.1829 | 0.6062 | 0.6333 | 0.6108 |
| 0.2849 | 73.0 | 730 | 1.2207 | 0.6312 | 0.6863 | 0.6321 |
| 0.2684 | 74.0 | 740 | 1.1720 | 0.6625 | 0.6857 | 0.6658 |
| 0.2649 | 75.0 | 750 | 1.2352 | 0.6375 | 0.6479 | 0.6359 |
| 0.2265 | 76.0 | 760 | 1.2990 | 0.6 | 0.6427 | 0.6002 |
| 0.2398 | 77.0 | 770 | 1.3163 | 0.6 | 0.6420 | 0.6007 |
| 0.2324 | 78.0 | 780 | 1.3362 | 0.5938 | 0.5907 | 0.5730 |
| 0.1927 | 79.0 | 790 | 1.2690 | 0.625 | 0.6552 | 0.6227 |
| 0.1757 | 80.0 | 800 | 1.2791 | 0.65 | 0.6716 | 0.6487 |
| 0.1993 | 81.0 | 810 | 1.2946 | 0.625 | 0.6564 | 0.6235 |
| 0.2326 | 82.0 | 820 | 1.3964 | 0.5813 | 0.6042 | 0.5742 |
| 0.2252 | 83.0 | 830 | 1.3020 | 0.6125 | 0.6567 | 0.6095 |
| 0.228 | 84.0 | 840 | 1.2979 | 0.6312 | 0.6629 | 0.6358 |
| 0.2055 | 85.0 | 850 | 1.2876 | 0.6125 | 0.6274 | 0.6086 |
| 0.2171 | 86.0 | 860 | 1.2951 | 0.6312 | 0.6574 | 0.6308 |
| 0.2156 | 87.0 | 870 | 1.3025 | 0.6 | 0.6072 | 0.5975 |
| 0.1869 | 88.0 | 880 | 1.2232 | 0.6375 | 0.6822 | 0.6423 |
| 0.2199 | 89.0 | 890 | 1.2538 | 0.6125 | 0.6056 | 0.6009 |
| 0.189 | 90.0 | 900 | 1.3159 | 0.6188 | 0.6345 | 0.6198 |
| 0.2023 | 91.0 | 910 | 1.3270 | 0.5938 | 0.6124 | 0.5910 |
| 0.2304 | 92.0 | 920 | 1.2732 | 0.65 | 0.6642 | 0.6436 |
| 0.2042 | 93.0 | 930 | 1.4199 | 0.55 | 0.5662 | 0.5401 |
| 0.1968 | 94.0 | 940 | 1.4262 | 0.5875 | 0.6388 | 0.5828 |
| 0.1968 | 95.0 | 950 | 1.3575 | 0.6062 | 0.6364 | 0.6090 |
| 0.2176 | 96.0 | 960 | 1.3166 | 0.6062 | 0.6375 | 0.6080 |
| 0.1884 | 97.0 | 970 | 1.2959 | 0.5875 | 0.6066 | 0.5876 |
| 0.1841 | 98.0 | 980 | 1.4839 | 0.5875 | 0.6712 | 0.5838 |
| 0.2175 | 99.0 | 990 | 1.3247 | 0.6125 | 0.6385 | 0.6086 |
| 0.2091 | 100.0 | 1000 | 1.3601 | 0.6188 | 0.6490 | 0.6138 |
| 0.1656 | 101.0 | 1010 | 1.4244 | 0.6062 | 0.6495 | 0.6077 |
| 0.1897 | 102.0 | 1020 | 1.3256 | 0.6188 | 0.6774 | 0.6237 |
| 0.1816 | 103.0 | 1030 | 1.3440 | 0.6062 | 0.6390 | 0.6097 |
| 0.1973 | 104.0 | 1040 | 1.3377 | 0.625 | 0.6645 | 0.6240 |
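Although num_epochs was set to 300, logging stops at epoch 104, and the reported evaluation metrics match the epoch-74 row, which suggests training ended early with the best checkpoint restored. The card does not record the callback settings, so the sketch below is a hypothetical reconstruction; the patience value is only a guess based on the 30-epoch gap between the best and last logged epochs.

```python
# Hypothetical reconstruction of the stopping setup; not documented
# in the card. early_stopping_patience=30 matches the gap between the
# best epoch (74) and the last logged epoch (104), but is a guess.
from transformers import TrainingArguments, EarlyStoppingCallback

args = TrainingArguments(
    output_dir="emotion_classification",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=150,
    num_train_epochs=300,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,       # restore the best checkpoint at the end
    metric_for_best_model="accuracy",  # assumption; could also be eval_loss
)
early_stop = EarlyStoppingCallback(early_stopping_patience=30)
# Pass `args` and `callbacks=[early_stop]` to Trainer(...).
```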

## Framework versions

- Transformers 4.33.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3