dennisjooo committed
Commit aa91e80
1 Parent(s): 733a9b8

End of training

Files changed (3):
  1. README.md (+75 -67)
  2. pytorch_model.bin (+1 -1)
  3. training_args.bin (+1 -1)
README.md CHANGED
@@ -24,13 +24,13 @@ model-index:
    metrics:
    - name: Accuracy
      type: accuracy
-     value: 0.65625
+     value: 0.675
    - name: Precision
      type: precision
-     value: 0.6864745278875714
+     value: 0.6854354001733034
    - name: F1
      type: f1
-     value: 0.6531282051282051
+     value: 0.6750572520063745
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -40,10 +40,10 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the image_folder dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.0743
- - Accuracy: 0.6562
- - Precision: 0.6865
- - F1: 0.6531
+ - Loss: 1.0683
+ - Accuracy: 0.675
+ - Precision: 0.6854
+ - F1: 0.6751

  ## Model description

@@ -68,72 +68,80 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: cosine_with_restarts
- - lr_scheduler_warmup_steps: 100
+ - lr_scheduler_warmup_steps: 150
  - num_epochs: 300

  ### Training results

  | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | F1 |
  |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|
- | 2.0912 | 1.0 | 10 | 2.0884 | 0.0938 | 0.0557 | 0.0679 |
- | 2.086 | 2.0 | 20 | 2.0835 | 0.1062 | 0.1076 | 0.0825 |
- | 2.0724 | 3.0 | 30 | 2.0743 | 0.15 | 0.1595 | 0.1235 |
- | 2.0575 | 4.0 | 40 | 2.0614 | 0.1625 | 0.1451 | 0.1291 |
- | 2.0375 | 5.0 | 50 | 2.0399 | 0.2125 | 0.2375 | 0.1880 |
- | 1.9952 | 6.0 | 60 | 1.9954 | 0.2875 | 0.4219 | 0.2692 |
- | 1.9309 | 7.0 | 70 | 1.9096 | 0.3312 | 0.4116 | 0.3141 |
- | 1.8219 | 8.0 | 80 | 1.7690 | 0.375 | 0.4091 | 0.3375 |
- | 1.6907 | 9.0 | 90 | 1.6323 | 0.4 | 0.4548 | 0.3595 |
- | 1.5937 | 10.0 | 100 | 1.5317 | 0.4437 | 0.4015 | 0.4174 |
- | 1.5157 | 11.0 | 110 | 1.4620 | 0.5312 | 0.4945 | 0.5078 |
- | 1.4458 | 12.0 | 120 | 1.4050 | 0.5125 | 0.4734 | 0.4880 |
- | 1.3712 | 13.0 | 130 | 1.3719 | 0.5375 | 0.5776 | 0.5236 |
- | 1.3043 | 14.0 | 140 | 1.3033 | 0.5687 | 0.6482 | 0.5547 |
- | 1.2424 | 15.0 | 150 | 1.2497 | 0.5813 | 0.5970 | 0.5619 |
- | 1.2369 | 16.0 | 160 | 1.2423 | 0.5375 | 0.4994 | 0.5061 |
- | 1.1596 | 17.0 | 170 | 1.2109 | 0.5563 | 0.5086 | 0.5216 |
- | 1.1252 | 18.0 | 180 | 1.1889 | 0.5813 | 0.5772 | 0.5622 |
- | 1.0746 | 19.0 | 190 | 1.1752 | 0.5625 | 0.5843 | 0.5631 |
- | 1.0496 | 20.0 | 200 | 1.1402 | 0.6062 | 0.5995 | 0.5911 |
- | 0.9874 | 21.0 | 210 | 1.1470 | 0.5875 | 0.5897 | 0.5720 |
- | 0.9423 | 22.0 | 220 | 1.1294 | 0.6188 | 0.6174 | 0.6072 |
- | 0.8842 | 23.0 | 230 | 1.1335 | 0.6 | 0.6216 | 0.6004 |
- | 0.8817 | 24.0 | 240 | 1.1002 | 0.6 | 0.6078 | 0.5970 |
- | 0.8365 | 25.0 | 250 | 1.1237 | 0.625 | 0.6392 | 0.6209 |
- | 0.7965 | 26.0 | 260 | 1.1781 | 0.55 | 0.5888 | 0.5419 |
- | 0.7829 | 27.0 | 270 | 1.1278 | 0.6 | 0.6219 | 0.5947 |
- | 0.7269 | 28.0 | 280 | 1.1144 | 0.6 | 0.6386 | 0.5937 |
- | 0.7158 | 29.0 | 290 | 1.1245 | 0.6125 | 0.6524 | 0.5939 |
- | 0.7178 | 30.0 | 300 | 1.0692 | 0.6188 | 0.6344 | 0.6159 |
- | 0.6704 | 31.0 | 310 | 1.0568 | 0.65 | 0.6724 | 0.6514 |
- | 0.6371 | 32.0 | 320 | 1.0411 | 0.65 | 0.6529 | 0.6465 |
- | 0.6317 | 33.0 | 330 | 1.1018 | 0.6438 | 0.6732 | 0.6416 |
- | 0.5625 | 34.0 | 340 | 1.0743 | 0.6562 | 0.6865 | 0.6531 |
- | 0.5717 | 35.0 | 350 | 1.1658 | 0.6062 | 0.6636 | 0.6094 |
- | 0.5807 | 36.0 | 360 | 1.1473 | 0.625 | 0.6654 | 0.6161 |
- | 0.5269 | 37.0 | 370 | 1.1367 | 0.6188 | 0.6317 | 0.6150 |
- | 0.5284 | 38.0 | 380 | 1.0724 | 0.6438 | 0.6625 | 0.6449 |
- | 0.5715 | 39.0 | 390 | 1.1805 | 0.575 | 0.6076 | 0.5711 |
- | 0.486 | 40.0 | 400 | 1.1676 | 0.5938 | 0.6379 | 0.5892 |
- | 0.4581 | 41.0 | 410 | 1.1633 | 0.6312 | 0.6583 | 0.6298 |
- | 0.4364 | 42.0 | 420 | 1.1371 | 0.6312 | 0.6353 | 0.6255 |
- | 0.4117 | 43.0 | 430 | 1.2004 | 0.625 | 0.6748 | 0.6086 |
- | 0.4433 | 44.0 | 440 | 1.1082 | 0.625 | 0.6322 | 0.6232 |
- | 0.4031 | 45.0 | 450 | 1.2251 | 0.5875 | 0.6395 | 0.5944 |
- | 0.4205 | 46.0 | 460 | 1.2513 | 0.5938 | 0.6196 | 0.5934 |
- | 0.3524 | 47.0 | 470 | 1.1704 | 0.6125 | 0.6303 | 0.6147 |
- | 0.4094 | 48.0 | 480 | 1.1930 | 0.5875 | 0.6071 | 0.5892 |
- | 0.369 | 49.0 | 490 | 1.1970 | 0.6188 | 0.6509 | 0.6171 |
- | 0.3666 | 50.0 | 500 | 1.2280 | 0.6 | 0.6171 | 0.5971 |
- | 0.4054 | 51.0 | 510 | 1.2725 | 0.5625 | 0.5807 | 0.5599 |
- | 0.4247 | 52.0 | 520 | 1.2380 | 0.6188 | 0.6385 | 0.6128 |
- | 0.3791 | 53.0 | 530 | 1.2402 | 0.5813 | 0.6153 | 0.5806 |
- | 0.3241 | 54.0 | 540 | 1.2491 | 0.5687 | 0.5817 | 0.5676 |
- | 0.3268 | 55.0 | 550 | 1.2575 | 0.5938 | 0.6058 | 0.5956 |
- | 0.3419 | 56.0 | 560 | 1.3199 | 0.6 | 0.6160 | 0.5930 |
- | 0.3657 | 57.0 | 570 | 1.2408 | 0.6188 | 0.6441 | 0.6207 |
- | 0.3327 | 58.0 | 580 | 1.2430 | 0.6125 | 0.6200 | 0.6107 |
- | 0.3126 | 59.0 | 590 | 1.3995 | 0.5312 | 0.5619 | 0.5158 |
+ | 2.0804 | 1.0 | 10 | 2.0881 | 0.1437 | 0.2313 | 0.1165 |
+ | 2.0839 | 2.0 | 20 | 2.0846 | 0.1562 | 0.1772 | 0.1250 |
+ | 2.072 | 3.0 | 30 | 2.0786 | 0.1562 | 0.1835 | 0.1251 |
+ | 2.0676 | 4.0 | 40 | 2.0702 | 0.1562 | 0.2213 | 0.1265 |
+ | 2.053 | 5.0 | 50 | 2.0586 | 0.1625 | 0.2289 | 0.1330 |
+ | 2.0346 | 6.0 | 60 | 2.0390 | 0.1938 | 0.3508 | 0.1830 |
+ | 2.0072 | 7.0 | 70 | 2.0080 | 0.2437 | 0.3131 | 0.2285 |
+ | 1.9672 | 8.0 | 80 | 1.9506 | 0.325 | 0.3516 | 0.3209 |
+ | 1.8907 | 9.0 | 90 | 1.8587 | 0.3438 | 0.4010 | 0.3361 |
+ | 1.7841 | 10.0 | 100 | 1.7300 | 0.3937 | 0.4617 | 0.3860 |
+ | 1.6688 | 11.0 | 110 | 1.6084 | 0.4625 | 0.4958 | 0.4402 |
+ | 1.5803 | 12.0 | 120 | 1.5305 | 0.4875 | 0.5327 | 0.4661 |
+ | 1.5069 | 13.0 | 130 | 1.4577 | 0.5437 | 0.5171 | 0.5126 |
+ | 1.4353 | 14.0 | 140 | 1.3955 | 0.55 | 0.6004 | 0.5380 |
+ | 1.3913 | 15.0 | 150 | 1.3353 | 0.5437 | 0.6508 | 0.4995 |
+ | 1.3551 | 16.0 | 160 | 1.2874 | 0.5563 | 0.5251 | 0.5201 |
+ | 1.2889 | 17.0 | 170 | 1.2618 | 0.5687 | 0.5829 | 0.5475 |
+ | 1.2387 | 18.0 | 180 | 1.2455 | 0.5687 | 0.5723 | 0.5587 |
+ | 1.1977 | 19.0 | 190 | 1.2210 | 0.5875 | 0.6221 | 0.5858 |
+ | 1.1447 | 20.0 | 200 | 1.1909 | 0.6 | 0.6153 | 0.5840 |
+ | 1.0959 | 21.0 | 210 | 1.1918 | 0.5813 | 0.5896 | 0.5609 |
+ | 1.0657 | 22.0 | 220 | 1.1343 | 0.625 | 0.6352 | 0.6184 |
+ | 0.9869 | 23.0 | 230 | 1.1309 | 0.625 | 0.6549 | 0.6258 |
+ | 0.9576 | 24.0 | 240 | 1.1071 | 0.6312 | 0.6373 | 0.6280 |
+ | 0.9234 | 25.0 | 250 | 1.1407 | 0.6312 | 0.6469 | 0.6279 |
+ | 0.876 | 26.0 | 260 | 1.2006 | 0.5625 | 0.6040 | 0.5514 |
+ | 0.8969 | 27.0 | 270 | 1.1007 | 0.6125 | 0.6290 | 0.6121 |
+ | 0.8066 | 28.0 | 280 | 1.1208 | 0.6 | 0.6650 | 0.5971 |
+ | 0.7579 | 29.0 | 290 | 1.1328 | 0.6125 | 0.6625 | 0.6035 |
+ | 0.7581 | 30.0 | 300 | 1.1039 | 0.6125 | 0.6401 | 0.6121 |
+ | 0.7164 | 31.0 | 310 | 1.0862 | 0.65 | 0.6723 | 0.6494 |
+ | 0.7075 | 32.0 | 320 | 1.0575 | 0.65 | 0.6683 | 0.6485 |
+ | 0.6655 | 33.0 | 330 | 1.1186 | 0.6125 | 0.6483 | 0.6134 |
+ | 0.5947 | 34.0 | 340 | 1.1133 | 0.625 | 0.6439 | 0.6272 |
+ | 0.5813 | 35.0 | 350 | 1.1071 | 0.6312 | 0.6735 | 0.6337 |
+ | 0.6322 | 36.0 | 360 | 1.0839 | 0.6312 | 0.6591 | 0.6324 |
+ | 0.561 | 37.0 | 370 | 1.1040 | 0.625 | 0.6425 | 0.6220 |
+ | 0.558 | 38.0 | 380 | 1.0727 | 0.6125 | 0.6255 | 0.6112 |
+ | 0.5372 | 39.0 | 390 | 1.1417 | 0.6312 | 0.6545 | 0.6292 |
+ | 0.5146 | 40.0 | 400 | 1.0967 | 0.6312 | 0.6645 | 0.6285 |
+ | 0.4968 | 41.0 | 410 | 1.1187 | 0.6312 | 0.6543 | 0.6316 |
+ | 0.4593 | 42.0 | 420 | 1.0683 | 0.675 | 0.6854 | 0.6751 |
+ | 0.4392 | 43.0 | 430 | 1.0937 | 0.6375 | 0.6481 | 0.6374 |
+ | 0.4503 | 44.0 | 440 | 1.1320 | 0.625 | 0.6536 | 0.6255 |
+ | 0.3918 | 45.0 | 450 | 1.1218 | 0.6312 | 0.6464 | 0.6312 |
+ | 0.4236 | 46.0 | 460 | 1.2074 | 0.5938 | 0.6188 | 0.5911 |
+ | 0.3858 | 47.0 | 470 | 1.1769 | 0.5813 | 0.6106 | 0.5809 |
+ | 0.392 | 48.0 | 480 | 1.1572 | 0.625 | 0.6381 | 0.6216 |
+ | 0.3708 | 49.0 | 490 | 1.2293 | 0.6 | 0.6388 | 0.5953 |
+ | 0.3346 | 50.0 | 500 | 1.2205 | 0.5938 | 0.6188 | 0.5943 |
+ | 0.3831 | 51.0 | 510 | 1.2875 | 0.5875 | 0.5982 | 0.5845 |
+ | 0.4161 | 52.0 | 520 | 1.2355 | 0.5938 | 0.6421 | 0.5799 |
+ | 0.3736 | 53.0 | 530 | 1.2361 | 0.6062 | 0.6301 | 0.6006 |
+ | 0.3278 | 54.0 | 540 | 1.1670 | 0.6312 | 0.6520 | 0.6286 |
+ | 0.3295 | 55.0 | 550 | 1.1807 | 0.6438 | 0.6712 | 0.6457 |
+ | 0.3357 | 56.0 | 560 | 1.2007 | 0.625 | 0.6279 | 0.6239 |
+ | 0.3169 | 57.0 | 570 | 1.2314 | 0.5938 | 0.6257 | 0.5942 |
+ | 0.3193 | 58.0 | 580 | 1.2068 | 0.6188 | 0.6397 | 0.6208 |
+ | 0.3128 | 59.0 | 590 | 1.2753 | 0.5875 | 0.5919 | 0.5760 |
+ | 0.3077 | 60.0 | 600 | 1.2154 | 0.625 | 0.6432 | 0.6238 |
+ | 0.2751 | 61.0 | 610 | 1.2596 | 0.6125 | 0.6216 | 0.6099 |
+ | 0.2921 | 62.0 | 620 | 1.2716 | 0.6188 | 0.6467 | 0.6189 |
+ | 0.2939 | 63.0 | 630 | 1.2213 | 0.625 | 0.6350 | 0.6264 |
+ | 0.2732 | 64.0 | 640 | 1.3456 | 0.5938 | 0.6189 | 0.5897 |
+ | 0.2806 | 65.0 | 650 | 1.2491 | 0.6188 | 0.6393 | 0.6162 |
+ | 0.2453 | 66.0 | 660 | 1.2312 | 0.6188 | 0.6465 | 0.6195 |
+ | 0.3077 | 67.0 | 670 | 1.2356 | 0.6375 | 0.6564 | 0.6373 |


  ### Framework versions
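The updated hyperparameter list in the card maps almost one-to-one onto `transformers.TrainingArguments`. The sketch below is illustrative only: the values annotated "from the card" are the ones visible in this diff (seed, Adam betas and epsilon, the cosine_with_restarts scheduler, the new 150 warmup steps, 300 epochs), while the output directory and anything else not annotated are placeholders, not details recorded in this commit.

```python
# Hypothetical sketch: wiring the card's hyperparameters into TrainingArguments.
# Anything not annotated "from the card" is a placeholder, not recorded in this commit.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./vit-finetune",               # placeholder output path
    num_train_epochs=300,                      # from the card
    warmup_steps=150,                          # from the card (updated from 100 in this commit)
    lr_scheduler_type="cosine_with_restarts",  # from the card
    adam_beta1=0.9,                            # from the card: Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,                        # from the card
    seed=42,                                   # from the card
)
print(training_args.lr_scheduler_type, training_args.warmup_steps)
```

Note that although `num_epochs` is 300, both training tables stop much earlier (epoch 59 before this commit, epoch 67 after), and the reported evaluation metrics match the best row (epoch 42 in the new table). This suggests early stopping with the best checkpoint restored, though the card itself does not state it.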
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:37dab50575d76abc1520a3c50dadc51f50e162a73895cc3facafb761520c1d20
+ oid sha256:d47386f85e768a6b9efa1425abcae2b77949da9199600fa03f685fc80754d8d2
  size 343287149
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:22f458e8f782216a14f03acb5645f345ca24327dc29618f126fbecd5d9269665
+ oid sha256:7d0a0ca31590e1bb9305f81f29252aace408bdc25efad7fc8ac010a9565b7cf2
  size 4027
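Both `.bin` entries above are Git LFS pointer files, so the diff shows only the changed `oid sha256` digest and the file size; the binaries themselves live in LFS storage and are fetched transparently by `from_pretrained`. Below is a minimal, hypothetical inference sketch for the fine-tuned classifier. The repository id and image path are placeholders, since the commit view does not show them; pinning `revision` to a branch, tag, or full commit SHA would load exactly the weights referenced by the updated pointer.

```python
# Hypothetical inference sketch for the fine-tuned ViT classifier.
# repo_id and the image path are placeholders; they are not shown in this commit view.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "dennisjooo/<model-name>"   # placeholder repository id

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg")     # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```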