paul committed on
Commit
2e3cc35
1 Parent(s): 81ba73a

update model card README.md

Files changed (1)
  1. README.md +211 -0
README.md ADDED
@@ -0,0 +1,211 @@
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: microsoft-resnet-50-cartoon-face-recognition
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7754629629629629
    - name: Precision
      type: precision
      value: 0.7715337787579363
    - name: Recall
      type: recall
      value: 0.7754629629629629
    - name: F1
      type: f1
      value: 0.7676128880040883
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# microsoft-resnet-50-cartoon-face-recognition

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8508
- Accuracy: 0.7755
- Precision: 0.7715
- Recall: 0.7755
- F1: 0.7676

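Since the checkpoint is a standard `transformers` image-classification model, it should be usable through the usual `pipeline` API. The snippet below is a minimal sketch, not part of the original training code; the model path is a placeholder for wherever this checkpoint is stored, and `example.jpg` stands in for any cartoon face crop.

```python
from transformers import pipeline

# Placeholder: local path or Hub repo id of this fine-tuned checkpoint.
classifier = pipeline(
    "image-classification",
    model="microsoft-resnet-50-cartoon-face-recognition",
)

# Accepts a local path, URL, or PIL image.
predictions = classifier("example.jpg", top_k=3)
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```
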
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

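The card only records that training used the `imagefolder` dataset builder. For reference, an image-folder layout is typically loaded with 🤗 Datasets as sketched below; the `data_dir` path and folder layout are placeholders, not the actual training data.

```python
from datasets import load_dataset

# Placeholder directory: one sub-folder per cartoon character,
# e.g. data/train/<character_name>/*.png
dataset = load_dataset("imagefolder", data_dir="data")

print(dataset)                                    # DatasetDict with the discovered splits
print(dataset["train"].features["label"].names)   # class names inferred from folder names
```
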
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.00012
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 120

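For readers who want to approximate this setup, the hyperparameters above map onto `transformers.TrainingArguments` roughly as shown below. This is an assumption about how the run was configured, not the original training script; the output directory and the per-epoch evaluation/logging strategies are guesses based on the results table.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration listed above.
# 64 images per device * 4 gradient-accumulation steps = 256 effective batch size.
training_args = TrainingArguments(
    output_dir="microsoft-resnet-50-cartoon-face-recognition",  # placeholder
    learning_rate=1.2e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,
    num_train_epochs=120,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",  # assumption: the table reports one evaluation per epoch
    logging_strategy="epoch",
)
```
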
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log | 0.89 | 6 | 3.1774 | 0.0370 | 0.0069 | 0.0370 | 0.0098 |
| 3.4185 | 1.89 | 12 | 3.1739 | 0.0301 | 0.0100 | 0.0301 | 0.0126 |
| 3.4185 | 2.89 | 18 | 3.1668 | 0.0440 | 0.0805 | 0.0440 | 0.0340 |
| 3.6463 | 3.89 | 24 | 3.1583 | 0.0370 | 0.0180 | 0.0370 | 0.0151 |
| 3.3899 | 4.89 | 30 | 3.1425 | 0.0741 | 0.0610 | 0.0741 | 0.0453 |
| 3.3899 | 5.89 | 36 | 3.1262 | 0.0856 | 0.0334 | 0.0856 | 0.0405 |
| 3.5947 | 6.89 | 42 | 3.1055 | 0.1019 | 0.0784 | 0.1019 | 0.0481 |
| 3.5947 | 7.89 | 48 | 3.0841 | 0.1181 | 0.1071 | 0.1181 | 0.0500 |
| 3.5553 | 8.89 | 54 | 3.0650 | 0.1065 | 0.0216 | 0.1065 | 0.0343 |
| 3.2713 | 9.89 | 60 | 3.0351 | 0.1273 | 0.0323 | 0.1273 | 0.0418 |
| 3.2713 | 10.89 | 66 | 3.0069 | 0.1227 | 0.0311 | 0.1227 | 0.0390 |
| 3.4382 | 11.89 | 72 | 2.9754 | 0.1204 | 0.0353 | 0.1204 | 0.0366 |
| 3.4382 | 12.89 | 78 | 2.9455 | 0.1227 | 0.0224 | 0.1227 | 0.0338 |
| 3.3573 | 13.89 | 84 | 2.9167 | 0.1204 | 0.0213 | 0.1204 | 0.0332 |
| 3.0549 | 14.89 | 90 | 2.8841 | 0.1227 | 0.0474 | 0.1227 | 0.0408 |
| 3.0549 | 15.89 | 96 | 2.8534 | 0.1412 | 0.1174 | 0.1412 | 0.0540 |
| 3.1853 | 16.89 | 102 | 2.8143 | 0.1505 | 0.1595 | 0.1505 | 0.0667 |
| 3.1853 | 17.89 | 108 | 2.7771 | 0.1667 | 0.1693 | 0.1667 | 0.0719 |
| 3.0871 | 18.89 | 114 | 2.7400 | 0.1759 | 0.1454 | 0.1759 | 0.0896 |
| 2.7666 | 19.89 | 120 | 2.7048 | 0.2014 | 0.0927 | 0.2014 | 0.1051 |
| 2.7666 | 20.89 | 126 | 2.6458 | 0.2315 | 0.1622 | 0.2315 | 0.1250 |
| 2.846 | 21.89 | 132 | 2.5803 | 0.2569 | 0.2386 | 0.2569 | 0.1470 |
| 2.846 | 22.89 | 138 | 2.5291 | 0.2639 | 0.2725 | 0.2639 | 0.1523 |
| 2.7428 | 23.89 | 144 | 2.4916 | 0.2870 | 0.2114 | 0.2870 | 0.1811 |
| 2.4183 | 24.89 | 150 | 2.4273 | 0.3079 | 0.2322 | 0.3079 | 0.2048 |
| 2.4183 | 25.89 | 156 | 2.3923 | 0.3194 | 0.2937 | 0.3194 | 0.2238 |
| 2.5064 | 26.89 | 162 | 2.3349 | 0.3403 | 0.3183 | 0.3403 | 0.2494 |
| 2.5064 | 27.89 | 168 | 2.2977 | 0.3542 | 0.3554 | 0.3542 | 0.2663 |
| 2.4046 | 28.89 | 174 | 2.2363 | 0.3773 | 0.3214 | 0.3773 | 0.2981 |
| 2.1201 | 29.89 | 180 | 2.1791 | 0.3889 | 0.4024 | 0.3889 | 0.3179 |
| 2.1201 | 30.89 | 186 | 2.1448 | 0.4144 | 0.4079 | 0.4144 | 0.3455 |
| 2.1705 | 31.89 | 192 | 2.0969 | 0.4306 | 0.4214 | 0.4306 | 0.3583 |
| 2.1705 | 32.89 | 198 | 2.0535 | 0.4468 | 0.4448 | 0.4468 | 0.3797 |
| 2.0295 | 33.89 | 204 | 1.9940 | 0.4745 | 0.4877 | 0.4745 | 0.4133 |
| 1.8114 | 34.89 | 210 | 1.9467 | 0.4861 | 0.4952 | 0.4861 | 0.4261 |
| 1.8114 | 35.89 | 216 | 1.8896 | 0.4931 | 0.4510 | 0.4931 | 0.4321 |
| 1.8048 | 36.89 | 222 | 1.8404 | 0.5046 | 0.4859 | 0.5046 | 0.4507 |
| 1.8048 | 37.89 | 228 | 1.7999 | 0.5278 | 0.5142 | 0.5278 | 0.4816 |
| 1.6862 | 38.89 | 234 | 1.7363 | 0.5324 | 0.5169 | 0.5324 | 0.4844 |
| 1.4545 | 39.89 | 240 | 1.7104 | 0.5440 | 0.5100 | 0.5440 | 0.4971 |
| 1.4545 | 40.89 | 246 | 1.6492 | 0.5648 | 0.5239 | 0.5648 | 0.5138 |
| 1.4444 | 41.89 | 252 | 1.6076 | 0.5671 | 0.5329 | 0.5671 | 0.5260 |
| 1.4444 | 42.89 | 258 | 1.5784 | 0.5741 | 0.5708 | 0.5741 | 0.5424 |
| 1.3124 | 43.89 | 264 | 1.5259 | 0.6019 | 0.5977 | 0.6019 | 0.5619 |
| 1.1645 | 44.89 | 270 | 1.4814 | 0.6181 | 0.6033 | 0.6181 | 0.5880 |
| 1.1645 | 45.89 | 276 | 1.4697 | 0.6088 | 0.6033 | 0.6088 | 0.5803 |
| 1.1307 | 46.89 | 282 | 1.4380 | 0.6088 | 0.6015 | 0.6088 | 0.5769 |
| 1.1307 | 47.89 | 288 | 1.3872 | 0.6227 | 0.6085 | 0.6227 | 0.5917 |
| 1.0347 | 48.89 | 294 | 1.3709 | 0.6157 | 0.6039 | 0.6157 | 0.5880 |
| 0.8962 | 49.89 | 300 | 1.3415 | 0.6296 | 0.6120 | 0.6296 | 0.6057 |
| 0.8962 | 50.89 | 306 | 1.3290 | 0.6389 | 0.6327 | 0.6389 | 0.6134 |
| 0.8898 | 51.89 | 312 | 1.2836 | 0.6389 | 0.6192 | 0.6389 | 0.6119 |
| 0.8898 | 52.89 | 318 | 1.2665 | 0.6412 | 0.6186 | 0.6412 | 0.6162 |
| 0.7886 | 53.89 | 324 | 1.2272 | 0.6551 | 0.6431 | 0.6551 | 0.6319 |
| 0.6794 | 54.89 | 330 | 1.2144 | 0.6806 | 0.6643 | 0.6806 | 0.6629 |
| 0.6794 | 55.89 | 336 | 1.1817 | 0.6806 | 0.6666 | 0.6806 | 0.6642 |
| 0.6459 | 56.89 | 342 | 1.1702 | 0.6782 | 0.6591 | 0.6782 | 0.6574 |
| 0.6459 | 57.89 | 348 | 1.0947 | 0.7037 | 0.6863 | 0.7037 | 0.6883 |
| 0.6075 | 58.89 | 354 | 1.1227 | 0.7037 | 0.6874 | 0.7037 | 0.6867 |
| 0.4979 | 59.89 | 360 | 1.0849 | 0.7083 | 0.6813 | 0.7083 | 0.6895 |
| 0.4979 | 60.89 | 366 | 1.0742 | 0.7153 | 0.6924 | 0.7153 | 0.6976 |
| 0.4895 | 61.89 | 372 | 1.0452 | 0.7245 | 0.7020 | 0.7245 | 0.7057 |
| 0.4895 | 62.89 | 378 | 1.0435 | 0.7361 | 0.7316 | 0.7361 | 0.7235 |
| 0.456 | 63.89 | 384 | 1.0698 | 0.6921 | 0.6835 | 0.6921 | 0.6783 |
| 0.3816 | 64.89 | 390 | 1.0126 | 0.7222 | 0.7064 | 0.7222 | 0.7091 |
| 0.3816 | 65.89 | 396 | 0.9934 | 0.7361 | 0.7247 | 0.7361 | 0.7205 |
| 0.3599 | 66.89 | 402 | 0.9960 | 0.7292 | 0.7213 | 0.7292 | 0.7170 |
| 0.3599 | 67.89 | 408 | 1.0141 | 0.7222 | 0.7148 | 0.7222 | 0.7087 |
| 0.3484 | 68.89 | 414 | 0.9934 | 0.7222 | 0.7125 | 0.7222 | 0.7107 |
| 0.2939 | 69.89 | 420 | 0.9835 | 0.7431 | 0.7417 | 0.7431 | 0.7349 |
| 0.2939 | 70.89 | 426 | 0.9870 | 0.7315 | 0.7275 | 0.7315 | 0.7217 |
| 0.285 | 71.89 | 432 | 0.9656 | 0.7431 | 0.7411 | 0.7431 | 0.7340 |
| 0.285 | 72.89 | 438 | 0.9462 | 0.7338 | 0.7320 | 0.7338 | 0.7267 |
| 0.2463 | 73.89 | 444 | 0.9513 | 0.7454 | 0.7467 | 0.7454 | 0.7384 |
| 0.2328 | 74.89 | 450 | 0.9334 | 0.7361 | 0.7389 | 0.7361 | 0.7286 |
| 0.2328 | 75.89 | 456 | 0.9375 | 0.7384 | 0.7278 | 0.7384 | 0.7291 |
| 0.2208 | 76.89 | 462 | 0.9332 | 0.7407 | 0.7357 | 0.7407 | 0.7322 |
| 0.2208 | 77.89 | 468 | 0.9408 | 0.7384 | 0.7406 | 0.7384 | 0.7346 |
| 0.2177 | 78.89 | 474 | 0.9059 | 0.7222 | 0.7183 | 0.7222 | 0.7136 |
| 0.1734 | 79.89 | 480 | 0.9517 | 0.7315 | 0.7371 | 0.7315 | 0.7257 |
| 0.1734 | 80.89 | 486 | 0.9063 | 0.7523 | 0.7462 | 0.7523 | 0.7424 |
| 0.1791 | 81.89 | 492 | 0.9171 | 0.7454 | 0.7461 | 0.7454 | 0.7386 |
| 0.1791 | 82.89 | 498 | 0.8846 | 0.7523 | 0.7561 | 0.7523 | 0.7485 |
| 0.1681 | 83.89 | 504 | 0.8871 | 0.7384 | 0.7431 | 0.7384 | 0.7320 |
| 0.1573 | 84.89 | 510 | 0.9118 | 0.7454 | 0.7474 | 0.7454 | 0.7395 |
| 0.1573 | 85.89 | 516 | 0.9006 | 0.7407 | 0.7432 | 0.7407 | 0.7366 |
| 0.1439 | 86.89 | 522 | 0.8703 | 0.7616 | 0.7693 | 0.7616 | 0.7579 |
| 0.1439 | 87.89 | 528 | 0.8988 | 0.7454 | 0.7570 | 0.7454 | 0.7401 |
| 0.1362 | 88.89 | 534 | 0.9234 | 0.7454 | 0.7477 | 0.7454 | 0.7396 |
| 0.1249 | 89.89 | 540 | 0.8860 | 0.7500 | 0.7473 | 0.7500 | 0.7425 |
| 0.1249 | 90.89 | 546 | 0.8608 | 0.7546 | 0.7601 | 0.7546 | 0.7513 |
| 0.1264 | 91.89 | 552 | 0.8871 | 0.7593 | 0.7640 | 0.7593 | 0.7560 |
| 0.1264 | 92.89 | 558 | 0.8432 | 0.7639 | 0.7727 | 0.7639 | 0.7599 |
| 0.1201 | 93.89 | 564 | 0.8654 | 0.7639 | 0.7698 | 0.7639 | 0.7569 |
| 0.1117 | 94.89 | 570 | 0.8856 | 0.7454 | 0.7569 | 0.7454 | 0.7415 |
| 0.1117 | 95.89 | 576 | 0.8668 | 0.7546 | 0.7686 | 0.7546 | 0.7535 |
| 0.1128 | 96.89 | 582 | 0.8630 | 0.7662 | 0.7698 | 0.7662 | 0.7619 |
| 0.1128 | 97.89 | 588 | 0.8551 | 0.7731 | 0.7826 | 0.7731 | 0.7696 |
| 0.1155 | 98.89 | 594 | 0.8697 | 0.7708 | 0.7738 | 0.7708 | 0.7643 |
| 0.0987 | 99.89 | 600 | 0.8613 | 0.7546 | 0.7518 | 0.7546 | 0.7484 |
| 0.0987 | 100.89 | 606 | 0.8742 | 0.7569 | 0.7597 | 0.7569 | 0.7524 |
| 0.1063 | 101.89 | 612 | 0.8498 | 0.7755 | 0.7807 | 0.7755 | 0.7712 |
| 0.1063 | 102.89 | 618 | 0.8557 | 0.7708 | 0.7749 | 0.7708 | 0.7655 |
| 0.097 | 103.89 | 624 | 0.8764 | 0.7546 | 0.7634 | 0.7546 | 0.7527 |
| 0.0947 | 104.89 | 630 | 0.8677 | 0.7616 | 0.7628 | 0.7616 | 0.7572 |
| 0.0947 | 105.89 | 636 | 0.8909 | 0.7500 | 0.7614 | 0.7500 | 0.7469 |
| 0.1013 | 106.89 | 642 | 0.8283 | 0.7639 | 0.7621 | 0.7639 | 0.7580 |
| 0.1013 | 107.89 | 648 | 0.8471 | 0.7662 | 0.7864 | 0.7662 | 0.7651 |
| 0.0963 | 108.89 | 654 | 0.8653 | 0.7593 | 0.7701 | 0.7593 | 0.7558 |
| 0.0874 | 109.89 | 660 | 0.8479 | 0.7731 | 0.7834 | 0.7731 | 0.7692 |
| 0.0874 | 110.89 | 666 | 0.8584 | 0.7639 | 0.7719 | 0.7639 | 0.7620 |
| 0.0876 | 111.89 | 672 | 0.8714 | 0.7616 | 0.7600 | 0.7616 | 0.7550 |
| 0.0876 | 112.89 | 678 | 0.8509 | 0.7731 | 0.7847 | 0.7731 | 0.7727 |
| 0.0974 | 113.89 | 684 | 0.8688 | 0.7685 | 0.7741 | 0.7685 | 0.7648 |
| 0.0869 | 114.89 | 690 | 0.8590 | 0.7847 | 0.7932 | 0.7847 | 0.7794 |
| 0.0869 | 115.89 | 696 | 0.8687 | 0.7593 | 0.7703 | 0.7593 | 0.7579 |
| 0.0877 | 116.89 | 702 | 0.8735 | 0.7593 | 0.7698 | 0.7593 | 0.7554 |
| 0.0877 | 117.89 | 708 | 0.8566 | 0.7546 | 0.7732 | 0.7546 | 0.7518 |
| 0.0883 | 118.89 | 714 | 0.8681 | 0.7569 | 0.7591 | 0.7569 | 0.7525 |
| 0.0762 | 119.89 | 720 | 0.8508 | 0.7755 | 0.7715 | 0.7755 | 0.7676 |


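In every row of the table, Recall equals Accuracy while Precision and F1 differ, which is consistent with weighted averaging over the classes. The hook below is a plausible sketch of a `compute_metrics` function that would produce these columns; the weighted averaging and the function itself are assumptions, since the actual training script is not part of this card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Hypothetical metric hook passed to the Trainer; not the original script."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging is assumed; it makes recall coincide with accuracy,
    # as seen in the results table above.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```
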
### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.1+cu117
- Datasets 2.8.0
- Tokenizers 0.11.0