# resnet101-base_tobacco
This model is a fine-tuned version of microsoft/resnet-101 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.6332
- Accuracy: 0.435
- Brier loss: 0.6886
- NLL: 4.4967
- F1 (micro): 0.435
- F1 (macro): 0.2876
- ECE: 0.2482
- AURC: 0.3432
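The Brier loss and ECE above are calibration metrics computed from the model's softmax probabilities. A minimal NumPy sketch of how these two are typically calculated (the function names are illustrative, not from this repository):

```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """Average |accuracy - confidence| over equal-width confidence bins."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    bins = np.minimum((conf * n_bins).astype(int), n_bins - 1)
    ece = 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece
```

An ECE of 0.2482 therefore means the model's confidence overshoots its accuracy by roughly 25 points on average, which is consistent with the long warmup seen in the training table below.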
## Model description
More information needed
## Intended uses & limitations
More information needed
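For image classification with a checkpoint like this one, inference would normally go through the standard `transformers` image-classification API. A minimal sketch, where the repo id and image path are placeholders (not taken from this card):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

def classify_image(image_path: str, repo_id: str) -> str:
    """Return the predicted label for one image, using the checkpoint's own preprocessing."""
    processor = AutoImageProcessor.from_pretrained(repo_id)
    model = AutoModelForImageClassification.from_pretrained(repo_id)
    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[logits.argmax(-1).item()]

# usage (placeholder repo id and file):
# classify_image("scan.png", "your-username/resnet101-base_tobacco")
```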
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
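With `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`, the learning rate rises linearly to 2e-05 over the first 10% of training steps and then decays linearly to zero. A pure-Python sketch of that schedule, assuming the step counts visible in the results table (13 optimizer steps per epoch, 1300 total):

```python
def linear_schedule_lr(step, total_steps=1300, warmup_ratio=0.1, peak_lr=2e-05):
    """Linear warmup to peak_lr, then linear decay to zero (transformers-style)."""
    warmup_steps = int(total_steps * warmup_ratio)  # 130 steps in this run
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

This mirrors what the Trainer's linear scheduler does internally; the 130-step warmup means the first 10 epochs run at a reduced learning rate, which matches the slow start in the results below.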
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 13 | 2.3065 | 0.06 | 0.9008 | 7.5257 | 0.06 | 0.0563 | 0.1444 | 0.9505 |
No log | 2.0 | 26 | 2.3098 | 0.075 | 0.9014 | 8.6176 | 0.075 | 0.0468 | 0.1535 | 0.9485 |
No log | 3.0 | 39 | 2.3082 | 0.09 | 0.9011 | 7.8490 | 0.09 | 0.0647 | 0.1662 | 0.9336 |
No log | 4.0 | 52 | 2.3056 | 0.12 | 0.9006 | 7.6932 | 0.12 | 0.0809 | 0.1814 | 0.8887 |
No log | 5.0 | 65 | 2.3004 | 0.125 | 0.8995 | 7.1356 | 0.125 | 0.0750 | 0.1841 | 0.8198 |
No log | 6.0 | 78 | 2.2921 | 0.155 | 0.8979 | 5.9637 | 0.155 | 0.0706 | 0.2036 | 0.7930 |
No log | 7.0 | 91 | 2.2917 | 0.165 | 0.8978 | 5.7926 | 0.165 | 0.0785 | 0.2139 | 0.8056 |
No log | 8.0 | 104 | 2.2842 | 0.185 | 0.8963 | 4.7947 | 0.185 | 0.0595 | 0.2244 | 0.8344 |
No log | 9.0 | 117 | 2.2742 | 0.215 | 0.8942 | 4.4573 | 0.2150 | 0.0830 | 0.2424 | 0.7961 |
No log | 10.0 | 130 | 2.2638 | 0.2 | 0.8921 | 4.8564 | 0.2000 | 0.0554 | 0.2376 | 0.7663 |
No log | 11.0 | 143 | 2.2530 | 0.215 | 0.8898 | 5.0772 | 0.2150 | 0.0740 | 0.2467 | 0.7908 |
No log | 12.0 | 156 | 2.2479 | 0.19 | 0.8888 | 5.3276 | 0.19 | 0.0421 | 0.2220 | 0.7856 |
No log | 13.0 | 169 | 2.2406 | 0.18 | 0.8873 | 5.2973 | 0.18 | 0.0308 | 0.2248 | 0.8007 |
No log | 14.0 | 182 | 2.2202 | 0.285 | 0.8826 | 5.4657 | 0.285 | 0.1167 | 0.2855 | 0.6743 |
No log | 15.0 | 195 | 2.2085 | 0.29 | 0.8801 | 5.7797 | 0.29 | 0.1154 | 0.2909 | 0.6660 |
No log | 16.0 | 208 | 2.1850 | 0.305 | 0.8742 | 5.7600 | 0.305 | 0.1194 | 0.3063 | 0.4897 |
No log | 17.0 | 221 | 2.2017 | 0.18 | 0.8789 | 5.7405 | 0.18 | 0.0306 | 0.2309 | 0.7654 |
No log | 18.0 | 234 | 2.1998 | 0.18 | 0.8784 | 5.8985 | 0.18 | 0.0305 | 0.2377 | 0.7525 |
No log | 19.0 | 247 | 2.1429 | 0.285 | 0.8640 | 5.9614 | 0.285 | 0.1117 | 0.2970 | 0.5007 |
No log | 20.0 | 260 | 2.1240 | 0.315 | 0.8587 | 5.9916 | 0.315 | 0.1232 | 0.3057 | 0.4288 |
No log | 21.0 | 273 | 2.0986 | 0.305 | 0.8513 | 5.9764 | 0.305 | 0.1166 | 0.3001 | 0.4526 |
No log | 22.0 | 286 | 2.0909 | 0.315 | 0.8494 | 5.9914 | 0.315 | 0.1234 | 0.3062 | 0.4385 |
No log | 23.0 | 299 | 2.0451 | 0.295 | 0.8313 | 6.1078 | 0.295 | 0.1115 | 0.2901 | 0.4619 |
No log | 24.0 | 312 | 2.0662 | 0.3 | 0.8413 | 6.1029 | 0.3 | 0.1168 | 0.3014 | 0.4544 |
No log | 25.0 | 325 | 2.0235 | 0.3 | 0.8238 | 6.1798 | 0.3 | 0.1156 | 0.2885 | 0.4553 |
No log | 26.0 | 338 | 2.0669 | 0.305 | 0.8439 | 6.2056 | 0.305 | 0.1207 | 0.3046 | 0.4579 |
No log | 27.0 | 351 | 2.0223 | 0.315 | 0.8256 | 6.1083 | 0.315 | 0.1232 | 0.2860 | 0.4308 |
No log | 28.0 | 364 | 2.1075 | 0.185 | 0.8574 | 6.0867 | 0.185 | 0.0370 | 0.2317 | 0.7416 |
No log | 29.0 | 377 | 1.9127 | 0.295 | 0.7709 | 6.1567 | 0.295 | 0.1155 | 0.2464 | 0.4630 |
No log | 30.0 | 390 | 1.9407 | 0.315 | 0.7889 | 6.1398 | 0.315 | 0.1283 | 0.2696 | 0.4244 |
No log | 31.0 | 403 | 1.9099 | 0.305 | 0.7737 | 6.1311 | 0.305 | 0.1216 | 0.2626 | 0.4441 |
No log | 32.0 | 416 | 1.9071 | 0.31 | 0.7731 | 6.1004 | 0.31 | 0.1237 | 0.2803 | 0.4387 |
No log | 33.0 | 429 | 1.9097 | 0.31 | 0.7774 | 6.1658 | 0.31 | 0.1212 | 0.2701 | 0.4328 |
No log | 34.0 | 442 | 1.9008 | 0.3 | 0.7724 | 6.2049 | 0.3 | 0.1180 | 0.2415 | 0.4452 |
No log | 35.0 | 455 | 2.0340 | 0.275 | 0.8382 | 5.8659 | 0.275 | 0.1095 | 0.2873 | 0.6352 |
No log | 36.0 | 468 | 1.9324 | 0.315 | 0.7937 | 6.0328 | 0.315 | 0.1248 | 0.2865 | 0.4177 |
No log | 37.0 | 481 | 2.0698 | 0.18 | 0.8483 | 6.1172 | 0.18 | 0.0306 | 0.2448 | 0.7024 |
No log | 38.0 | 494 | 1.8436 | 0.3 | 0.7492 | 6.1508 | 0.3 | 0.1192 | 0.2461 | 0.4406 |
2.0752 | 39.0 | 507 | 1.8504 | 0.31 | 0.7556 | 6.0528 | 0.31 | 0.1222 | 0.2696 | 0.4355 |
2.0752 | 40.0 | 520 | 1.8523 | 0.315 | 0.7582 | 6.0492 | 0.315 | 0.1245 | 0.2522 | 0.4341 |
2.0752 | 41.0 | 533 | 1.8858 | 0.305 | 0.7785 | 6.1136 | 0.305 | 0.1244 | 0.2756 | 0.4559 |
2.0752 | 42.0 | 546 | 1.8466 | 0.305 | 0.7594 | 5.9124 | 0.305 | 0.1205 | 0.2739 | 0.4469 |
2.0752 | 43.0 | 559 | 1.9921 | 0.195 | 0.8300 | 5.6106 | 0.195 | 0.0490 | 0.2368 | 0.7141 |
2.0752 | 44.0 | 572 | 1.8133 | 0.31 | 0.7447 | 5.6505 | 0.31 | 0.1242 | 0.2708 | 0.4189 |
2.0752 | 45.0 | 585 | 1.8022 | 0.32 | 0.7397 | 5.6263 | 0.32 | 0.1324 | 0.2557 | 0.4213 |
2.0752 | 46.0 | 598 | 1.8361 | 0.32 | 0.7599 | 5.6068 | 0.32 | 0.1281 | 0.2719 | 0.4239 |
2.0752 | 47.0 | 611 | 1.7972 | 0.32 | 0.7376 | 5.8954 | 0.32 | 0.1306 | 0.2418 | 0.4311 |
2.0752 | 48.0 | 624 | 1.7850 | 0.325 | 0.7357 | 5.8208 | 0.325 | 0.1397 | 0.2528 | 0.3984 |
2.0752 | 49.0 | 637 | 1.7808 | 0.315 | 0.7332 | 5.5883 | 0.315 | 0.1325 | 0.2551 | 0.4255 |
2.0752 | 50.0 | 650 | 1.7838 | 0.31 | 0.7338 | 5.6850 | 0.31 | 0.1314 | 0.2530 | 0.4247 |
2.0752 | 51.0 | 663 | 1.7767 | 0.305 | 0.7316 | 5.4974 | 0.305 | 0.1241 | 0.2515 | 0.4253 |
2.0752 | 52.0 | 676 | 1.7607 | 0.32 | 0.7263 | 5.3077 | 0.32 | 0.1321 | 0.2458 | 0.4148 |
2.0752 | 53.0 | 689 | 1.7486 | 0.32 | 0.7224 | 5.1734 | 0.32 | 0.1355 | 0.2510 | 0.4190 |
2.0752 | 54.0 | 702 | 1.7693 | 0.33 | 0.7323 | 5.1578 | 0.33 | 0.1446 | 0.2638 | 0.3970 |
2.0752 | 55.0 | 715 | 1.7476 | 0.325 | 0.7235 | 5.1481 | 0.325 | 0.1602 | 0.2285 | 0.4140 |
2.0752 | 56.0 | 728 | 1.7384 | 0.31 | 0.7189 | 5.3248 | 0.31 | 0.1507 | 0.2295 | 0.4202 |
2.0752 | 57.0 | 741 | 1.7454 | 0.32 | 0.7228 | 5.2669 | 0.32 | 0.1575 | 0.2602 | 0.4218 |
2.0752 | 58.0 | 754 | 1.8063 | 0.33 | 0.7551 | 5.0652 | 0.33 | 0.1574 | 0.2835 | 0.4092 |
2.0752 | 59.0 | 767 | 1.7466 | 0.34 | 0.7237 | 4.9430 | 0.34 | 0.1783 | 0.2729 | 0.4124 |
2.0752 | 60.0 | 780 | 1.7240 | 0.345 | 0.7166 | 5.0165 | 0.345 | 0.1776 | 0.2397 | 0.4118 |
2.0752 | 61.0 | 793 | 1.7105 | 0.325 | 0.7126 | 5.0261 | 0.325 | 0.1647 | 0.2564 | 0.4149 |
2.0752 | 62.0 | 806 | 1.7078 | 0.345 | 0.7157 | 5.0160 | 0.345 | 0.1797 | 0.2612 | 0.4013 |
2.0752 | 63.0 | 819 | 1.7982 | 0.305 | 0.7575 | 4.9876 | 0.305 | 0.1614 | 0.2733 | 0.4650 |
2.0752 | 64.0 | 832 | 1.8072 | 0.33 | 0.7635 | 5.0080 | 0.33 | 0.1954 | 0.2928 | 0.4487 |
2.0752 | 65.0 | 845 | 1.7201 | 0.35 | 0.7180 | 4.8708 | 0.35 | 0.2071 | 0.2445 | 0.4114 |
2.0752 | 66.0 | 858 | 1.7131 | 0.335 | 0.7167 | 4.9248 | 0.335 | 0.1936 | 0.2531 | 0.4223 |
2.0752 | 67.0 | 871 | 1.7071 | 0.345 | 0.7138 | 4.8657 | 0.345 | 0.1948 | 0.2664 | 0.4128 |
2.0752 | 68.0 | 884 | 1.7022 | 0.36 | 0.7128 | 4.7996 | 0.36 | 0.2147 | 0.2443 | 0.4023 |
2.0752 | 69.0 | 897 | 1.6859 | 0.37 | 0.7055 | 4.7318 | 0.37 | 0.2296 | 0.2577 | 0.3909 |
2.0752 | 70.0 | 910 | 1.6860 | 0.37 | 0.7038 | 4.8293 | 0.37 | 0.2314 | 0.2594 | 0.3894 |
2.0752 | 71.0 | 923 | 1.6823 | 0.36 | 0.7038 | 4.7070 | 0.36 | 0.2170 | 0.2485 | 0.3934 |
2.0752 | 72.0 | 936 | 1.7656 | 0.335 | 0.7457 | 4.8009 | 0.335 | 0.2035 | 0.2760 | 0.4503 |
2.0752 | 73.0 | 949 | 1.8235 | 0.32 | 0.7754 | 4.7280 | 0.32 | 0.2028 | 0.2752 | 0.5244 |
2.0752 | 74.0 | 962 | 1.6878 | 0.37 | 0.7073 | 4.7660 | 0.37 | 0.2290 | 0.2455 | 0.3996 |
2.0752 | 75.0 | 975 | 1.6717 | 0.365 | 0.7003 | 4.7709 | 0.3650 | 0.2209 | 0.2404 | 0.3906 |
2.0752 | 76.0 | 988 | 1.6610 | 0.365 | 0.6972 | 4.6921 | 0.3650 | 0.2223 | 0.2640 | 0.3910 |
1.6288 | 77.0 | 1001 | 1.6740 | 0.4 | 0.7016 | 4.6791 | 0.4000 | 0.2519 | 0.2794 | 0.3693 |
1.6288 | 78.0 | 1014 | 1.6792 | 0.385 | 0.7048 | 4.7411 | 0.3850 | 0.2434 | 0.2594 | 0.3913 |
1.6288 | 79.0 | 1027 | 1.6752 | 0.395 | 0.7030 | 4.5595 | 0.395 | 0.2608 | 0.2906 | 0.3887 |
1.6288 | 80.0 | 1040 | 1.6554 | 0.395 | 0.6951 | 4.5213 | 0.395 | 0.2653 | 0.2696 | 0.3821 |
1.6288 | 81.0 | 1053 | 1.6688 | 0.385 | 0.7013 | 4.5993 | 0.3850 | 0.2441 | 0.2614 | 0.3886 |
1.6288 | 82.0 | 1066 | 1.6892 | 0.35 | 0.7121 | 4.6296 | 0.35 | 0.2187 | 0.2701 | 0.4067 |
1.6288 | 83.0 | 1079 | 1.6691 | 0.4 | 0.7031 | 4.5448 | 0.4000 | 0.2570 | 0.2845 | 0.3756 |
1.6288 | 84.0 | 1092 | 1.6544 | 0.39 | 0.6946 | 4.6295 | 0.39 | 0.2357 | 0.2522 | 0.3806 |
1.6288 | 85.0 | 1105 | 1.6592 | 0.395 | 0.6983 | 4.4632 | 0.395 | 0.2515 | 0.2793 | 0.3815 |
1.6288 | 86.0 | 1118 | 1.6526 | 0.4 | 0.6945 | 4.5685 | 0.4000 | 0.2579 | 0.2527 | 0.3781 |
1.6288 | 87.0 | 1131 | 1.6558 | 0.4 | 0.6968 | 4.5767 | 0.4000 | 0.2623 | 0.2435 | 0.3804 |
1.6288 | 88.0 | 1144 | 1.6507 | 0.395 | 0.6961 | 4.5355 | 0.395 | 0.2390 | 0.2554 | 0.3710 |
1.6288 | 89.0 | 1157 | 1.6462 | 0.4 | 0.6941 | 4.5278 | 0.4000 | 0.2525 | 0.2406 | 0.3704 |
1.6288 | 90.0 | 1170 | 1.6490 | 0.39 | 0.6954 | 4.5513 | 0.39 | 0.2430 | 0.2497 | 0.3700 |
1.6288 | 91.0 | 1183 | 1.6568 | 0.405 | 0.6980 | 4.5792 | 0.405 | 0.2545 | 0.2584 | 0.3675 |
1.6288 | 92.0 | 1196 | 1.6421 | 0.41 | 0.6909 | 4.5731 | 0.41 | 0.2666 | 0.2527 | 0.3609 |
1.6288 | 93.0 | 1209 | 1.6489 | 0.405 | 0.6952 | 4.3408 | 0.405 | 0.2695 | 0.2738 | 0.3716 |
1.6288 | 94.0 | 1222 | 1.6440 | 0.41 | 0.6933 | 4.3845 | 0.41 | 0.2713 | 0.2629 | 0.3619 |
1.6288 | 95.0 | 1235 | 1.6411 | 0.435 | 0.6919 | 4.4244 | 0.435 | 0.2878 | 0.2634 | 0.3516 |
1.6288 | 96.0 | 1248 | 1.6391 | 0.41 | 0.6918 | 4.4251 | 0.41 | 0.2628 | 0.2655 | 0.3743 |
1.6288 | 97.0 | 1261 | 1.6341 | 0.42 | 0.6893 | 4.4415 | 0.4200 | 0.2761 | 0.2549 | 0.3598 |
1.6288 | 98.0 | 1274 | 1.6476 | 0.415 | 0.6952 | 4.5149 | 0.415 | 0.2778 | 0.2385 | 0.3639 |
1.6288 | 99.0 | 1287 | 1.6463 | 0.42 | 0.6939 | 4.5027 | 0.4200 | 0.2792 | 0.2806 | 0.3593 |
1.6288 | 100.0 | 1300 | 1.6332 | 0.435 | 0.6886 | 4.4967 | 0.435 | 0.2876 | 0.2482 | 0.3432 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3