---
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.7
results: []
---
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50). The Trainer did not record the dataset name, but the model name suggests the Tobacco3482 document-image dataset.
It achieves the following results on the evaluation set (a sketch of how the calibration metrics are computed follows this list):
- Loss: 0.7861
- Accuracy: 0.705
- Brier Loss: 0.4410
- NLL: 2.6519
- F1 Micro: 0.705
- F1 Macro: 0.6403
- ECE: 0.2724
- AURC: 0.1188
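The Brier loss and ECE above are calibration metrics. The exact evaluation code was not published with this card; the sketch below shows how these two quantities are conventionally computed from softmax probabilities (the 15-bin equal-width ECE binning is an assumption):

```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray, n_classes: int) -> float:
    """Multi-class Brier score: mean squared error between the softmax
    probabilities and the one-hot encoded labels."""
    onehot = np.eye(n_classes)[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray,
                               n_bins: int = 15) -> float:
    """ECE with equal-width confidence bins: the weighted average gap
    between per-bin confidence and per-bin accuracy."""
    confidences = probs.max(axis=1)
    accuracies = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return float(ece)
```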
## Model description
Judging from the model name, this is a ResNet-50 student distilled from a CNN teacher on Tobacco3482 with a combined cross-entropy and knowledge-distillation (CEKD) objective, using temperature 5.0 and mixing weight alpha = 0.7. No further details were recorded; a sketch of the implied loss follows.
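The loss implied by the suffix `CEKD_t5.0_a0.7` is commonly implemented as the Hinton-style distillation objective below. This is a standard formulation, not the author's published code; in particular, which term alpha weights is an assumption.

```python
import torch.nn.functional as F
from torch import Tensor

def cekd_loss(student_logits: Tensor, teacher_logits: Tensor, labels: Tensor,
              temperature: float = 5.0, alpha: float = 0.7) -> Tensor:
    """Weighted sum of hard-label cross-entropy and temperature-scaled
    KL divergence to the teacher (assumed: alpha weights the CE term)."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2  # T^2 keeps the soft-target gradients on the CE scale
    return alpha * ce + (1 - alpha) * kd
```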
## Intended uses & limitations
The intended use appears to be document-image classification on Tobacco3482-style data; limitations were not recorded. A minimal inference sketch is shown below.
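A minimal sketch, assuming the repository id matches this card's title and that the checkpoint loads through the standard image-classification Auto classes:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.7"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("document.png").convert("RGB")  # placeholder document scan
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```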
## Training and evaluation data
Not recorded. Based on the model name, training and evaluation presumably use Tobacco3482, a 10-class document-image classification benchmark of 3,482 scanned documents.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (sketched as `TrainingArguments` after this list):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
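For reference, these settings map onto `transformers.TrainingArguments` roughly as follows. The actual training script (including the distillation trainer) was not published, so this is a sketch of the recorded values only:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.7",
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default.
)
```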
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.7831 | 0.165 | 0.8966 | 8.4414 | 0.165 | 0.1121 | 0.2151 | 0.8335 |
| No log | 2.0 | 26 | 1.7753 | 0.145 | 0.8958 | 8.5715 | 0.145 | 0.0954 | 0.1998 | 0.8332 |
| No log | 3.0 | 39 | 1.7334 | 0.175 | 0.8877 | 6.4682 | 0.175 | 0.0756 | 0.2069 | 0.7896 |
| No log | 4.0 | 52 | 1.6604 | 0.185 | 0.8723 | 6.0351 | 0.185 | 0.0505 | 0.2328 | 0.7549 |
| No log | 5.0 | 65 | 1.5874 | 0.18 | 0.8560 | 6.0732 | 0.18 | 0.0431 | 0.2285 | 0.7506 |
| No log | 6.0 | 78 | 1.5223 | 0.185 | 0.8415 | 6.1638 | 0.185 | 0.0479 | 0.2419 | 0.7530 |
| No log | 7.0 | 91 | 1.4642 | 0.35 | 0.8239 | 6.0328 | 0.35 | 0.1696 | 0.3081 | 0.5219 |
| No log | 8.0 | 104 | 1.3599 | 0.35 | 0.7825 | 6.2102 | 0.35 | 0.1908 | 0.2977 | 0.4172 |
| No log | 9.0 | 117 | 1.3083 | 0.385 | 0.7566 | 5.7128 | 0.3850 | 0.2203 | 0.3012 | 0.3842 |
| No log | 10.0 | 130 | 1.3151 | 0.365 | 0.7670 | 5.1073 | 0.3650 | 0.2150 | 0.2923 | 0.4891 |
| No log | 11.0 | 143 | 1.3736 | 0.295 | 0.7950 | 5.3584 | 0.295 | 0.1747 | 0.2716 | 0.6360 |
| No log | 12.0 | 156 | 1.2655 | 0.425 | 0.7380 | 4.0312 | 0.425 | 0.2789 | 0.3273 | 0.3366 |
| No log | 13.0 | 169 | 1.1696 | 0.475 | 0.6901 | 3.9627 | 0.4750 | 0.3083 | 0.3011 | 0.2825 |
| No log | 14.0 | 182 | 1.2992 | 0.355 | 0.7473 | 3.9098 | 0.3550 | 0.2292 | 0.2675 | 0.4929 |
| No log | 15.0 | 195 | 1.1698 | 0.51 | 0.6881 | 3.7143 | 0.51 | 0.3691 | 0.3333 | 0.3278 |
| No log | 16.0 | 208 | 1.0624 | 0.515 | 0.6274 | 3.8387 | 0.515 | 0.3631 | 0.2821 | 0.2583 |
| No log | 17.0 | 221 | 1.0970 | 0.565 | 0.6421 | 3.3302 | 0.565 | 0.4493 | 0.3362 | 0.2373 |
| No log | 18.0 | 234 | 1.0029 | 0.625 | 0.5883 | 3.3820 | 0.625 | 0.4675 | 0.3005 | 0.1660 |
| No log | 19.0 | 247 | 1.0384 | 0.605 | 0.6093 | 3.3183 | 0.605 | 0.4863 | 0.3252 | 0.2145 |
| No log | 20.0 | 260 | 1.0686 | 0.62 | 0.6234 | 3.0246 | 0.62 | 0.5155 | 0.3625 | 0.2334 |
| No log | 21.0 | 273 | 0.9641 | 0.62 | 0.5685 | 2.9225 | 0.62 | 0.5259 | 0.3103 | 0.2063 |
| No log | 22.0 | 286 | 1.0054 | 0.665 | 0.5849 | 3.0792 | 0.665 | 0.5614 | 0.3636 | 0.1863 |
| No log | 23.0 | 299 | 0.9959 | 0.675 | 0.5734 | 2.9829 | 0.675 | 0.5577 | 0.3619 | 0.1806 |
| No log | 24.0 | 312 | 0.9044 | 0.675 | 0.5267 | 2.8952 | 0.675 | 0.5712 | 0.2989 | 0.1475 |
| No log | 25.0 | 325 | 0.9803 | 0.655 | 0.5627 | 2.7501 | 0.655 | 0.5418 | 0.3415 | 0.1919 |
| No log | 26.0 | 338 | 0.8814 | 0.65 | 0.5176 | 2.8421 | 0.65 | 0.5619 | 0.2665 | 0.1694 |
| No log | 27.0 | 351 | 0.8555 | 0.69 | 0.4928 | 2.7870 | 0.69 | 0.5831 | 0.3091 | 0.1279 |
| No log | 28.0 | 364 | 0.8290 | 0.69 | 0.4777 | 2.6377 | 0.69 | 0.5976 | 0.2551 | 0.1290 |
| No log | 29.0 | 377 | 0.8593 | 0.685 | 0.4949 | 2.5880 | 0.685 | 0.5776 | 0.3083 | 0.1279 |
| No log | 30.0 | 390 | 0.8226 | 0.685 | 0.4678 | 2.8938 | 0.685 | 0.5884 | 0.2820 | 0.1249 |
| No log | 31.0 | 403 | 0.8578 | 0.69 | 0.4857 | 2.6150 | 0.69 | 0.6024 | 0.3109 | 0.1344 |
| No log | 32.0 | 416 | 0.8330 | 0.685 | 0.4753 | 2.5999 | 0.685 | 0.6047 | 0.2688 | 0.1407 |
| No log | 33.0 | 429 | 0.8268 | 0.7 | 0.4683 | 2.6138 | 0.7 | 0.6193 | 0.2913 | 0.1315 |
| No log | 34.0 | 442 | 0.8535 | 0.715 | 0.4749 | 2.5059 | 0.715 | 0.6450 | 0.2931 | 0.1190 |
| No log | 35.0 | 455 | 0.8334 | 0.665 | 0.4752 | 2.3839 | 0.665 | 0.5950 | 0.2762 | 0.1397 |
| No log | 36.0 | 468 | 0.8025 | 0.71 | 0.4553 | 2.4803 | 0.7100 | 0.6302 | 0.2889 | 0.1178 |
| No log | 37.0 | 481 | 0.8142 | 0.715 | 0.4563 | 2.6785 | 0.715 | 0.6426 | 0.2989 | 0.1048 |
| No log | 38.0 | 494 | 0.8124 | 0.7 | 0.4538 | 2.5320 | 0.7 | 0.6332 | 0.2594 | 0.1132 |
| 0.9303 | 39.0 | 507 | 0.7888 | 0.69 | 0.4452 | 2.6427 | 0.69 | 0.6269 | 0.2583 | 0.1224 |
| 0.9303 | 40.0 | 520 | 0.7907 | 0.705 | 0.4458 | 2.6942 | 0.705 | 0.6367 | 0.2688 | 0.1155 |
| 0.9303 | 41.0 | 533 | 0.7918 | 0.71 | 0.4442 | 2.4378 | 0.7100 | 0.6558 | 0.2816 | 0.1132 |
| 0.9303 | 42.0 | 546 | 0.8005 | 0.725 | 0.4479 | 2.6088 | 0.7250 | 0.6576 | 0.2914 | 0.1049 |
| 0.9303 | 43.0 | 559 | 0.7879 | 0.72 | 0.4421 | 2.7052 | 0.72 | 0.6592 | 0.2741 | 0.1122 |
| 0.9303 | 44.0 | 572 | 0.7910 | 0.71 | 0.4461 | 2.6463 | 0.7100 | 0.6463 | 0.3119 | 0.1188 |
| 0.9303 | 45.0 | 585 | 0.7922 | 0.705 | 0.4450 | 2.6453 | 0.705 | 0.6481 | 0.2753 | 0.1211 |
| 0.9303 | 46.0 | 598 | 0.7915 | 0.715 | 0.4429 | 2.6970 | 0.715 | 0.6526 | 0.2741 | 0.1107 |
| 0.9303 | 47.0 | 611 | 0.7809 | 0.705 | 0.4370 | 2.6841 | 0.705 | 0.6453 | 0.2734 | 0.1158 |
| 0.9303 | 48.0 | 624 | 0.7771 | 0.705 | 0.4350 | 2.6168 | 0.705 | 0.6423 | 0.2652 | 0.1139 |
| 0.9303 | 49.0 | 637 | 0.7826 | 0.705 | 0.4377 | 2.5091 | 0.705 | 0.6423 | 0.2758 | 0.1202 |
| 0.9303 | 50.0 | 650 | 0.7861 | 0.705 | 0.4410 | 2.6519 | 0.705 | 0.6403 | 0.2724 | 0.1188 |
### Framework versions
- Transformers 4.33.3
- PyTorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3