resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.0_a0.5
This model is a fine-tuned version of bdpc/resnet101-base_tobacco on the Tobacco3482 dataset. It achieves the following results on the evaluation set (an inference sketch follows the metrics list):
- Loss: 0.8577
- Accuracy: 0.53
- Brier Loss: 0.6406
- NLL (negative log-likelihood): 2.1208
- F1 Micro: 0.53
- F1 Macro: 0.4957
- ECE (expected calibration error): 0.3004
- AURC (area under the risk-coverage curve): 0.3168
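The usage sections below are placeholders, so here is a minimal, untested inference sketch using the standard Transformers image-classification API. The repo id is taken from the title, and the input file name is a hypothetical example.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id taken from the model name above; adjust if the hub path differs.
repo_id = "bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.0_a0.5"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

# "document.png" is a stand-in for a Tobacco3482-style document scan.
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```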
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
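The model name encodes a knowledge-distillation setup (`kd_CEKD_t1.0_a0.5`), which suggests a combined cross-entropy + distillation objective with temperature 1.0 and mixing weight 0.5. The training code is not published; the sketch below is one common reading of such a loss (Hinton-style softened-logit KD), with all function and variable names chosen for illustration.

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=1.0, alpha=0.5):
    """Assumed CE + KD objective; the exact weighting convention is a guess."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2  # standard T^2 scaling to keep gradient magnitudes comparable
    return alpha * ce + (1.0 - alpha) * kd
```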
Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 0.0001
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
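These settings map directly onto Transformers' `TrainingArguments`; the sketch below is an assumed reconstruction (the `output_dir` is hypothetical, and the actual run may have used additional, unlisted options).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.0_a0.5",
    learning_rate=1e-4,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 match the library defaults
    # (adam_beta1, adam_beta2, adam_epsilon), so they are not set explicitly.
)
```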
Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 4 | 1.4267 | 0.05 | 0.9008 | 9.6592 | 0.0500 | 0.0177 | 0.1432 | 0.9439 |
No log | 2.0 | 8 | 1.4006 | 0.155 | 0.8969 | 7.9140 | 0.155 | 0.0268 | 0.2365 | 0.9603 |
No log | 3.0 | 12 | 1.4621 | 0.155 | 0.9457 | 13.3695 | 0.155 | 0.0268 | 0.3013 | 0.9107 |
No log | 4.0 | 16 | 2.1836 | 0.155 | 1.3252 | 12.8977 | 0.155 | 0.0268 | 0.6400 | 0.7514 |
No log | 5.0 | 20 | 2.4365 | 0.155 | 1.3998 | 8.4435 | 0.155 | 0.0268 | 0.7030 | 0.6102 |
No log | 6.0 | 24 | 2.1554 | 0.155 | 1.2534 | 6.9190 | 0.155 | 0.0279 | 0.5987 | 0.6271 |
No log | 7.0 | 28 | 1.5617 | 0.175 | 0.9637 | 5.7454 | 0.175 | 0.0462 | 0.3802 | 0.6485 |
No log | 8.0 | 32 | 1.3267 | 0.245 | 0.8707 | 5.2368 | 0.245 | 0.0835 | 0.2961 | 0.5438 |
No log | 9.0 | 36 | 1.2434 | 0.19 | 0.8886 | 5.0360 | 0.19 | 0.0471 | 0.3198 | 0.7720 |
No log | 10.0 | 40 | 1.0721 | 0.305 | 0.8123 | 4.5157 | 0.305 | 0.1762 | 0.2684 | 0.5269 |
No log | 11.0 | 44 | 1.1256 | 0.22 | 0.8429 | 3.9215 | 0.22 | 0.1083 | 0.2812 | 0.7346 |
No log | 12.0 | 48 | 0.9865 | 0.35 | 0.7676 | 3.4553 | 0.35 | 0.2565 | 0.2884 | 0.4790 |
No log | 13.0 | 52 | 1.0206 | 0.355 | 0.7899 | 3.3582 | 0.3550 | 0.2278 | 0.2954 | 0.5883 |
No log | 14.0 | 56 | 0.9096 | 0.415 | 0.6994 | 3.2174 | 0.415 | 0.3147 | 0.2563 | 0.3596 |
No log | 15.0 | 60 | 0.9187 | 0.415 | 0.7129 | 3.2059 | 0.415 | 0.2742 | 0.2941 | 0.3971 |
No log | 16.0 | 64 | 0.8905 | 0.395 | 0.6956 | 2.9931 | 0.395 | 0.2618 | 0.2590 | 0.3826 |
No log | 17.0 | 68 | 0.9108 | 0.425 | 0.7073 | 3.1634 | 0.425 | 0.2855 | 0.2995 | 0.3685 |
No log | 18.0 | 72 | 0.8769 | 0.465 | 0.6706 | 3.1088 | 0.465 | 0.3652 | 0.2855 | 0.3261 |
No log | 19.0 | 76 | 0.8585 | 0.475 | 0.6687 | 2.8710 | 0.4750 | 0.3884 | 0.2916 | 0.3282 |
No log | 20.0 | 80 | 0.9822 | 0.405 | 0.7378 | 2.8889 | 0.405 | 0.3570 | 0.2850 | 0.4895 |
No log | 21.0 | 84 | 0.9324 | 0.445 | 0.6992 | 2.7975 | 0.445 | 0.3553 | 0.3021 | 0.3762 |
No log | 22.0 | 88 | 1.0330 | 0.42 | 0.7350 | 2.7487 | 0.4200 | 0.3506 | 0.2984 | 0.4771 |
No log | 23.0 | 92 | 0.8755 | 0.455 | 0.6674 | 2.5903 | 0.455 | 0.3415 | 0.2570 | 0.3352 |
No log | 24.0 | 96 | 0.8651 | 0.47 | 0.6443 | 2.8456 | 0.47 | 0.3800 | 0.2451 | 0.2975 |
No log | 25.0 | 100 | 0.9567 | 0.445 | 0.7150 | 2.7083 | 0.445 | 0.3727 | 0.2667 | 0.4676 |
No log | 26.0 | 104 | 1.0224 | 0.42 | 0.7376 | 2.4408 | 0.4200 | 0.3367 | 0.2968 | 0.5019 |
No log | 27.0 | 108 | 0.8365 | 0.525 | 0.6407 | 2.6426 | 0.525 | 0.4496 | 0.2960 | 0.2657 |
No log | 28.0 | 112 | 0.9798 | 0.425 | 0.7287 | 2.6379 | 0.425 | 0.3489 | 0.2640 | 0.4668 |
No log | 29.0 | 116 | 0.9226 | 0.44 | 0.6965 | 2.5748 | 0.44 | 0.3669 | 0.2561 | 0.4054 |
No log | 30.0 | 120 | 0.8303 | 0.49 | 0.6398 | 2.4839 | 0.49 | 0.3924 | 0.2981 | 0.2936 |
No log | 31.0 | 124 | 0.8426 | 0.52 | 0.6478 | 2.5282 | 0.52 | 0.4322 | 0.3109 | 0.3084 |
No log | 32.0 | 128 | 0.9111 | 0.45 | 0.6970 | 2.3870 | 0.45 | 0.3947 | 0.2837 | 0.4448 |
No log | 33.0 | 132 | 0.8723 | 0.51 | 0.6524 | 2.6124 | 0.51 | 0.4170 | 0.2536 | 0.3365 |
No log | 34.0 | 136 | 0.8936 | 0.47 | 0.6671 | 2.8892 | 0.47 | 0.3814 | 0.2436 | 0.3357 |
No log | 35.0 | 140 | 1.2870 | 0.42 | 0.7660 | 4.4020 | 0.4200 | 0.3468 | 0.2860 | 0.4606 |
No log | 36.0 | 144 | 0.9991 | 0.455 | 0.7289 | 2.6973 | 0.455 | 0.4132 | 0.3272 | 0.4684 |
No log | 37.0 | 148 | 1.6352 | 0.365 | 0.8356 | 4.7695 | 0.3650 | 0.3020 | 0.3312 | 0.6069 |
No log | 38.0 | 152 | 1.3014 | 0.39 | 0.8213 | 2.9436 | 0.39 | 0.3382 | 0.3262 | 0.5476 |
No log | 39.0 | 156 | 1.0294 | 0.415 | 0.7361 | 2.7188 | 0.415 | 0.3446 | 0.2454 | 0.4632 |
No log | 40.0 | 160 | 0.8825 | 0.52 | 0.6538 | 2.3887 | 0.52 | 0.4608 | 0.2721 | 0.3186 |
No log | 41.0 | 164 | 0.8572 | 0.54 | 0.6288 | 2.4201 | 0.54 | 0.4822 | 0.2963 | 0.2899 |
No log | 42.0 | 168 | 0.8393 | 0.535 | 0.6291 | 2.3587 | 0.535 | 0.4726 | 0.2824 | 0.2937 |
No log | 43.0 | 172 | 0.8369 | 0.515 | 0.6303 | 2.4060 | 0.515 | 0.4583 | 0.2689 | 0.2903 |
No log | 44.0 | 176 | 0.8458 | 0.49 | 0.6346 | 2.3323 | 0.49 | 0.4428 | 0.2526 | 0.2951 |
No log | 45.0 | 180 | 0.8446 | 0.49 | 0.6367 | 2.2207 | 0.49 | 0.4289 | 0.2655 | 0.3041 |
No log | 46.0 | 184 | 0.8324 | 0.54 | 0.6289 | 2.3685 | 0.54 | 0.4779 | 0.2571 | 0.2873 |
No log | 47.0 | 188 | 0.8658 | 0.515 | 0.6486 | 2.3922 | 0.515 | 0.4584 | 0.2623 | 0.3100 |
No log | 48.0 | 192 | 0.8516 | 0.525 | 0.6410 | 2.4448 | 0.525 | 0.4700 | 0.3006 | 0.3044 |
No log | 49.0 | 196 | 0.8520 | 0.55 | 0.6350 | 2.2049 | 0.55 | 0.4947 | 0.3030 | 0.2980 |
No log | 50.0 | 200 | 0.8577 | 0.53 | 0.6406 | 2.1208 | 0.53 | 0.4957 | 0.3004 | 0.3168 |
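Besides accuracy, the table tracks calibration metrics (Brier Loss, NLL, ECE) and selective-prediction quality (AURC). The evaluation code is not included with the card; below is a minimal sketch of how multiclass Brier score and ECE are commonly computed from predicted probabilities, assuming 15 equal-width confidence bins for ECE (the binning used for the reported numbers is unknown).

```python
import numpy as np

def brier_score(probs, labels, num_classes):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(num_classes)[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=15):
    """Gap between confidence and accuracy, averaged over confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return ece
```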
Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.2.0.dev20231112+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1