---
library_name: transformers
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: resnet-50-finetuned-barkley
  results: []
---
# resnet-50-finetuned-barkley
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.9221
- Precision: 0.8780
- Recall: 0.8618
- F1: 0.8574
- Accuracy: 0.8744
- Top1 Accuracy: 0.8618
- Error Rate: 0.1256
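
The checkpoint can be loaded like any other `transformers` image-classification model. Below is a minimal inference sketch; the repository id `resnet-50-finetuned-barkley` and the example image path are assumptions, so substitute the actual Hub id or local directory.

```python
# Minimal inference sketch (assumed repo id and image path; adjust as needed).
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "resnet-50-finetuned-barkley"  # assumption: replace with the real Hub id or local path
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # assumption: any RGB image from the target domain

# Preprocess and run a forward pass without tracking gradients.
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the top-1 predicted class label.
predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```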
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
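
As an illustration only, the hyperparameters above roughly map onto a `transformers.TrainingArguments` configuration like the sketch below. The `output_dir` and the per-epoch evaluation strategy are assumptions, not taken from the original training script.

```python
# Sketch of a TrainingArguments setup matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet-50-finetuned-barkley",  # assumption: output directory name
    learning_rate=2e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
    eval_strategy="epoch",  # assumption: metrics in the table below are logged once per epoch
)
```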
### Training results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Top1 Accuracy | Error Rate |
---|---|---|---|---|---|---|---|---|---|
1.6171 | 1.0 | 38 | 1.6195 | 0.0663 | 0.1513 | 0.0664 | 0.1738 | 0.1513 | 0.8262 |
1.6149 | 2.0 | 76 | 1.6160 | 0.2953 | 0.1579 | 0.0802 | 0.1785 | 0.1579 | 0.8215 |
1.6119 | 3.0 | 114 | 1.6112 | 0.0804 | 0.1579 | 0.0834 | 0.1772 | 0.1579 | 0.8228 |
1.6041 | 4.0 | 152 | 1.6015 | 0.4161 | 0.1974 | 0.1461 | 0.2155 | 0.1974 | 0.7845 |
1.5945 | 5.0 | 190 | 1.5895 | 0.4089 | 0.2895 | 0.2428 | 0.3092 | 0.2895 | 0.6908 |
1.5777 | 6.0 | 228 | 1.5710 | 0.5764 | 0.4408 | 0.3944 | 0.4663 | 0.4408 | 0.5337 |
1.561 | 7.0 | 266 | 1.5490 | 0.6013 | 0.4934 | 0.4516 | 0.5173 | 0.5 | 0.4827 |
1.536 | 8.0 | 304 | 1.5222 | 0.6377 | 0.5132 | 0.4711 | 0.5450 | 0.5132 | 0.4550 |
1.5081 | 9.0 | 342 | 1.4912 | 0.7595 | 0.5987 | 0.5869 | 0.6250 | 0.5987 | 0.3750 |
1.4756 | 10.0 | 380 | 1.4566 | 0.7579 | 0.6447 | 0.6293 | 0.6683 | 0.6447 | 0.3317 |
1.4387 | 11.0 | 418 | 1.4156 | 0.7914 | 0.6776 | 0.6692 | 0.6985 | 0.6776 | 0.3015 |
1.3993 | 12.0 | 456 | 1.3737 | 0.7997 | 0.6842 | 0.6732 | 0.7080 | 0.6842 | 0.2920 |
1.358 | 13.0 | 494 | 1.3288 | 0.8290 | 0.7039 | 0.7048 | 0.7232 | 0.7039 | 0.2768 |
1.3139 | 14.0 | 532 | 1.2806 | 0.8277 | 0.7434 | 0.7373 | 0.7592 | 0.75 | 0.2408 |
1.262 | 15.0 | 570 | 1.2345 | 0.8478 | 0.7697 | 0.7664 | 0.7829 | 0.7697 | 0.2171 |
1.2184 | 16.0 | 608 | 1.1887 | 0.8323 | 0.7697 | 0.7654 | 0.7818 | 0.7697 | 0.2182 |
1.1803 | 17.0 | 646 | 1.1408 | 0.8423 | 0.7763 | 0.7735 | 0.7931 | 0.7763 | 0.2069 |
1.1422 | 18.0 | 684 | 1.0966 | 0.8594 | 0.8158 | 0.8100 | 0.8317 | 0.8158 | 0.1683 |
1.1032 | 19.0 | 722 | 1.0587 | 0.8431 | 0.8026 | 0.7969 | 0.8145 | 0.8026 | 0.1855 |
1.058 | 20.0 | 760 | 1.0289 | 0.8610 | 0.8355 | 0.8301 | 0.8487 | 0.8355 | 0.1513 |
1.0252 | 21.0 | 798 | 0.9918 | 0.8576 | 0.8421 | 0.8370 | 0.8534 | 0.8421 | 0.1466 |
1.002 | 22.0 | 836 | 0.9727 | 0.8677 | 0.8487 | 0.8435 | 0.8611 | 0.8487 | 0.1389 |
0.9812 | 23.0 | 874 | 0.9465 | 0.8795 | 0.8553 | 0.8497 | 0.8678 | 0.8553 | 0.1322 |
0.9636 | 24.0 | 912 | 0.9331 | 0.8820 | 0.8553 | 0.8485 | 0.8699 | 0.8553 | 0.1301 |
0.9591 | 25.0 | 950 | 0.9221 | 0.8780 | 0.8618 | 0.8574 | 0.8744 | 0.8618 | 0.1256 |
0.948 | 26.0 | 988 | 0.9158 | 0.8780 | 0.8618 | 0.8574 | 0.8744 | 0.8684 | 0.1256 |
0.9384 | 27.0 | 1026 | 0.9017 | 0.8685 | 0.8487 | 0.8431 | 0.8601 | 0.8487 | 0.1399 |
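
The precision, recall, F1, and accuracy columns in the table above could be produced by a `compute_metrics` callback along the following lines. The use of the `evaluate` library and macro averaging are assumptions; the original script may compute these metrics differently.

```python
# Sketch of a compute_metrics function for the per-epoch columns above (assumed, not the original script).
import numpy as np
import evaluate

precision = evaluate.load("precision")
recall = evaluate.load("recall")
f1 = evaluate.load("f1")
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)  # top-1 prediction per example
    top1 = accuracy.compute(predictions=preds, references=labels)["accuracy"]
    return {
        "precision": precision.compute(predictions=preds, references=labels, average="macro")["precision"],
        "recall": recall.compute(predictions=preds, references=labels, average="macro")["recall"],
        "f1": f1.compute(predictions=preds, references=labels, average="macro")["f1"],
        "accuracy": top1,
        "error_rate": 1.0 - top1,  # error rate as the complement of top-1 accuracy
    }
```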
### Framework versions
- Transformers 4.45.2
- Pytorch 2.5.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1