---
license: apache-2.0
base_model: facebook/convnextv2-tiny-1k-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: convnextv2-tiny-1k-224-finetuned-fullwear
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8402777777777778
---
|
|
|
|
|
|
# convnextv2-tiny-1k-224-finetuned-fullwear
|
|
|
This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on a custom image dataset loaded with the `imagefolder` dataset builder.
It achieves the following results on the evaluation set:
- Loss: 0.5203
- Accuracy: 0.8403
|
|
|
## Model description

ConvNeXt V2 is a pure-convolutional architecture that adds Global Response Normalization (GRN) to ConvNeXt and is pretrained with a fully convolutional masked autoencoder (FCMAE). This checkpoint starts from the `tiny` variant pretrained on ImageNet-1k at 224x224 resolution and replaces the classification head with one sized to the fine-tuning labels; the repository name suggests the target classes are clothing ("fullwear") categories, though the label set is not documented here.
|
|
|
## Intended uses & limitations

The model is intended for single-label image classification over the classes it was fine-tuned on, with inputs resized to 224x224. It has not been evaluated on other resolutions or on images outside the fine-tuning label set, so out-of-domain predictions should be treated with caution. A minimal inference sketch follows.
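A minimal sketch using the `pipeline` API from transformers; the repo id below is a placeholder for wherever the fine-tuned weights actually live:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual location of the fine-tuned weights.
classifier = pipeline(
    "image-classification",
    model="your-username/convnextv2-tiny-1k-224-finetuned-fullwear",
)

# Accepts a local path, URL, or PIL.Image; returns the top labels with scores.
for prediction in classifier("example.jpg", top_k=3):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```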
|
|
|
## Training and evaluation data

The data were loaded with the datasets library's `imagefolder` builder, which infers class labels from subdirectory names; the underlying images are not documented here. The granularity of the reported accuracies is consistent with an evaluation split of 144 images (0.8402777... = 121/144 correct).
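A sketch of the typical loading path, assuming a hypothetical `data/fullwear/` directory with one subfolder per class; the 90/10 split below is an assumption, since the actual split is not documented:

```python
from datasets import load_dataset

# Hypothetical layout: data/fullwear/<class_name>/<image>.jpg
# The imagefolder builder infers labels from the subdirectory names.
dataset = load_dataset("imagefolder", data_dir="data/fullwear")

# Carve out a held-out evaluation set (assumed split; seed matches the card).
splits = dataset["train"].train_test_split(test_size=0.1, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]
print(train_ds.features["label"].names)
```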
|
|
|
## Training procedure
|
|
|
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 120
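These values map directly onto `TrainingArguments`; a sketch assuming a single GPU (so 32 x 4 accumulation steps gives the effective batch size of 128), with output path and checkpointing settings as illustrative assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="convnextv2-tiny-1k-224-finetuned-fullwear",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,   # 32 * 4 accumulation steps = 128 effective
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=120,
    eval_strategy="epoch",            # assumed; the log below evaluates once per epoch
    save_strategy="epoch",            # assumed
    load_best_model_at_end=True,      # assumed
    metric_for_best_model="accuracy", # assumed
)
```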
|
|
|
### Training results
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:----:|:---------------:|:--------:|
| 2.4871 | 0.9756 | 10 | 2.4771 | 0.0694 |
| 2.4464 | 1.9512 | 20 | 2.4333 | 0.1528 |
| 2.3911 | 2.9268 | 30 | 2.3670 | 0.2778 |
| 2.3204 | 4.0 | 41 | 2.2617 | 0.3681 |
| 2.206 | 4.9756 | 51 | 2.1445 | 0.3958 |
| 2.0869 | 5.9512 | 61 | 2.0146 | 0.4444 |
| 1.9756 | 6.9268 | 71 | 1.8763 | 0.5139 |
| 1.8124 | 8.0 | 82 | 1.7422 | 0.5486 |
| 1.6624 | 8.9756 | 92 | 1.6629 | 0.5903 |
| 1.587 | 9.9512 | 102 | 1.5474 | 0.6111 |
| 1.4746 | 10.9268 | 112 | 1.4577 | 0.625 |
| 1.359 | 12.0 | 123 | 1.3055 | 0.6736 |
| 1.2412 | 12.9756 | 133 | 1.2241 | 0.6736 |
| 1.1374 | 13.9512 | 143 | 1.2003 | 0.6736 |
| 1.0194 | 14.9268 | 153 | 1.0233 | 0.7569 |
| 0.9705 | 16.0 | 164 | 0.9492 | 0.7847 |
| 0.8949 | 16.9756 | 174 | 0.9246 | 0.75 |
| 0.7959 | 17.9512 | 184 | 0.8148 | 0.7639 |
| 0.7491 | 18.9268 | 194 | 0.7858 | 0.7569 |
| 0.6783 | 20.0 | 205 | 0.8010 | 0.7569 |
| 0.6257 | 20.9756 | 215 | 0.7295 | 0.7847 |
| 0.5999 | 21.9512 | 225 | 0.6219 | 0.8333 |
| 0.5701 | 22.9268 | 235 | 0.5932 | 0.8403 |
| 0.4926 | 24.0 | 246 | 0.5970 | 0.8056 |
| 0.4692 | 24.9756 | 256 | 0.6298 | 0.8194 |
| 0.4393 | 25.9512 | 266 | 0.5857 | 0.8056 |
| 0.419 | 26.9268 | 276 | 0.5203 | 0.8542 |
| 0.3454 | 28.0 | 287 | 0.6084 | 0.8264 |
| 0.36 | 28.9756 | 297 | 0.5928 | 0.8264 |
| 0.3265 | 29.9512 | 307 | 0.5303 | 0.8403 |
| 0.3278 | 30.9268 | 317 | 0.6049 | 0.8194 |
| 0.2766 | 32.0 | 328 | 0.5656 | 0.8264 |
| 0.2805 | 32.9756 | 338 | 0.5003 | 0.8681 |
| 0.2505 | 33.9512 | 348 | 0.5412 | 0.8403 |
| 0.2464 | 34.9268 | 358 | 0.5410 | 0.8333 |
| 0.2166 | 36.0 | 369 | 0.5000 | 0.8472 |
| 0.2 | 36.9756 | 379 | 0.5053 | 0.8056 |
| 0.1914 | 37.9512 | 389 | 0.5161 | 0.8403 |
| 0.186 | 38.9268 | 399 | 0.4242 | 0.8681 |
| 0.1592 | 40.0 | 410 | 0.5059 | 0.8472 |
| 0.1598 | 40.9756 | 420 | 0.5143 | 0.8264 |
| 0.1565 | 41.9512 | 430 | 0.4703 | 0.8542 |
| 0.1598 | 42.9268 | 440 | 0.4384 | 0.8542 |
| 0.139 | 44.0 | 451 | 0.4850 | 0.8403 |
| 0.1137 | 44.9756 | 461 | 0.4405 | 0.8542 |
| 0.1158 | 45.9512 | 471 | 0.5250 | 0.8333 |
| 0.1192 | 46.9268 | 481 | 0.5843 | 0.8194 |
| 0.1271 | 48.0 | 492 | 0.4498 | 0.8611 |
| 0.0914 | 48.9756 | 502 | 0.5167 | 0.8264 |
| 0.1079 | 49.9512 | 512 | 0.4648 | 0.8681 |
| 0.091 | 50.9268 | 522 | 0.5321 | 0.8194 |
| 0.1053 | 52.0 | 533 | 0.4402 | 0.8611 |
| 0.0842 | 52.9756 | 543 | 0.4776 | 0.8542 |
| 0.0961 | 53.9512 | 553 | 0.4762 | 0.8681 |
| 0.0896 | 54.9268 | 563 | 0.4477 | 0.8681 |
| 0.0876 | 56.0 | 574 | 0.4951 | 0.8472 |
| 0.0855 | 56.9756 | 584 | 0.5653 | 0.8125 |
| 0.073 | 57.9512 | 594 | 0.5315 | 0.8472 |
| 0.0804 | 58.9268 | 604 | 0.5064 | 0.8681 |
| 0.0765 | 60.0 | 615 | 0.6316 | 0.8264 |
| 0.0782 | 60.9756 | 625 | 0.5733 | 0.8056 |
| 0.069 | 61.9512 | 635 | 0.6994 | 0.8056 |
| 0.0809 | 62.9268 | 645 | 0.4898 | 0.8611 |
| 0.0829 | 64.0 | 656 | 0.6042 | 0.8194 |
| 0.0735 | 64.9756 | 666 | 0.4758 | 0.8611 |
| 0.0763 | 65.9512 | 676 | 0.4921 | 0.8542 |
| 0.0565 | 66.9268 | 686 | 0.4700 | 0.8681 |
| 0.062 | 68.0 | 697 | 0.4944 | 0.8819 |
| 0.0644 | 68.9756 | 707 | 0.4733 | 0.8681 |
| 0.0659 | 69.9512 | 717 | 0.4703 | 0.8819 |
| 0.0625 | 70.9268 | 727 | 0.5075 | 0.8542 |
| 0.042 | 72.0 | 738 | 0.5464 | 0.8264 |
| 0.056 | 72.9756 | 748 | 0.5186 | 0.8333 |
| 0.0858 | 73.9512 | 758 | 0.5403 | 0.8264 |
| 0.0616 | 74.9268 | 768 | 0.5104 | 0.8472 |
| 0.0777 | 76.0 | 779 | 0.5516 | 0.8403 |
| 0.0668 | 76.9756 | 789 | 0.4918 | 0.8611 |
| 0.0585 | 77.9512 | 799 | 0.5692 | 0.8403 |
| 0.0562 | 78.9268 | 809 | 0.5734 | 0.8403 |
| 0.0653 | 80.0 | 820 | 0.5403 | 0.8264 |
| 0.0434 | 80.9756 | 830 | 0.5108 | 0.8333 |
| 0.0483 | 81.9512 | 840 | 0.5699 | 0.8125 |
| 0.0329 | 82.9268 | 850 | 0.6028 | 0.8056 |
| 0.0431 | 84.0 | 861 | 0.5230 | 0.8333 |
| 0.042 | 84.9756 | 871 | 0.5875 | 0.8194 |
| 0.0449 | 85.9512 | 881 | 0.5180 | 0.8611 |
| 0.0512 | 86.9268 | 891 | 0.5425 | 0.8194 |
| 0.0545 | 88.0 | 902 | 0.5690 | 0.8264 |
| 0.0496 | 88.9756 | 912 | 0.5619 | 0.8611 |
| 0.0449 | 89.9512 | 922 | 0.5626 | 0.8333 |
| 0.0405 | 90.9268 | 932 | 0.5267 | 0.8403 |
| 0.0344 | 92.0 | 943 | 0.5617 | 0.8403 |
| 0.0421 | 92.9756 | 953 | 0.5400 | 0.8611 |
| 0.0341 | 93.9512 | 963 | 0.5729 | 0.8333 |
| 0.0492 | 94.9268 | 973 | 0.5855 | 0.8056 |
| 0.0374 | 96.0 | 984 | 0.6113 | 0.8125 |
| 0.0375 | 96.9756 | 994 | 0.5511 | 0.8403 |
| 0.0373 | 97.9512 | 1004 | 0.4942 | 0.8542 |
| 0.0447 | 98.9268 | 1014 | 0.5031 | 0.8542 |
| 0.0519 | 100.0 | 1025 | 0.5349 | 0.8542 |
| 0.0387 | 100.9756 | 1035 | 0.5511 | 0.8542 |
| 0.0256 | 101.9512 | 1045 | 0.5319 | 0.8403 |
| 0.043 | 102.9268 | 1055 | 0.5605 | 0.8264 |
| 0.029 | 104.0 | 1066 | 0.5776 | 0.8403 |
| 0.0379 | 104.9756 | 1076 | 0.5697 | 0.8472 |
| 0.0445 | 105.9512 | 1086 | 0.5133 | 0.8681 |
| 0.0267 | 106.9268 | 1096 | 0.5076 | 0.8681 |
| 0.044 | 108.0 | 1107 | 0.5260 | 0.8403 |
| 0.0263 | 108.9756 | 1117 | 0.5101 | 0.8542 |
| 0.0247 | 109.9512 | 1127 | 0.4972 | 0.8542 |
| 0.0441 | 110.9268 | 1137 | 0.5094 | 0.8472 |
| 0.0263 | 112.0 | 1148 | 0.5259 | 0.8333 |
| 0.0247 | 112.9756 | 1158 | 0.5323 | 0.8403 |
| 0.0356 | 113.9512 | 1168 | 0.5275 | 0.8403 |
| 0.0297 | 114.9268 | 1178 | 0.5240 | 0.8333 |
| 0.044 | 116.0 | 1189 | 0.5201 | 0.8472 |
| 0.031 | 116.9756 | 1199 | 0.5203 | 0.8403 |
| 0.0369 | 117.0732 | 1200 | 0.5203 | 0.8403 |
|
|
|
|
|
### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1
|
|