---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: vit-base-kidney-stone-Michel_Daudon_-w256_1k_v1-_MIX
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: test
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.83375
    - name: Precision
      type: precision
      value: 0.8588680878951838
    - name: Recall
      type: recall
      value: 0.83375
    - name: F1
      type: f1
      value: 0.8355968544321966
---

# vit-base-kidney-stone-Michel_Daudon_-w256_1k_v1-_MIX

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4940
- Accuracy: 0.8337
- Precision: 0.8589
- Recall: 0.8337
- F1: 0.8356

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0002
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
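These settings map directly onto `transformers.TrainingArguments`. A minimal sketch for reproducibility; the `output_dir` is a placeholder, and the 100-step evaluation cadence is inferred from the results table below rather than taken from the original training script:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-kidney-stone-Michel_Daudon_-w256_1k_v1-_MIX",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,              # Native AMP mixed-precision training
    eval_strategy="steps",  # assumed: the table below evaluates every 100 steps
    eval_steps=100,
)
```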
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.1919 | 0.3333 | 100 | 0.4940 | 0.8337 | 0.8589 | 0.8337 | 0.8356 |
| 0.1697 | 0.6667 | 200 | 0.6993 | 0.8092 | 0.8485 | 0.8092 | 0.8059 |
| 0.1514 | 1.0 | 300 | 0.5555 | 0.8442 | 0.8565 | 0.8442 | 0.8443 |
| 0.0991 | 1.3333 | 400 | 0.5918 | 0.8467 | 0.8741 | 0.8467 | 0.8453 |
| 0.0415 | 1.6667 | 500 | 0.6080 | 0.8558 | 0.8690 | 0.8558 | 0.8553 |
| 0.1112 | 2.0 | 600 | 0.9788 | 0.7983 | 0.8485 | 0.7983 | 0.8028 |
| 0.0658 | 2.3333 | 700 | 1.0272 | 0.8004 | 0.8310 | 0.8004 | 0.8002 |
| 0.0977 | 2.6667 | 800 | 0.6861 | 0.8479 | 0.8570 | 0.8479 | 0.8482 |
| 0.03 | 3.0 | 900 | 0.8317 | 0.8025 | 0.8225 | 0.8025 | 0.8048 |
| 0.0253 | 3.3333 | 1000 | 0.8574 | 0.8242 | 0.8408 | 0.8242 | 0.8254 |
| 0.0564 | 3.6667 | 1100 | 0.8591 | 0.8392 | 0.8513 | 0.8392 | 0.8343 |
| 0.0285 | 4.0 | 1200 | 1.3453 | 0.7512 | 0.8090 | 0.7512 | 0.7484 |
| 0.002 | 4.3333 | 1300 | 0.9746 | 0.8192 | 0.8381 | 0.8192 | 0.8227 |
| 0.0214 | 4.6667 | 1400 | 0.7404 | 0.8646 | 0.8641 | 0.8646 | 0.8572 |
| 0.0282 | 5.0 | 1500 | 1.0063 | 0.8233 | 0.8486 | 0.8233 | 0.8219 |
| 0.03 | 5.3333 | 1600 | 1.0066 | 0.8025 | 0.8376 | 0.8025 | 0.8058 |
| 0.028 | 5.6667 | 1700 | 1.1451 | 0.8108 | 0.8325 | 0.8108 | 0.8067 |
| 0.0078 | 6.0 | 1800 | 1.0700 | 0.805 | 0.8220 | 0.805 | 0.8045 |
| 0.0008 | 6.3333 | 1900 | 1.0180 | 0.8146 | 0.8303 | 0.8146 | 0.8165 |
| 0.0008 | 6.6667 | 2000 | 0.9882 | 0.8246 | 0.8401 | 0.8246 | 0.8236 |
| 0.0006 | 7.0 | 2100 | 1.0366 | 0.8283 | 0.8424 | 0.8283 | 0.8270 |
| 0.0009 | 7.3333 | 2200 | 1.1136 | 0.8121 | 0.8309 | 0.8121 | 0.8143 |
| 0.0068 | 7.6667 | 2300 | 1.0873 | 0.8117 | 0.8128 | 0.8117 | 0.8015 |
| 0.0006 | 8.0 | 2400 | 0.8601 | 0.8325 | 0.8383 | 0.8325 | 0.8292 |
| 0.0187 | 8.3333 | 2500 | 0.9700 | 0.8258 | 0.8375 | 0.8258 | 0.8241 |
| 0.0005 | 8.6667 | 2600 | 0.8825 | 0.8175 | 0.8339 | 0.8175 | 0.8199 |
| 0.0005 | 9.0 | 2700 | 1.0314 | 0.8242 | 0.8455 | 0.8242 | 0.8230 |
| 0.0004 | 9.3333 | 2800 | 1.0323 | 0.8233 | 0.8443 | 0.8233 | 0.8230 |
| 0.0003 | 9.6667 | 2900 | 1.0397 | 0.8229 | 0.8433 | 0.8229 | 0.8229 |
| 0.0003 | 10.0 | 3000 | 1.0473 | 0.8237 | 0.8437 | 0.8237 | 0.8239 |
| 0.0003 | 10.3333 | 3100 | 1.0536 | 0.8229 | 0.8428 | 0.8229 | 0.8233 |
| 0.0003 | 10.6667 | 3200 | 1.0605 | 0.8229 | 0.8429 | 0.8229 | 0.8234 |
| 0.0003 | 11.0 | 3300 | 1.0667 | 0.8229 | 0.8429 | 0.8229 | 0.8234 |
| 0.0002 | 11.3333 | 3400 | 1.0711 | 0.8237 | 0.8436 | 0.8237 | 0.8243 |
| 0.0002 | 11.6667 | 3500 | 1.0750 | 0.8246 | 0.8441 | 0.8246 | 0.8251 |
| 0.0002 | 12.0 | 3600 | 1.0804 | 0.825 | 0.8443 | 0.825 | 0.8257 |
| 0.0002 | 12.3333 | 3700 | 1.0839 | 0.825 | 0.8440 | 0.825 | 0.8257 |
| 0.0002 | 12.6667 | 3800 | 1.0875 | 0.8246 | 0.8436 | 0.8246 | 0.8253 |
| 0.0002 | 13.0 | 3900 | 1.0909 | 0.8246 | 0.8436 | 0.8246 | 0.8253 |
| 0.0002 | 13.3333 | 4000 | 1.0930 | 0.8246 | 0.8436 | 0.8246 | 0.8253 |
| 0.0002 | 13.6667 | 4100 | 1.0954 | 0.8237 | 0.8429 | 0.8237 | 0.8246 |
| 0.0002 | 14.0 | 4200 | 1.0975 | 0.8237 | 0.8429 | 0.8237 | 0.8246 |
| 0.0002 | 14.3333 | 4300 | 1.0988 | 0.8237 | 0.8429 | 0.8237 | 0.8246 |
| 0.0002 | 14.6667 | 4400 | 1.0997 | 0.8237 | 0.8429 | 0.8237 | 0.8246 |
| 0.0002 | 15.0 | 4500 | 1.1000 | 0.8237 | 0.8429 | 0.8237 | 0.8246 |

### Framework versions

- Transformers 4.48.2
- Pytorch 2.6.0+cu126
- Datasets 3.2.0
- Tokenizers 0.21.0
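## How to use

A minimal inference sketch using the `image-classification` pipeline. The Hub repository path and image filename below are placeholders, since the card does not state where the checkpoint is published:

```python
from transformers import pipeline

# Placeholder repo id: replace with the actual Hub path of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="<user>/vit-base-kidney-stone-Michel_Daudon_-w256_1k_v1-_MIX",
)

# Classify a kidney-stone section image (filename is a placeholder);
# the ViT processor resizes inputs to 224x224 internally.
predictions = classifier("kidney_stone_section.jpg")
print(predictions)  # list of {"label": ..., "score": ...}, best first
```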