---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
base_model: facebook/deit-base-patch16-224
datasets:
- medmnist-v2
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: organamnist-deit-base-finetuned
  results: []
---

# organamnist-deit-base-finetuned

This model is a fine-tuned version of [facebook/deit-base-patch16-224](https://huggingface.co/facebook/deit-base-patch16-224) on the medmnist-v2 dataset.
It achieves the following results on the evaluation set:

- Loss: 0.1907
- Accuracy: 0.9424
- Precision: 0.9464
- Recall: 0.9395
- F1: 0.9421

## Model description

A PEFT adapter on top of [facebook/deit-base-patch16-224](https://huggingface.co/facebook/deit-base-patch16-224), fine-tuned for 11-class abdominal organ classification on the OrganAMNIST subset of MedMNIST v2.

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was trained and evaluated on the OrganAMNIST subset of the medmnist-v2 dataset; a hypothetical loading sketch appears under "Code sketches" below.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them appears under "Code sketches" below):

- learning_rate: 0.005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.5849        | 1.0   | 540  | 0.1842          | 0.9442   | 0.9449    | 0.9268 | 0.9285 |
| 0.6494        | 2.0   | 1081 | 0.1433          | 0.9499   | 0.9539    | 0.9510 | 0.9509 |
| 0.6059        | 3.0   | 1621 | 0.1171          | 0.9562   | 0.9659    | 0.9569 | 0.9593 |
| 0.3547        | 4.0   | 2162 | 0.0981          | 0.9666   | 0.9709    | 0.9712 | 0.9702 |
| 0.4852        | 5.0   | 2702 | 0.0539          | 0.9817   | 0.9848    | 0.9842 | 0.9842 |
| 0.406         | 6.0   | 3243 | 0.0818          | 0.9749   | 0.9793    | 0.9752 | 0.9768 |
| 0.3074        | 7.0   | 3783 | 0.1289          | 0.9666   | 0.9815    | 0.9778 | 0.9783 |
| 0.2679        | 8.0   | 4324 | 0.0311          | 0.9900   | 0.9916    | 0.9909 | 0.9912 |
| 0.2439        | 9.0   | 4864 | 0.0577          | 0.9851   | 0.9886    | 0.9880 | 0.9881 |
| 0.2169        | 9.99  | 5400 | 0.0720          | 0.9835   | 0.9888    | 0.9882 | 0.9882 |

### Framework versions

- PEFT 0.11.1
- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2
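## Code sketches

The card ships no code, so the following are hedged, illustrative sketches rather than the authors' actual scripts.

### Loading the data

A minimal sketch, assuming the OrganAMNIST split is fetched through the `medmnist` Python package (the card does not state how the data was prepared). DeiT expects 224x224 RGB inputs, so the 28x28 grayscale slices are upsampled and channel-replicated; the normalization constants are the ImageNet defaults used by the DeiT image processor.

```python
from medmnist import OrganAMNIST
from torchvision import transforms

# Upsample 28x28 grayscale slices to the 224x224 RGB format DeiT expects;
# mean/std are the standard ImageNet values.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = OrganAMNIST(split="train", transform=transform, download=True)
val_set = OrganAMNIST(split="val", transform=transform, download=True)
```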
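### Training arguments

A `TrainingArguments` sketch mirroring the hyperparameters listed above. `output_dir` is a placeholder, and the per-epoch evaluation/save strategies are inferred from the results table, not stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="organamnist-deit-base-finetuned",  # placeholder
    learning_rate=5e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # 16 x 4 = total train batch size 64
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # inferred: one validation row per epoch
    save_strategy="epoch",
)
# The default AdamW optimizer already uses betas=(0.9, 0.999) and
# epsilon=1e-08, matching the values listed above.
```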
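### Inference

A hedged inference sketch: the adapter repo id and image path are placeholders, and it assumes the fine-tuned 11-way classifier head was saved alongside the adapter (the PEFT `modules_to_save` convention). If the adapter was saved differently, the head replacement below will not match.

```python
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

base_id = "facebook/deit-base-patch16-224"
adapter_id = "<user>/organamnist-deit-base-finetuned"  # placeholder repo id

processor = AutoImageProcessor.from_pretrained(base_id)

# OrganAMNIST has 11 organ classes, so the 1000-way ImageNet head is
# re-initialized before loading the adapter, which is assumed to carry
# the trained classifier weights.
base = AutoModelForImageClassification.from_pretrained(
    base_id, num_labels=11, ignore_mismatched_sizes=True
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

image = Image.open("abdominal_slice.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted class:", logits.argmax(-1).item())
```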