---
license: apache-2.0
base_model: ntu-spml/distilhubert
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilhubert-finetuned-gtzan
  results: []
---

# distilhubert-finetuned-gtzan

This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the GTZAN music genre classification dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5086
- Accuracy: 0.89

## Model description

DistilHuBERT is a distilled version of the HuBERT self-supervised speech representation model. This checkpoint adds an audio classification head and fine-tunes the model to predict the musical genre of short audio clips.

## Intended uses & limitations

More information needed

## Training and evaluation data

GTZAN consists of 1,000 30-second music clips evenly split across 10 genres. More information about the exact train/evaluation split used here is needed.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch at the end of this card):
- learning_rate: 4e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- gradient_accumulation_steps: 7
- total_train_batch_size: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.2912        | 0.98  | 21   | 2.2667          | 0.19     |
| 2.2263        | 1.96  | 42   | 2.1460          | 0.48     |
| 1.9552        | 2.99  | 64   | 1.8067          | 0.44     |
| 1.5982        | 3.97  | 85   | 1.5912          | 0.54     |
| 1.5182        | 4.99  | 107  | 1.4077          | 0.61     |
| 1.2855        | 5.97  | 128  | 1.2654          | 0.69     |
| 1.1649        | 7.0   | 150  | 1.1915          | 0.69     |
| 1.0742        | 7.98  | 171  | 1.0769          | 0.75     |
| 1.0495        | 8.96  | 192  | 1.0011          | 0.77     |
| 0.8827        | 9.99  | 214  | 0.9062          | 0.79     |
| 0.7886        | 10.97 | 235  | 0.8333          | 0.83     |
| 0.7019        | 11.99 | 257  | 0.7801          | 0.83     |
| 0.6642        | 12.97 | 278  | 0.7691          | 0.79     |
| 0.5982        | 14.0  | 300  | 0.6984          | 0.82     |
| 0.5002        | 14.98 | 321  | 0.6526          | 0.84     |
| 0.4789        | 15.96 | 342  | 0.5980          | 0.88     |
| 0.3908        | 16.99 | 364  | 0.5874          | 0.86     |
| 0.3892        | 17.97 | 385  | 0.5570          | 0.86     |
| 0.3675        | 18.99 | 407  | 0.5634          | 0.87     |
| 0.303         | 19.97 | 428  | 0.5387          | 0.87     |
| 0.3017        | 21.0  | 450  | 0.5086          | 0.89     |
| 0.2469        | 21.98 | 471  | 0.4969          | 0.89     |
| 0.2542        | 22.96 | 492  | 0.4972          | 0.88     |
| 0.2651        | 23.99 | 514  | 0.4947          | 0.89     |
| 0.2591        | 24.5  | 525  | 0.4929          | 0.89     |

### Framework versions

- Transformers 4.35.0
- PyTorch 2.1.0
- Datasets 2.14.6
- Tokenizers 0.14.1
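
## How to use

The snippet below is a minimal inference sketch using the `transformers` audio-classification pipeline. The model id and audio path are placeholders: substitute the actual Hub repository name of this checkpoint and a local music clip.

```python
from transformers import pipeline

# Placeholder model id; replace with the actual Hub path of this checkpoint.
classifier = pipeline(
    "audio-classification",
    model="your-username/distilhubert-finetuned-gtzan",
)

# "song.wav" is a hypothetical local file; any ~30-second music clip works.
predictions = classifier("song.wav")
print(predictions)  # list of {"label": ..., "score": ...} dicts, best guess first
```

The pipeline decodes and resamples the audio to the sampling rate expected by the model's feature extractor and returns the top-scoring genre labels sorted by score.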
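
## Reproducing the training configuration

The hyperparameters listed under *Training procedure* map roughly onto the standard `transformers` Trainer API as sketched below. This is a reconstruction under assumptions, not the original training script: the output directory and the epoch-level evaluation/save strategies were not reported and are guesses.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilhubert-finetuned-gtzan",  # assumed output directory
    learning_rate=4e-5,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    gradient_accumulation_steps=7,   # effective batch size: 6 * 7 = 42
    num_train_epochs=25,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # assumed: the results table logs once per epoch
    save_strategy="epoch",           # assumed
    load_best_model_at_end=True,     # assumed
    metric_for_best_model="accuracy",
)
# The default AdamW optimizer already matches the reported
# Adam betas=(0.9, 0.999) and epsilon=1e-08, so no explicit optim setting is needed.
```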