---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-large-v3
tags:
- whisper-event
- generated_from_trainer
datasets:
- asierhv/composite_corpus_eu_v2.1
language:
- eu
metrics:
- wer
model-index:
- name: Whisper Large Basque
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Common Voice 17.0
      type: mozilla-foundation/common_voice_17_0
      config: eu
      split: test
      args:
        language: eu
    metrics:
    - name: Test WER
      type: wer
      value: 4.47
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: asierhv/composite_corpus_eu_v2.1
      type: asierhv/composite_corpus_eu_v2.1
    metrics:
    - name: WER
      type: wer
      value: 7.100121529400767
---

# Whisper Large Basque

This model is a fine-tuned version of [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3) on the asierhv/composite_corpus_eu_v2.1 dataset. It achieves the following results on the evaluation set:
- Loss: 0.1407
- WER: 7.1001

## Model description

Whisper Large Basque is an automatic speech recognition (ASR) model for Basque (eu), obtained by fine-tuning [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3) on a composite corpus of Basque speech. On the Common Voice 17.0 Basque test set it reaches a word error rate (WER) of 4.47.

## Intended uses & limitations

The model is intended for transcribing Basque speech. As with other Whisper checkpoints, transcription quality may degrade on noisy audio, code-switched speech, or domains poorly represented in the training data, and the model can occasionally hallucinate text on silence or non-speech input.

## Training and evaluation data

The model was trained and evaluated on [asierhv/composite_corpus_eu_v2.1](https://huggingface.co/datasets/asierhv/composite_corpus_eu_v2.1), a composite corpus of Basque speech. The Basque test split of Common Voice 17.0 was used for the external evaluation reported above.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4.375e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 20000
- mixed_precision_training: Native AMP
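For reproducibility, the sketch below shows how these hyperparameters would map onto `Seq2SeqTrainingArguments` in `transformers`. Only the values listed above come from this card; the output directory and evaluation/save cadence are illustrative assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Minimal sketch mapping the hyperparameters above onto the Trainer API.
# Values not stated in this card (output_dir, eval cadence) are placeholders.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-eu",  # placeholder
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    learning_rate=4.375e-6,
    warmup_steps=1000,
    max_steps=20000,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,                        # "Native AMP" mixed precision
    eval_strategy="steps",            # the table below evaluates every 500 steps
    eval_steps=500,
    predict_with_generate=True,       # needed to compute WER during evaluation
)
```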
### Training results

| Training Loss | Epoch | Step  | Validation Loss | WER     |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.2854        | 0.05  | 500   | 0.3763          | 24.9836 |
| 0.1425        | 0.1   | 1000  | 0.3326          | 19.8654 |
| 0.2196        | 0.15  | 1500  | 0.2802          | 16.2475 |
| 0.2338        | 0.2   | 2000  | 0.2536          | 14.6116 |
| 0.1383        | 0.25  | 2500  | 0.2451          | 12.8961 |
| 0.0848        | 0.3   | 3000  | 0.2280          | 12.2464 |
| 0.0854        | 0.35  | 3500  | 0.2152          | 11.4144 |
| 0.1304        | 0.4   | 4000  | 0.2097          | 11.1433 |
| 0.1328        | 0.45  | 4500  | 0.2055          | 10.6011 |
| 0.0737        | 0.5   | 5000  | 0.2079          | 10.5357 |
| 0.0804        | 0.55  | 5500  | 0.2133          | 10.1150 |
| 0.0964        | 0.6   | 6000  | 0.1988          | 9.4606  |
| 0.0811        | 0.65  | 6500  | 0.2019          | 9.4933  |
| 0.0677        | 0.7   | 7000  | 0.1916          | 8.9231  |
| 0.1114        | 0.75  | 7500  | 0.2029          | 9.3250  |
| 0.1142        | 0.8   | 8000  | 0.1895          | 8.9978  |
| 0.0466        | 0.85  | 8500  | 0.1936          | 8.8576  |
| 0.0664        | 0.9   | 9000  | 0.1876          | 8.9698  |
| 0.0759        | 0.95  | 9500  | 0.1827          | 8.8202  |
| 0.0555        | 1.0   | 10000 | 0.1834          | 8.6426  |
| 0.0603        | 0.525 | 10500 | 0.1872          | 9.3344  |
| 0.0727        | 0.55  | 11000 | 0.1838          | 9.3624  |
| 0.0523        | 0.575 | 11500 | 0.2022          | 8.8903  |
| 0.0719        | 0.6   | 12000 | 0.1840          | 9.0072  |
| 0.0505        | 0.625 | 12500 | 0.1860          | 8.5631  |
| 0.0678        | 0.65  | 13000 | 0.1852          | 8.1238  |
| 0.0586        | 0.675 | 13500 | 0.1888          | 8.7641  |
| 0.0818        | 0.7   | 14000 | 0.1822          | 8.2547  |
| 0.0583        | 0.725 | 14500 | 0.1349          | 7.8760  |
| 0.0516        | 0.75  | 15000 | 0.1432          | 7.8386  |
| 0.0721        | 0.775 | 15500 | 0.1439          | 7.7966  |
| 0.0697        | 0.8   | 16000 | 0.1345          | 7.6470  |
| 0.0459        | 0.825 | 16500 | 0.1381          | 7.4881  |
| 0.0533        | 0.85  | 17000 | 0.1422          | 7.2871  |
| 0.0449        | 0.875 | 17500 | 0.1426          | 7.7218  |
| 0.0424        | 0.9   | 18000 | 0.1417          | 7.4367  |
| 0.0714        | 0.925 | 18500 | 0.1337          | 6.9973  |
| 0.0573        | 0.95  | 19000 | 0.1432          | 7.6657  |
| 0.0441        | 0.975 | 19500 | 0.1408          | 7.1001  |
| 0.0453        | 1.0   | 20000 | 0.1407          | 7.1001  |

Note that the epoch column resets after step 10000, which is consistent with the run having been resumed from a checkpoint; the step count is continuous.

### Framework versions

- Transformers 4.49.0.dev0
- PyTorch 2.6.0+cu124
- Datasets 3.2.1.dev0
- Tokenizers 0.21.0
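## How to use

As a quick start, the sketch below transcribes Basque audio with the `transformers` ASR pipeline. The repository id is a placeholder for wherever this checkpoint is hosted; `chunk_length_s` and the language/task generation settings are conventional choices for Whisper, not values taken from this card.

```python
import torch
from transformers import pipeline

# Placeholder repo id: substitute the actual Hub id of this checkpoint.
MODEL_ID = "your-namespace/whisper-large-eu"

asr = pipeline(
    "automatic-speech-recognition",
    model=MODEL_ID,
    torch_dtype=torch.float16,
    device="cuda:0" if torch.cuda.is_available() else "cpu",
    chunk_length_s=30,  # Whisper's native 30 s window, for long-form audio
)

# Force Basque transcription instead of letting Whisper auto-detect the language.
result = asr(
    "audio.wav",
    generate_kwargs={"language": "basque", "task": "transcribe"},
)
print(result["text"])
```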