# Whisper Large v3 Fine-Tuned Finnish
This model is a fine-tuned version of openai/whisper-large-v3 on the Finnish subset of the mozilla-foundation/common_voice_13_0 dataset. It achieves the following results on the evaluation set:
- Loss: 0.3970
- WER: 27.1397 (word error rate, %)
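For reference, a WER figure like the one above can be recomputed with the 🤗 `evaluate` library. The snippet below is a minimal sketch: the transcript strings are placeholders, and the actual evaluation script behind this card is not published.

```python
# Minimal sketch of recomputing WER with the `evaluate` library.
# The reference/prediction strings below are placeholders, not data
# from this model's evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

references = ["hyvää huomenta kaikille"]   # ground-truth transcript (placeholder)
predictions = ["hyvä huomenta kaikille"]   # model output (placeholder)

# `evaluate` returns WER as a fraction; multiply by 100 to match the
# percentage-style figures reported in this card (e.g. 27.1397).
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")
```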
## Model description
More information needed
## Intended uses & limitations
More information needed
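Although this section is blank, the checkpoint should be loadable through the standard 🤗 Transformers ASR pipeline. The sketch below is illustrative, not the author's documented usage; `audio.mp3` is a placeholder for any Finnish speech recording.

```python
# Minimal sketch, assuming the standard 🤗 Transformers ASR pipeline;
# "audio.mp3" is a placeholder for any Finnish speech file.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="enakilci/whisper-large-v3-fi-1600steps-8batch-2grad_steps-0.0001lr",
    torch_dtype=torch.float16,
    device="cuda:0",  # drop this (or use device=-1) to run on CPU
)

# Pinning task/language avoids Whisper auto-detecting the language.
result = asr("audio.mp3", generate_kwargs={"language": "finnish", "task": "transcribe"})
print(result["text"])
```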
## Training and evaluation data
More information needed
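The exact data splits are not documented here. As a hedged sketch, the Finnish ("fi") configuration implied by the model name can be loaded as follows; note that Common Voice 13.0 is a gated dataset, so you must accept its terms on the Hub and be logged in.

```python
# Sketch only: the "fi" config and the split choice are assumptions
# inferred from the model name, not documented in this card.
from datasets import Audio, load_dataset

common_voice = load_dataset("mozilla-foundation/common_voice_13_0", "fi", split="train")

# Whisper expects 16 kHz input; Common Voice ships 48 kHz MP3s.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))
print(common_voice)
```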
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged sketch mapping them onto `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- training_steps: 1600
- mixed_precision_training: Native AMP
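As noted above, here is how these values map onto 🤗 Transformers' `Seq2SeqTrainingArguments`. This is a reconstruction, not the author's script; `output_dir` is a placeholder, and the Adam betas/epsilon are the library defaults already listed.

```python
# Reconstruction of the listed hyperparameters as Seq2SeqTrainingArguments.
# output_dir is a placeholder; the author's actual training script is not
# published with this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v3-fi",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,       # effective train batch size: 8 * 2 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=50,
    max_steps=1600,
    fp16=True,                           # "Native AMP" mixed precision
    evaluation_strategy="steps",         # the results table logs eval every 50 steps
    eval_steps=50,
    logging_steps=50,
    predict_with_generate=True,          # required to compute WER at eval time
)
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults,
# so no extra optimizer arguments are needed.
```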
### Training results
Training Loss | Epoch | Step | Validation Loss | WER (%) |
---|---|---|---|---|
0.7157 | 0.21 | 50 | 0.4892 | 42.8216 |
0.6602 | 0.42 | 100 | 0.6444 | 58.3379 |
0.5952 | 0.63 | 150 | 0.6134 | 52.3560 |
0.5649 | 0.84 | 200 | 0.5645 | 53.3499 |
0.5078 | 1.05 | 250 | 0.5867 | 69.4736 |
0.289 | 1.26 | 300 | 0.6150 | 55.5034 |
0.3318 | 1.47 | 350 | 0.5309 | 40.3000 |
0.2859 | 1.68 | 400 | 0.5462 | 41.3584 |
0.3091 | 1.89 | 450 | 0.4891 | 38.6159 |
0.2089 | 2.11 | 500 | 0.5305 | 41.0915 |
0.148 | 2.32 | 550 | 0.5124 | 37.8888 |
0.1538 | 2.53 | 600 | 0.5075 | 36.9225 |
0.1515 | 2.74 | 650 | 0.5187 | 37.9440 |
0.1532 | 2.95 | 700 | 0.4666 | 35.4040 |
0.0982 | 3.16 | 750 | 0.4934 | 36.7385 |
0.0832 | 3.37 | 800 | 0.4796 | 34.3641 |
0.0782 | 3.58 | 850 | 0.4742 | 39.5270 |
0.0724 | 3.79 | 900 | 0.4634 | 42.7388 |
0.0733 | 4.0 | 950 | 0.4544 | 38.6987 |
0.0409 | 4.21 | 1000 | 0.4778 | 33.1953 |
0.0437 | 4.42 | 1050 | 0.4584 | 33.3793 |
0.036 | 4.63 | 1100 | 0.4476 | 33.9223 |
0.0321 | 4.84 | 1150 | 0.4538 | 32.0725 |
0.0257 | 5.05 | 1200 | 0.4390 | 37.2538 |
0.0133 | 5.26 | 1250 | 0.4319 | 30.7289 |
0.0104 | 5.47 | 1300 | 0.4065 | 30.5264 |
0.0122 | 5.68 | 1350 | 0.4012 | 28.5754 |
0.0071 | 5.89 | 1400 | 0.4113 | 28.4925 |
0.0038 | 6.11 | 1450 | 0.3939 | 26.8820 |
0.0019 | 6.32 | 1500 | 0.3960 | 26.8820 |
0.0021 | 6.53 | 1550 | 0.3978 | 27.1305 |
0.0017 | 6.74 | 1600 | 0.3970 | 27.1397 |
### Framework versions
- Transformers 4.37.0.dev0
- PyTorch 2.0.1
- Datasets 2.16.1
- Tokenizers 0.15.0