# Finetune Whisper on Frisian and English
This model is a fine-tuned version of distil-small.en on the librispeech dataset. It achieves the following results on the evaluation set:
- Loss: 0.5012
- Wer: 3.5814
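Since this is a Whisper-family (distil-small.en) checkpoint, it can be loaded with the standard `transformers` ASR pipeline. A minimal sketch, assuming a placeholder Hub id (substitute the actual repository id of this checkpoint):

```python
from transformers import pipeline

# Placeholder repo id -- replace with this model's actual Hub id.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/distil-small.en-librispeech",
)

# Transcribe a local audio file (16 kHz mono audio works out of the box).
result = asr("sample.wav")
print(result["text"])
```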
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
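The exact values are not recorded here. As a rough sketch, a `Seq2SeqTrainingArguments` configuration consistent with the evaluation cadence visible in the results table (evaluation every 100 steps, 1000 steps total) might look like the following; every other value below is an illustrative assumption, not the card's actual setting:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./distil-small.en-librispeech",  # placeholder path
    max_steps=1000,        # matches the final step in the results table
    eval_strategy="steps",
    eval_steps=100,        # evaluations appear every 100 steps in the table
    save_steps=100,
    learning_rate=1e-5,    # assumption: a common Whisper fine-tuning value
    per_device_train_batch_size=16,  # assumption
    predict_with_generate=True,  # so WER is computed on generated transcripts
    report_to="none",
)
```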
### Training results

| Training Loss | Epoch    | Step | Validation Loss | Wer    |
|:-------------:|:--------:|:----:|:---------------:|:------:|
| 0.5641        | 33.3333  | 100  | 0.9641          | 3.4754 |
| 0.3271        | 66.6667  | 200  | 0.7822          | 3.4652 |
| 0.0871        | 100.0    | 300  | 0.5731          | 3.4530 |
| 0.0149        | 133.3333 | 400  | 0.5142          | 3.4774 |
| 0.0043        | 166.6667 | 500  | 0.5051          | 3.5345 |
| 0.0026        | 200.0    | 600  | 0.5030          | 3.5569 |
| 0.002         | 233.3333 | 700  | 0.5020          | 3.5671 |
| 0.0016        | 266.6667 | 800  | 0.5015          | 3.5773 |
| 0.0014        | 300.0    | 900  | 0.5013          | 3.5936 |
| 0.0014        | 333.3333 | 1000 | 0.5012          | 3.5814 |
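The Wer column appears to be on a percentage scale and can be reproduced with the `evaluate` library's `wer` metric. A minimal sketch; the reference and prediction strings below are made-up examples, not data from this run:

```python
import evaluate

wer_metric = evaluate.load("wer")

references = ["the cat sat on the mat"]
predictions = ["the cat sat on a mat"]

# compute() returns the word error rate as a fraction;
# multiply by 100 to match a percentage-style Wer column.
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")
```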