# Finetune Whisper on Frisian and English

Part of the collection *Assessing Knowledge-Distillation Based Compression of Whisper Model for Frisian ASR* (12 items).
This model is a fine-tuned version of distil-small.en on the LibriSpeech dataset. Its results on the evaluation set are reported in the training results table below.
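As a minimal usage sketch, the checkpoint can be loaded with the 🤗 Transformers ASR pipeline; the model id below is a placeholder for this repository's actual id, and the audio file name is illustrative:

```python
# Minimal inference sketch with the Transformers ASR pipeline.
# "your-username/distil-small.en-librispeech" is a placeholder: replace it
# with the actual repository id of this checkpoint.
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/distil-small.en-librispeech",
    device=device,
)

# Transcribe a local audio file (decoding the file requires ffmpeg).
result = asr("sample.wav")
print(result["text"])
```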
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
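The specific values are not stated above. As a rough sketch, assuming the usual 🤗 Transformers `Seq2SeqTrainer` recipe for Whisper fine-tuning, the arguments typically look like the following; aside from `max_steps` and `eval_steps`, which are consistent with the results table below, every value is an illustrative placeholder rather than the setting used for this checkpoint:

```python
# Illustrative sketch only: all values are placeholders except max_steps and
# eval_steps, which mirror the step column and eval cadence of the table below.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./distil-small.en-librispeech",  # placeholder output path
    per_device_train_batch_size=16,              # placeholder
    gradient_accumulation_steps=1,               # placeholder
    learning_rate=1e-5,                          # placeholder
    warmup_steps=50,                             # placeholder
    max_steps=1000,                              # matches the final step in the table
    fp16=True,                                   # requires a CUDA GPU; drop on CPU-only machines
    eval_strategy="steps",                       # called evaluation_strategy in transformers < 4.41
    eval_steps=100,                              # matches the table's evaluation cadence
    save_steps=100,                              # placeholder
    logging_steps=25,                            # placeholder
    predict_with_generate=True,                  # generate text during eval so WER can be computed
    report_to=["none"],
)
```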
### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 0.651         | 0.5556 | 100  | 0.9641          | 3.4754 |
| 0.5006        | 1.1111 | 200  | 0.7651          | 3.5039 |
| 0.3531        | 1.6667 | 300  | 0.5188          | 3.5121 |
| 0.2176        | 2.2222 | 400  | 0.3514          | 4.0258 |
| 0.1834        | 2.7778 | 500  | 0.2878          | 4.3132 |
| 0.1587        | 3.3333 | 600  | 0.2589          | 4.4049 |
| 0.1553        | 3.8889 | 700  | 0.2447          | 4.5007 |
| 0.1566        | 4.4444 | 800  | 0.2370          | 4.5007 |
| 0.1226        | 5.0    | 900  | 0.2332          | 4.5048 |
| 0.1533        | 5.5556 | 1000 | 0.2318          | 4.4905 |
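The Wer column reports word error rate as a percentage (100 × the word-level error fraction). A minimal sketch of computing that metric with the 🤗 `evaluate` library, using toy strings in place of real references and model transcripts:

```python
# Minimal WER computation sketch using the `evaluate` library.
# The reference/prediction strings are toy examples for illustration.
import evaluate

wer_metric = evaluate.load("wer")

references = ["the quick brown fox jumps over the lazy dog"]
predictions = ["the quick brown fox jumped over the lazy dog"]

# compute() returns WER as a fraction; multiply by 100 to get the
# percentage figures reported in the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}%")
```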