---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-tiny-akan
  results: []
---
# whisper-tiny-akan
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.1096
- Wer: 45.1603
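
For quick checks, here is a minimal inference sketch using the `transformers` pipeline; the model id `whisper-tiny-akan` is a placeholder taken from this card's name, so substitute the actual Hub repository path:

```python
from transformers import pipeline

# Placeholder id from the card name; replace with the real Hub path.
asr = pipeline("automatic-speech-recognition", model="whisper-tiny-akan")

# Transcribe a local audio file (Whisper expects 16 kHz audio; the
# pipeline decodes and resamples automatically when given a file path).
result = asr("sample.wav")
print(result["text"])
```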
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a reconstruction sketch in code follows this list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 2000
- mixed_precision_training: Native AMP
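
This card does not include the training script itself; as a rough reconstruction, a `Seq2SeqTrainingArguments` configuration mirroring the values above could look like the sketch below (the output directory, evaluation cadence, and `predict_with_generate` flag are assumptions, the latter two inferred from the results table):

```python
from transformers import Seq2SeqTrainingArguments

# Hedged reconstruction of the hyperparameters listed above; Adam betas
# (0.9, 0.999) and epsilon 1e-08 are the Transformers defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-akan",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=2000,
    fp16=True,                         # "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=250,                    # matches the cadence in the table below
    predict_with_generate=True,        # required to compute WER at eval time
)
```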
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.4793        | 10.0  | 250  | 0.7459          | 53.4739 |
| 0.0732        | 20.0  | 500  | 0.9086          | 49.4656 |
| 0.0309        | 30.0  | 750  | 1.0036          | 47.3278 |
| 0.0132        | 40.0  | 1000 | 1.0760          | 46.8230 |
| 0.005         | 50.0  | 1250 | 1.0944          | 45.3088 |
| 0.002         | 60.0  | 1500 | 1.0899          | 44.5368 |
| 0.0006        | 70.0  | 1750 | 1.1071          | 45.0416 |
| 0.0005        | 80.0  | 2000 | 1.1096          | 45.1603 |
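
The Wer column is word error rate in percent. The card does not state how it was computed; one common route is the `evaluate` package (an assumption, not confirmed by this card), sketched here with hypothetical transcripts:

```python
import evaluate  # the "wer" metric also requires the jiwer backend

wer_metric = evaluate.load("wer")

# Hypothetical example transcripts, only to show the call signature.
predictions = ["me din de kofi"]
references = ["me din de kofi mensah"]

# compute() returns a fraction; scale by 100 to match the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```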
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1