---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: whisper-tiny-final
    results: []
---

# whisper-tiny-final

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0714
- Wer: 6.3947
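
The card does not include usage instructions, so here is a minimal inference sketch using the Hugging Face Transformers ASR pipeline. The repo id `Cafet/whisper-tiny-final` and the audio path are placeholders inferred from this card, not confirmed by it:

```python
# Minimal inference sketch. The repo id below is an assumption inferred from
# this card's name; replace it with the actual Hub id if it differs.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Cafet/whisper-tiny-final",  # hypothetical repo id
)

# "audio.wav" is a placeholder; the pipeline resamples input audio to the
# 16 kHz that Whisper expects (requires ffmpeg for non-WAV formats).
result = asr("audio.wav")
print(result["text"])
```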

## Model description

Loss and word error rate (WER, in percent) logged every 1000 training steps:

| Step  | Training Loss | Validation Loss | Wer       |
|------:|--------------:|----------------:|----------:|
|  1000 | 0.727300      | 0.734777        | 71.347666 |
|  2000 | 0.392000      | 0.430395        | 52.059163 |
|  3000 | 0.317100      | 0.305939        | 39.781162 |
|  4000 | 0.206400      | 0.225029        | 30.785726 |
|  5000 | 0.152800      | 0.169434        | 23.076923 |
|  6000 | 0.119000      | 0.130408        | 16.517293 |
|  7000 | 0.082300      | 0.102279        | 11.755650 |
|  8000 | 0.079600      | 0.085155        | 8.511574  |
|  9000 | 0.051400      | 0.075068        | 7.048991  |
| 10000 | 0.045000      | 0.071429        | 6.394678  |

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 10000
- mixed_precision_training: Native AMP
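
These settings map directly onto `Seq2SeqTrainingArguments` from Transformers. The sketch below reconstructs them; `output_dir`, the eval cadence, and `predict_with_generate` are assumptions (the latter two inferred from the 1000-step results table), not taken verbatim from this card:

```python
# Hedged reconstruction of the hyperparameters listed above as
# Seq2SeqTrainingArguments (Transformers 4.40.x API).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-final",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=10000,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="steps",      # assumed from the eval-every-1000-steps log
    eval_steps=1000,                  # assumed
    predict_with_generate=True,       # assumed; needed to compute WER during eval
)
```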

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer     |
|--------------:|--------:|------:|----------------:|--------:|
| 0.7273        | 1.6051  | 1000  | 0.7348          | 71.3477 |
| 0.392         | 3.2103  | 2000  | 0.4304          | 52.0592 |
| 0.3171        | 4.8154  | 3000  | 0.3059          | 39.7812 |
| 0.2064        | 6.4205  | 4000  | 0.2250          | 30.7857 |
| 0.1528        | 8.0257  | 5000  | 0.1694          | 23.0769 |
| 0.119         | 9.6308  | 6000  | 0.1304          | 16.5173 |
| 0.0823        | 11.2360 | 7000  | 0.1023          | 11.7556 |
| 0.0796        | 12.8411 | 8000  | 0.0852          | 8.5116  |
| 0.0514        | 14.4462 | 9000  | 0.0751          | 7.0490  |
| 0.045         | 16.0514 | 10000 | 0.0714          | 6.3947  |
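
Wer values are word error rates in percent. A minimal sketch of how this metric is typically computed with the `evaluate` library (the example strings are illustrative, not from this model's evaluation set):

```python
# Computing word error rate (WER) with the `evaluate` library. The strings
# below are made-up examples, not data from this model's evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the quick brown fox"]
references = ["the quick brown fox jumps"]

# `compute` returns a fraction; multiply by 100 to match the percentage-style
# numbers in the table above (e.g. 6.3947).
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}%")  # one deletion out of five reference words -> 20.00%
```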

### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.0
- Datasets 2.19.1
- Tokenizers 0.19.1