
whisper-small-nomimo

This model is a fine-tuned version of openai/whisper-small; the training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.0000
  • WER: 1.9149
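The WER figures on this card are word error rates expressed as percentages (values above 100 are possible when the hypothesis is much longer than the reference). The card's numbers were presumably produced by a library metric; as a reference, the definition can be sketched as word-level Levenshtein distance divided by the number of reference words:

```python
# Minimal WER sketch: word-level edit distance over reference length.
# Multiply by 100 to match the percentage scale used on this card.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the cat sit"))  # one substitution out of three words
```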

Model description

More information needed

Intended uses & limitations

More information needed
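Since the intended use is not documented, a reasonable assumption is standard speech-to-text inference with the Transformers pipeline API. A minimal sketch (the repo id is taken from this card; the audio path is a placeholder):

```python
# Hypothetical usage sketch for transcription with this checkpoint.
model_id = "susmitabhatt/whisper-small-nomimo"

def build_transcriber():
    # Imported lazily so the sketch only needs `transformers` when actually run.
    from transformers import pipeline
    return pipeline("automatic-speech-recognition", model=model_id)

if __name__ == "__main__":
    asr = build_transcriber()
    print(asr("sample.wav")["text"])  # "sample.wav" is a placeholder path
```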

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0004
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 132
  • num_epochs: 30
  • mixed_precision_training: Native AMP
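The total_train_batch_size listed above is not an independent setting: it is the per-device batch size multiplied by the gradient accumulation steps. A quick consistency check:

```python
# Derived effective batch size: per-device batch * accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 16, matching the value reported above
```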

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER      |
|--------------:|--------:|-----:|----------------:|---------:|
| 1.3211        | 1.3889  | 100  | 0.2361          | 12.7660  |
| 0.7584        | 2.7778  | 200  | 0.1198          | 12.5532  |
| 0.2285        | 4.1667  | 300  | 0.1448          | 158.5106 |
| 0.1734        | 5.5556  | 400  | 0.1582          | 27.0213  |
| 0.1351        | 6.9444  | 500  | 0.0599          | 11.2766  |
| 0.1252        | 8.3333  | 600  | 0.0704          | 52.9787  |
| 0.086         | 9.7222  | 700  | 0.1452          | 17.2340  |
| 0.0865        | 11.1111 | 800  | 0.0404          | 54.0426  |
| 0.0635        | 12.5    | 900  | 0.0221          | 49.7872  |
| 0.0672        | 13.8889 | 1000 | 0.0403          | 52.9787  |
| 0.034         | 15.2778 | 1100 | 0.0459          | 7.0213   |
| 0.0396        | 16.6667 | 1200 | 0.0205          | 3.8298   |
| 0.0262        | 18.0556 | 1300 | 0.0147          | 5.1064   |
| 0.0227        | 19.4444 | 1400 | 0.0101          | 48.2979  |
| 0.0108        | 20.8333 | 1500 | 0.0019          | 47.0213  |
| 0.0044        | 22.2222 | 1600 | 0.0013          | 1.7021   |
| 0.0021        | 23.6111 | 1700 | 0.0001          | 1.9149   |
| 0.0002        | 25.0    | 1800 | 0.0001          | 1.9149   |
| 0.0002        | 26.3889 | 1900 | 0.0000          | 1.9149   |
| 0.0002        | 27.7778 | 2000 | 0.0000          | 1.9149   |
| 0.0001        | 29.1667 | 2100 | 0.0000          | 1.9149   |

Framework versions

  • Transformers 4.45.0.dev0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 242M params (Safetensors, tensor type F32)

Model tree for susmitabhatt/whisper-small-nomimo

  • Finetuned from openai/whisper-small