---
license: apache-2.0
tags:
  - generated_from_keras_callback
model-index:
  - name: whisper_havest_0010
    results: []
---

# whisper_havest_0010

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results after the final training epoch:

- Train Loss: 5.1222
- Train Accuracy: 0.0117
- Train Do Wer: 1.0
- Validation Loss: 5.1600
- Validation Accuracy: 0.0117
- Validation Do Wer: 1.0
- Epoch: 9
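The "Do Wer" rows appear to report word error rate (WER); a value of 1.0 means the model recovered none of the reference words, which is unsurprising after only ten epochs at this learning rate. As a minimal pure-Python sketch of how the standard WER metric is computed (the `wer` helper below is illustrative, not part of this repository):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words via dynamic programming
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])  # substitution (free if equal)
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

For example, `wer("a b c d", "a x c d")` gives 0.25 (one substitution out of four reference words), while a hypothesis sharing no words with the reference scores 1.0.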

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
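`AdamWeightDecay` is Adam with decoupled weight decay (AdamW-style): the decay term is applied directly to the parameter rather than folded into the gradient. A simplified scalar sketch of the update these hyperparameters configure (illustrative only; the real optimizer in `transformers` operates on tensors and can exclude variables such as biases from decay):

```python
def adamw_step(param, grad, m, v, step,
               lr=1e-5, beta_1=0.9, beta_2=0.999,
               epsilon=1e-7, weight_decay_rate=0.01):
    """One AdamW update for a scalar parameter, using the card's hyperparameters."""
    m = beta_1 * m + (1 - beta_1) * grad           # first-moment EMA of the gradient
    v = beta_2 * v + (1 - beta_2) * grad * grad    # second-moment EMA
    m_hat = m / (1 - beta_1 ** step)               # bias correction for early steps
    v_hat = v / (1 - beta_2 ** step)
    # decoupled decay: weight_decay_rate * param is added outside the Adam ratio
    param = param - lr * (m_hat / (v_hat ** 0.5 + epsilon)
                          + weight_decay_rate * param)
    return param, m, v
```

With `decay: 0.0` the learning rate stays fixed at 1e-05 for all ten epochs, so the slow but steady loss decrease in the table below is consistent with these settings.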

### Training results

| Train Loss | Train Accuracy | Train Do Wer | Validation Loss | Validation Accuracy | Validation Do Wer | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 9.9191     | 0.0046         | 1.0          | 8.5836          | 0.0067              | 1.0               | 0     |
| 8.0709     | 0.0083         | 1.0          | 7.4667          | 0.0089              | 1.0               | 1     |
| 7.1652     | 0.0100         | 1.0          | 6.8204          | 0.0112              | 1.0               | 2     |
| 6.7196     | 0.0114         | 1.0          | 6.5192          | 0.0114              | 1.0               | 3     |
| 6.4115     | 0.0115         | 1.0          | 6.2357          | 0.0115              | 1.0               | 4     |
| 6.1085     | 0.0115         | 1.0          | 5.9657          | 0.0115              | 1.0               | 5     |
| 5.8206     | 0.0115         | 1.0          | 5.7162          | 0.0115              | 1.0               | 6     |
| 5.5567     | 0.0115         | 1.0          | 5.4963          | 0.0115              | 1.0               | 7     |
| 5.3223     | 0.0116         | 1.0          | 5.3096          | 0.0116              | 1.0               | 8     |
| 5.1222     | 0.0117         | 1.0          | 5.1600          | 0.0117              | 1.0               | 9     |

### Framework versions

- Transformers 4.25.0.dev0
- TensorFlow 2.9.2
- Datasets 2.6.1
- Tokenizers 0.13.2