---
license: apache-2.0
tags:
  - generated_from_keras_callback
model-index:
  - name: whisper_nosp_0005
    results: []
---

# whisper_nosp_0005

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results at the final epoch:

- Train Loss: 1.9349
- Train Accuracy: 0.0157
- Validation Loss: 1.6630
- Validation Accuracy: 0.0172
- Epoch: 4
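As a usage sketch, the checkpoint can be loaded with the TensorFlow Whisper classes from Transformers. Note the repo id `bigmorning/whisper_nosp_0005` is an assumption inferred from this card's title and uploader; verify the actual id on the Hub before use.

```python
# Hedged usage sketch: the repo id below is assumed from the card title,
# not confirmed by the card itself.
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

# The processor (feature extractor + tokenizer) comes from the base model.
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained("bigmorning/whisper_nosp_0005")

# `audio` would be a 1-D float array sampled at 16 kHz:
# inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
# generated_ids = model.generate(inputs.input_features)
# text = processor.batch_decode(generated_ids, skip_special_tokens=True)
```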

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
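The optimizer dict above maps onto the `AdamWeightDecay` Keras optimizer shipped with Transformers. A minimal sketch of reconstructing it, assuming only the values listed above:

```python
# Sketch: rebuilding the training optimizer from the hyperparameters above.
# AdamWeightDecay is the TF/Keras optimizer provided by Transformers.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,     # constant rate; 'decay': 0.0 means no schedule decay
    weight_decay_rate=0.01,  # decoupled weight decay, per the dict above
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```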

### Training results

| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 7.5559     | 0.0010         | 6.3853          | 0.0013              | 0     |
| 6.3227     | 0.0021         | 5.7023          | 0.0038              | 1     |
| 4.9825     | 0.0063         | 3.6302          | 0.0109              | 2     |
| 2.9413     | 0.0126         | 2.1959          | 0.0154              | 3     |
| 1.9349     | 0.0157         | 1.6630          | 0.0172              | 4     |

### Framework versions

- Transformers 4.25.0.dev0
- TensorFlow 2.9.2
- Datasets 2.6.1
- Tokenizers 0.13.2