
whisper_4_with_init_sun_char_0035

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. At the final training epoch it achieves the following results on the training and evaluation sets (a minimal loading sketch follows the metrics):

  • Train Loss: 1.8933
  • Train Accuracy: 0.0432
  • Train Wermet: 0.0682
  • Validation Loss: 1.8761
  • Validation Accuracy: 0.0295
  • Validation Wermet: 0.1173
  • Epoch: 34
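
The base checkpoint is openai/whisper-tiny, so the fine-tuned weights can presumably be loaded through the standard Transformers speech-to-text classes. The snippet below is a minimal sketch, assuming this repository hosts TensorFlow weights and that inputs are 16 kHz mono audio; it is not taken from the original training setup.

```python
# Minimal loading sketch: transcribe a 16 kHz mono waveform with the
# Transformers TensorFlow Whisper classes (illustrative, not the original code).
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "bigmorning/whisper_4_with_init_sun_char_0035"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# One second of silence stands in for a real 16 kHz mono recording.
waveform = np.zeros(16000, dtype=np.float32)
inputs = processor(waveform, sampling_rate=16000, return_tensors="tf")

predicted_ids = model.generate(inputs.input_features)
transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)[0]
print(transcription)
```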

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the optimizer sketch after this list):

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
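
The optimizer entry above corresponds to the AdamWeightDecay class that Transformers provides for Keras training. Below is a minimal sketch of reconstructing that configuration; the hyperparameter values are copied from the list above, while the base checkpoint and the compile step are illustrative assumptions rather than the original training script.

```python
# Sketch of the listed optimizer configuration with the Transformers/Keras API.
from transformers import AdamWeightDecay, TFWhisperForConditionalGeneration

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# Illustrative: Transformers TF models fall back to their built-in loss
# when compile() is called without an explicit loss.
model = TFWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")
model.compile(optimizer=optimizer)
```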

Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 3.2071     | 0.0313         | 0.1237       | 2.8546          | 0.0225              | 0.1109            | 0     |
| 3.0365     | 0.0325         | 0.0375       | 2.8115          | 0.0228              | 0.1215            | 1     |
| 3.0162     | 0.0326         | 0.0484       | 2.7884          | 0.0231              | 0.1318            | 2     |
| 3.0042     | 0.0327         | 0.0555       | 2.7853          | 0.0233              | 0.1393            | 3     |
| 2.9934     | 0.0328         | 0.0614       | 2.7657          | 0.0232              | 0.1273            | 4     |
| 2.9858     | 0.0329         | 0.0654       | 2.7542          | 0.0234              | 0.1073            | 5     |
| 2.9735     | 0.0330         | 0.0673       | 2.7367          | 0.0234              | 0.1414            | 6     |
| 2.9574     | 0.0332         | 0.0704       | 2.6961          | 0.0240              | 0.1429            | 7     |
| 2.9320     | 0.0335         | 0.0723       | 2.6652          | 0.0239              | 0.0990            | 8     |
| 2.8976     | 0.0339         | 0.0729       | 2.5997          | 0.0245              | 0.0944            | 9     |
| 2.8460     | 0.0343         | 0.0728       | 2.5378          | 0.0248              | 0.1435            | 10    |
| 2.7781     | 0.0347         | 0.0741       | 2.4355          | 0.0254              | 0.1372            | 11    |
| 2.7083     | 0.0352         | 0.0747       | 2.5163          | 0.0248              | 0.0987            | 12    |
| 2.6445     | 0.0356         | 0.0720       | 2.2997          | 0.0261              | 0.1484            | 13    |
| 2.5838     | 0.0360         | 0.0724       | 2.2386          | 0.0266              | 0.1419            | 14    |
| 2.5294     | 0.0363         | 0.0721       | 2.1855          | 0.0269              | 0.1289            | 15    |
| 2.4760     | 0.0367         | 0.0711       | 2.1682          | 0.0271              | 0.1214            | 16    |
| 2.4339     | 0.0370         | 0.0698       | 2.1018          | 0.0273              | 0.1264            | 17    |
| 2.3867     | 0.0373         | 0.0684       | 2.0647          | 0.0275              | 0.1403            | 18    |
| 2.3528     | 0.0376         | 0.0669       | 2.0705          | 0.0275              | 0.1089            | 19    |
| 2.3145     | 0.0379         | 0.0658       | 2.0179          | 0.0280              | 0.1209            | 20    |
| 2.2765     | 0.0382         | 0.0654       | 2.0182          | 0.0279              | 0.1023            | 21    |
| 2.2415     | 0.0385         | 0.0650       | 1.9558          | 0.0284              | 0.1523            | 22    |
| 2.2102     | 0.0388         | 0.0643       | 1.9395          | 0.0285              | 0.1123            | 23    |
| 2.1717     | 0.0392         | 0.0635       | 1.9791          | 0.0282              | 0.0928            | 24    |
| 2.1457     | 0.0395         | 0.0626       | 1.8907          | 0.0291              | 0.1078            | 25    |
| 2.1159     | 0.0398         | 0.0633       | 1.8930          | 0.0290              | 0.1098            | 26    |
| 2.0892     | 0.0401         | 0.0638       | 1.8696          | 0.0292              | 0.1078            | 27    |
| 2.0609     | 0.0405         | 0.0659       | 1.8555          | 0.0296              | 0.1051            | 28    |
| 2.0342     | 0.0409         | 0.0639       | 1.8589          | 0.0293              | 0.1092            | 29    |
| 2.0044     | 0.0413         | 0.0653       | 1.8375          | 0.0299              | 0.1015            | 30    |
| 1.9831     | 0.0416         | 0.0649       | 1.7954          | 0.0302              | 0.1194            | 31    |
| 1.9535     | 0.0421         | 0.0689       | 1.7937          | 0.0302              | 0.1168            | 32    |
| 1.9290     | 0.0425         | 0.0706       | 1.8385          | 0.0299              | 0.1074            | 33    |
| 1.8933     | 0.0432         | 0.0682       | 1.8761          | 0.0295              | 0.1173            | 34    |

Framework versions

  • Transformers 4.34.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3