
whisper_4_with_init_sun_char_0075

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. At the final epoch (74) it achieves the following results on the training and validation sets (a minimal usage sketch follows the metrics):

  • Train Loss: 1.2803
  • Train Accuracy: 0.0606
  • Train Wermet: 0.5457
  • Validation Loss: 2.0848
  • Validation Accuracy: 0.0316
  • Validation Wermet: 0.9665
  • Epoch: 74
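
Since the card lists TensorFlow among its framework versions, the checkpoint can presumably be loaded with the TensorFlow Whisper classes in transformers. The snippet below is a minimal, hypothetical usage sketch, assuming the checkpoint is hosted as bigmorning/whisper_4_with_init_sun_char_0075 with TensorFlow weights; it is not part of the original card.

```python
# Minimal usage sketch (assumption: the repo id below hosts TF weights for this card).
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

repo_id = "bigmorning/whisper_4_with_init_sun_char_0075"
processor = WhisperProcessor.from_pretrained(repo_id)
model = TFWhisperForConditionalGeneration.from_pretrained(repo_id)

# Whisper expects 16 kHz mono audio; one second of silence stands in for real speech.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```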

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch reconstructing the optimizer follows the list):

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
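
As a reading aid, here is how that optimizer dict maps onto the AdamWeightDecay class shipped with transformers' TensorFlow utilities. This is a hedged sketch of the configuration only; the original training loop is not described in this card.

```python
# Sketch: the optimizer dict above, expressed with transformers' AdamWeightDecay.
# This mirrors the logged hyperparameters; it is not the original training script.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# model.compile(optimizer=optimizer)  # training_precision was float32, so no mixed-precision policy is needed
```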

Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 3.2071 | 0.0313 | 0.1237 | 2.8546 | 0.0225 | 0.1109 | 0 |
| 3.0365 | 0.0325 | 0.0375 | 2.8115 | 0.0228 | 0.1215 | 1 |
| 3.0162 | 0.0326 | 0.0484 | 2.7884 | 0.0231 | 0.1318 | 2 |
| 3.0042 | 0.0327 | 0.0555 | 2.7853 | 0.0233 | 0.1393 | 3 |
| 2.9934 | 0.0328 | 0.0614 | 2.7657 | 0.0232 | 0.1273 | 4 |
| 2.9858 | 0.0329 | 0.0654 | 2.7542 | 0.0234 | 0.1073 | 5 |
| 2.9735 | 0.0330 | 0.0673 | 2.7367 | 0.0234 | 0.1414 | 6 |
| 2.9574 | 0.0332 | 0.0704 | 2.6961 | 0.0240 | 0.1429 | 7 |
| 2.9320 | 0.0335 | 0.0723 | 2.6652 | 0.0239 | 0.0990 | 8 |
| 2.8976 | 0.0339 | 0.0729 | 2.5997 | 0.0245 | 0.0944 | 9 |
| 2.8460 | 0.0343 | 0.0728 | 2.5378 | 0.0248 | 0.1435 | 10 |
| 2.7781 | 0.0347 | 0.0741 | 2.4355 | 0.0254 | 0.1372 | 11 |
| 2.7083 | 0.0352 | 0.0747 | 2.5163 | 0.0248 | 0.0987 | 12 |
| 2.6445 | 0.0356 | 0.0720 | 2.2997 | 0.0261 | 0.1484 | 13 |
| 2.5838 | 0.0360 | 0.0724 | 2.2386 | 0.0266 | 0.1419 | 14 |
| 2.5294 | 0.0363 | 0.0721 | 2.1855 | 0.0269 | 0.1289 | 15 |
| 2.4760 | 0.0367 | 0.0711 | 2.1682 | 0.0271 | 0.1214 | 16 |
| 2.4339 | 0.0370 | 0.0698 | 2.1018 | 0.0273 | 0.1264 | 17 |
| 2.3867 | 0.0373 | 0.0684 | 2.0647 | 0.0275 | 0.1403 | 18 |
| 2.3528 | 0.0376 | 0.0669 | 2.0705 | 0.0275 | 0.1089 | 19 |
| 2.3145 | 0.0379 | 0.0658 | 2.0179 | 0.0280 | 0.1209 | 20 |
| 2.2765 | 0.0382 | 0.0654 | 2.0182 | 0.0279 | 0.1023 | 21 |
| 2.2415 | 0.0385 | 0.0650 | 1.9558 | 0.0284 | 0.1523 | 22 |
| 2.2102 | 0.0388 | 0.0643 | 1.9395 | 0.0285 | 0.1123 | 23 |
| 2.1717 | 0.0392 | 0.0635 | 1.9791 | 0.0282 | 0.0928 | 24 |
| 2.1457 | 0.0395 | 0.0626 | 1.8907 | 0.0291 | 0.1078 | 25 |
| 2.1159 | 0.0398 | 0.0633 | 1.8930 | 0.0290 | 0.1098 | 26 |
| 2.0892 | 0.0401 | 0.0638 | 1.8696 | 0.0292 | 0.1078 | 27 |
| 2.0609 | 0.0405 | 0.0659 | 1.8555 | 0.0296 | 0.1051 | 28 |
| 2.0342 | 0.0409 | 0.0639 | 1.8589 | 0.0293 | 0.1092 | 29 |
| 2.0044 | 0.0413 | 0.0653 | 1.8375 | 0.0299 | 0.1015 | 30 |
| 1.9831 | 0.0416 | 0.0649 | 1.7954 | 0.0302 | 0.1194 | 31 |
| 1.9535 | 0.0421 | 0.0689 | 1.7937 | 0.0302 | 0.1168 | 32 |
| 1.9290 | 0.0425 | 0.0706 | 1.8385 | 0.0299 | 0.1074 | 33 |
| 1.8933 | 0.0432 | 0.0682 | 1.8761 | 0.0295 | 0.1173 | 34 |
| 1.8724 | 0.0435 | 0.0752 | 1.7929 | 0.0304 | 0.1220 | 35 |
| 1.8407 | 0.0442 | 0.0760 | 1.7865 | 0.0306 | 0.1266 | 36 |
| 1.8179 | 0.0446 | 0.0832 | 1.8108 | 0.0304 | 0.1226 | 37 |
| 1.7977 | 0.0451 | 0.0888 | 1.8024 | 0.0306 | 0.1161 | 38 |
| 1.7846 | 0.0454 | 0.0855 | 1.8107 | 0.0305 | 0.1385 | 39 |
| 1.7516 | 0.0461 | 0.0922 | 1.8258 | 0.0307 | 0.1365 | 40 |
| 1.7358 | 0.0465 | 0.1070 | 1.8837 | 0.0302 | 0.1461 | 41 |
| 1.7036 | 0.0474 | 0.1106 | 1.8589 | 0.0306 | 0.1201 | 42 |
| 1.6779 | 0.0481 | 0.1052 | 1.8831 | 0.0305 | 0.1755 | 43 |
| 1.6539 | 0.0487 | 0.1192 | 1.8249 | 0.0309 | 0.1901 | 44 |
| 1.6500 | 0.0488 | 0.1149 | 1.8435 | 0.0310 | 0.1313 | 45 |
| 1.6401 | 0.0490 | 0.1468 | 1.8509 | 0.0310 | 0.1597 | 46 |
| 1.6232 | 0.0495 | 0.1443 | 1.8573 | 0.0310 | 0.1588 | 47 |
| 1.5947 | 0.0503 | 0.1315 | 1.8350 | 0.0311 | 0.1476 | 48 |
| 1.5659 | 0.0512 | 0.1890 | 1.8934 | 0.0310 | 0.1507 | 49 |
| 1.5409 | 0.0521 | 0.1410 | 1.9782 | 0.0299 | 0.1663 | 50 |
| 1.5417 | 0.0520 | 0.1805 | 1.9223 | 0.0309 | 0.2287 | 51 |
| 1.5330 | 0.0522 | 0.1907 | 1.9174 | 0.0313 | 0.2481 | 52 |
| 1.5182 | 0.0527 | 0.1963 | 1.9254 | 0.0312 | 0.1440 | 53 |
| 1.5008 | 0.0532 | 0.2386 | 1.9368 | 0.0309 | 0.2045 | 54 |
| 1.4700 | 0.0543 | 0.2347 | 1.9171 | 0.0310 | 0.3189 | 55 |
| 1.4517 | 0.0549 | 0.2159 | 1.9880 | 0.0308 | 0.4000 | 56 |
| 1.4421 | 0.0553 | 0.2616 | 1.9647 | 0.0310 | 0.3311 | 57 |
| 1.4393 | 0.0552 | 0.2959 | 1.9191 | 0.0314 | 0.3403 | 58 |
| 1.4163 | 0.0560 | 0.3296 | 2.0068 | 0.0313 | 0.3711 | 59 |
| 1.4174 | 0.0559 | 0.3499 | 2.0338 | 0.0310 | 0.2981 | 60 |
| 1.4112 | 0.0561 | 0.3553 | 2.0262 | 0.0312 | 0.3595 | 61 |
| 1.3840 | 0.0572 | 0.4110 | 1.9913 | 0.0313 | 0.2975 | 62 |
| 1.3662 | 0.0578 | 0.3471 | 2.0969 | 0.0307 | 0.2794 | 63 |
| 1.3596 | 0.0579 | 0.3211 | 2.0164 | 0.0314 | 0.9982 | 64 |
| 1.3819 | 0.0571 | 0.3542 | 1.9052 | 0.0315 | 0.9802 | 65 |
| 1.3823 | 0.0569 | 0.3757 | 1.9371 | 0.0315 | 1.0860 | 66 |
| 1.3364 | 0.0587 | 0.4048 | 2.0912 | 0.0311 | 0.2807 | 67 |
| 1.3494 | 0.0582 | 0.3723 | 1.9475 | 0.0317 | 0.3295 | 68 |
| 1.3321 | 0.0587 | 0.3546 | 2.1066 | 0.0314 | 0.6181 | 69 |
| 1.3198 | 0.0592 | 0.4076 | 2.0759 | 0.0314 | 0.4974 | 70 |
| 1.2896 | 0.0603 | 0.4556 | 1.9717 | 0.0316 | 0.7519 | 71 |
| 1.2842 | 0.0604 | 0.5363 | 2.0598 | 0.0315 | 0.5596 | 72 |
| 1.2841 | 0.0604 | 0.5000 | 1.9914 | 0.0314 | 0.5531 | 73 |
| 1.2803 | 0.0606 | 0.5457 | 2.0848 | 0.0316 | 0.9665 | 74 |
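
The card does not define the "Wermet" columns; from the name and value range they read like a word-error-rate-style score. For reference, standard WER can be computed with the jiwer library, as in the minimal sketch below; this is not guaranteed to match the training script's exact metric.

```python
# Standard word error rate with jiwer; the "Wermet" metric above may differ in detail.
import jiwer

reference = "the quick brown fox jumps over the lazy dog"
hypothesis = "the quick brown fox jumped over a lazy dog"

# 2 substitutions over 9 reference words -> WER ~= 0.222
print(jiwer.wer(reference, hypothesis))
```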

Framework versions

  • Transformers 4.34.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3