# Whisper Large V2
This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.3388
- WER: 15.6433
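WER is the word error rate in percent: the word-level edit distance between the reference and hypothesis transcripts, divided by the number of reference words. Production evaluations typically use a library such as `jiwer` or `evaluate`; a minimal self-contained sketch of the metric:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level Levenshtein distance
    divided by the number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # DP table for edit distance over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)
```

So the reported 15.6433 means roughly one word error per six or seven reference words.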
## Model description
More information needed
## Intended uses & limitations
More information needed
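No usage details are provided, but as a Whisper fine-tune the model can presumably be loaded through the standard `transformers` speech-recognition pipeline. A hedged sketch (the model id is taken from the model tree at the bottom of this card; `sample.wav` is a placeholder audio file):

```python
from transformers import pipeline

# Assumes the checkpoint is available on the Hugging Face Hub
asr = pipeline(
    "automatic-speech-recognition",
    model="golesheed/whisper-v2-Hollandic_WestFrisian_WestUtrecht",
)

# Transcribe a local audio file (placeholder path)
result = asr("sample.wav")
print(result["text"])
```

Note that whisper-large-v2 is a large checkpoint (~1.5B parameters), so inference realistically requires a GPU.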
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
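The linear scheduler ramps the learning rate from 0 to the 3e-05 peak over the 20 warmup steps, then decays it linearly back to 0. A sketch of the resulting schedule, assuming roughly 1,620 total optimizer steps (the last step logged in the results table below):

```python
PEAK_LR = 3e-05
WARMUP_STEPS = 20
TOTAL_STEPS = 1620  # last step logged in the results table (approximate)

def linear_lr(step: int) -> float:
    """Learning rate at a given optimizer step under linear warmup + decay."""
    if step < WARMUP_STEPS:
        # linear warmup from 0 to the peak
        return PEAK_LR * step / WARMUP_STEPS
    # linear decay from the peak down to 0 at TOTAL_STEPS
    return PEAK_LR * max(0, TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```

With so short a warmup relative to the run, the schedule is effectively a linear decay from 3e-05 over the whole run.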
### Training results
Training Loss | Epoch | Step | Validation Loss | WER |
---|---|---|---|---|
0.5957 | 0.0460 | 15 | 0.4410 | 43.4046 |
0.3883 | 0.0920 | 30 | 0.3673 | 22.6793 |
0.3351 | 0.1380 | 45 | 0.3537 | 24.6421 |
0.3154 | 0.1840 | 60 | 0.3425 | 25.5344 |
0.3082 | 0.2301 | 75 | 0.3368 | 29.2401 |
0.2981 | 0.2761 | 90 | 0.3236 | 22.7442 |
0.2828 | 0.3221 | 105 | 0.3184 | 25.2616 |
0.3101 | 0.3681 | 120 | 0.3168 | 18.5215 |
0.3008 | 0.4141 | 135 | 0.3162 | 19.1299 |
0.3016 | 0.4601 | 150 | 0.3065 | 23.6178 |
0.3093 | 0.5061 | 165 | 0.3099 | 18.7668 |
0.3004 | 0.5521 | 180 | 0.3043 | 20.1939 |
0.2837 | 0.5982 | 195 | 0.3053 | 22.6716 |
0.2927 | 0.6442 | 210 | 0.2989 | 19.6217 |
0.2705 | 0.6902 | 225 | 0.2962 | 20.1708 |
0.2916 | 0.7362 | 240 | 0.2904 | 19.4490 |
0.275 | 0.7822 | 255 | 0.2936 | 18.2101 |
0.2631 | 0.8282 | 270 | 0.2894 | 18.1221 |
0.2582 | 0.8742 | 285 | 0.2885 | 21.2325 |
0.2482 | 0.9202 | 300 | 0.2944 | 17.5918 |
0.2675 | 0.9663 | 315 | 0.2876 | 26.0615 |
0.2324 | 1.0123 | 330 | 0.2833 | 20.3391 |
0.1474 | 1.0583 | 345 | 0.2872 | 20.8177 |
0.1524 | 1.1043 | 360 | 0.2831 | 19.1409 |
0.1506 | 1.1503 | 375 | 0.2829 | 17.8338 |
0.1572 | 1.1963 | 390 | 0.2841 | 17.8151 |
0.1478 | 1.2423 | 405 | 0.2798 | 15.7544 |
0.1426 | 1.2883 | 420 | 0.2781 | 17.4455 |
0.1458 | 1.3344 | 435 | 0.2817 | 21.0058 |
0.131 | 1.3804 | 450 | 0.2856 | 18.7790 |
0.1307 | 1.4264 | 465 | 0.2841 | 17.2848 |
0.1541 | 1.4724 | 480 | 0.2838 | 15.8820 |
0.1417 | 1.5184 | 495 | 0.2900 | 19.0276 |
0.128 | 1.5644 | 510 | 0.2877 | 17.3684 |
0.1538 | 1.6104 | 525 | 0.2748 | 17.0010 |
0.1223 | 1.6564 | 540 | 0.2768 | 18.1177 |
0.127 | 1.7025 | 555 | 0.2754 | 18.2926 |
0.1336 | 1.7485 | 570 | 0.2746 | 19.0507 |
0.1411 | 1.7945 | 585 | 0.2724 | 16.7644 |
0.1318 | 1.8405 | 600 | 0.2729 | 17.3156 |
0.1491 | 1.8865 | 615 | 0.2708 | 17.7282 |
0.1284 | 1.9325 | 630 | 0.2720 | 15.0931 |
0.1237 | 1.9785 | 645 | 0.2674 | 17.0087 |
0.113 | 2.0245 | 660 | 0.2808 | 17.8030 |
0.0696 | 2.0706 | 675 | 0.2846 | 16.9426 |
0.0751 | 2.1166 | 690 | 0.2830 | 15.9029 |
0.071 | 2.1626 | 705 | 0.2837 | 16.7622 |
0.071 | 2.2086 | 720 | 0.2905 | 19.0826 |
0.071 | 2.2546 | 735 | 0.2818 | 20.2808 |
0.0591 | 2.3006 | 750 | 0.2850 | 16.9217 |
0.057 | 2.3466 | 765 | 0.2844 | 15.5530 |
0.068 | 2.3926 | 780 | 0.2772 | 16.7105 |
0.0736 | 2.4387 | 795 | 0.2784 | 14.5430 |
0.067 | 2.4847 | 810 | 0.2839 | 15.2582 |
0.0716 | 2.5307 | 825 | 0.2794 | 18.2013 |
0.0761 | 2.5767 | 840 | 0.2754 | 15.0271 |
0.0686 | 2.6227 | 855 | 0.2775 | 15.3385 |
0.0724 | 2.6687 | 870 | 0.2775 | 15.1779 |
0.0702 | 2.7147 | 885 | 0.2805 | 18.0418 |
0.0654 | 2.7607 | 900 | 0.2811 | 16.0889 |
0.0719 | 2.8067 | 915 | 0.2802 | 15.6246 |
0.0738 | 2.8528 | 930 | 0.2742 | 16.8755 |
0.0593 | 2.8988 | 945 | 0.2810 | 15.6345 |
0.062 | 2.9448 | 960 | 0.2750 | 14.8610 |
0.0702 | 2.9908 | 975 | 0.2751 | 15.1316 |
0.0458 | 3.0368 | 990 | 0.2896 | 14.5958 |
0.0304 | 3.0828 | 1005 | 0.3012 | 18.4544 |
0.0327 | 3.1288 | 1020 | 0.2996 | 18.2343 |
0.0321 | 3.1748 | 1035 | 0.2937 | 15.0667 |
0.0292 | 3.2209 | 1050 | 0.2989 | 14.5760 |
0.0285 | 3.2669 | 1065 | 0.3009 | 21.4988 |
0.027 | 3.3129 | 1080 | 0.3014 | 15.0469 |
0.0257 | 3.3589 | 1095 | 0.2979 | 16.9371 |
0.0338 | 3.4049 | 1110 | 0.2928 | 16.9922 |
0.0298 | 3.4509 | 1125 | 0.3017 | 17.3508 |
0.024 | 3.4969 | 1140 | 0.3006 | 15.2098 |
0.0281 | 3.5429 | 1155 | 0.2994 | 15.4507 |
0.0256 | 3.5890 | 1170 | 0.2994 | 14.5023 |
0.0229 | 3.6350 | 1185 | 0.3007 | 15.9777 |
0.0336 | 3.6810 | 1200 | 0.3005 | 16.0393 |
0.0262 | 3.7270 | 1215 | 0.3028 | 15.3539 |
0.0254 | 3.7730 | 1230 | 0.2965 | 15.2923 |
0.0297 | 3.8190 | 1245 | 0.2968 | 15.2318 |
0.0244 | 3.8650 | 1260 | 0.3017 | 15.7203 |
0.0254 | 3.9110 | 1275 | 0.3008 | 15.3858 |
0.0297 | 3.9571 | 1290 | 0.2945 | 16.1384 |
0.0216 | 4.0031 | 1305 | 0.2965 | 14.6816 |
0.0105 | 4.0491 | 1320 | 0.3202 | 14.2581 |
0.0112 | 4.0951 | 1335 | 0.3319 | 14.1689 |
0.0107 | 4.1411 | 1350 | 0.3256 | 14.2437 |
0.0091 | 4.1871 | 1365 | 0.3261 | 14.3560 |
0.0082 | 4.2331 | 1380 | 0.3325 | 14.2735 |
0.0096 | 4.2791 | 1395 | 0.3356 | 15.0887 |
0.0107 | 4.3252 | 1410 | 0.3372 | 14.5980 |
0.0087 | 4.3712 | 1425 | 0.3399 | 14.7697 |
0.0114 | 4.4172 | 1440 | 0.3387 | 15.6224 |
0.0069 | 4.4632 | 1455 | 0.3371 | 15.2032 |
0.0075 | 4.5092 | 1470 | 0.3384 | 15.5563 |
0.0076 | 4.5552 | 1485 | 0.3375 | 15.8842 |
0.0061 | 4.6012 | 1500 | 0.3389 | 15.6213 |
0.0068 | 4.6472 | 1515 | 0.3404 | 15.4518 |
0.0095 | 4.6933 | 1530 | 0.3373 | 15.3594 |
0.0093 | 4.7393 | 1545 | 0.3353 | 15.5156 |
0.0098 | 4.7853 | 1560 | 0.3367 | 15.7368 |
0.0072 | 4.8313 | 1575 | 0.3374 | 15.9799 |
0.0062 | 4.8773 | 1590 | 0.3389 | 15.6719 |
0.0072 | 4.9233 | 1605 | 0.3392 | 15.7841 |
0.0089 | 4.9693 | 1620 | 0.3388 | 15.6433 |
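Although the dataset is not documented, the step/epoch columns imply roughly 326 optimizer steps per epoch, and with a train batch size of 12 that corresponds to roughly 3,900 training examples. A back-of-the-envelope estimate from the table, not a stated dataset size:

```python
TRAIN_BATCH_SIZE = 12  # from the hyperparameters above

# From the table: step 330 corresponds to epoch 1.0123
steps_per_epoch = round(330 / 1.0123)
approx_train_examples = steps_per_epoch * TRAIN_BATCH_SIZE

print(steps_per_epoch, approx_train_examples)
```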
### Framework versions
- Transformers 4.44.0.dev0
- PyTorch 2.1.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
## Model tree for golesheed/whisper-v2-Hollandic_WestFrisian_WestUtrecht

Base model: [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2)