
scream_small_beta

This model is a fine-tuned version of openai/whisper-small on the NbAiLab/ncc_speech dataset. It achieves the following results on the evaluation sets:

  • step: 24999
  • validation_fleurs_loss: 0.3973
  • train_loss: 0.6021
  • validation_fleurs_wer: 9.3099
  • validation_fleurs_cer: 4.1971
  • validation_fleurs_exact_wer: 12.8734
  • validation_fleurs_exact_cer: 5.2187
  • validation_stortinget_loss: 0.2984
  • validation_stortinget_wer: 14.6028
  • validation_stortinget_cer: 10.4391
  • validation_stortinget_exact_wer: 18.0343
  • validation_stortinget_exact_cer: 11.0177
  • validation_nrk_tv_loss: 0.7428
  • validation_nrk_tv_wer: 42.2159
  • validation_nrk_tv_cer: 32.6028
  • validation_nrk_tv_exact_wer: 49.9168
  • validation_nrk_tv_exact_cer: 33.8385

Model description

More information needed

Intended uses & limitations

More information needed
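
Pending fuller documentation, the model can be tried out for Norwegian speech recognition with the transformers ASR pipeline. The sketch below is illustrative only: the repository ID is the one shown on this page, while the audio file name and chunk length are placeholders.

```python
# Minimal transcription sketch (assumptions: a local 16 kHz audio file
# named "sample.wav"; chunk_length_s chosen arbitrarily for long audio).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLabArchive/scream_small_beta",
)

result = asr("sample.wav", chunk_length_s=30)
print(result["text"])
```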

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a short sketch after the list checks the batch-size arithmetic):

  • learning_rate: 5e-05
  • lr_scheduler_type: linear
  • per_device_train_batch_size: 32
  • total_train_batch_size_per_node: 128
  • total_train_batch_size: 1024
  • total_optimization_steps: 25,000
  • starting_optimization_step: None
  • finishing_optimization_step: 25,000
  • num_train_dataset_workers: 32
  • num_hosts: 8
  • total_num_training_examples: 25,600,000
  • steps_per_epoch: 6259
  • num_beams: None
  • dropout: True
  • bpe_dropout_probability: 0.1
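
The batch-size and step figures above are mutually consistent. The sketch below derives them; the variable names are mine, and only the numeric values come from this card.

```python
# Sanity-check of the batch-size and step accounting listed above.
per_device_train_batch_size = 32
num_hosts = 8
total_train_batch_size_per_node = 128

# 128 / 32 implies 4 devices per host (an inference, not stated explicitly).
devices_per_host = total_train_batch_size_per_node // per_device_train_batch_size

total_train_batch_size = per_device_train_batch_size * devices_per_host * num_hosts
total_optimization_steps = 25_000
steps_per_epoch = 6259

print(total_train_batch_size)                                # 1024
print(total_train_batch_size * total_optimization_steps)     # 25,600,000 examples seen
print(round(total_optimization_steps / steps_per_epoch, 2))  # ~4 passes over the data
```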

Training results

| step | validation_fleurs_loss | train_loss | validation_fleurs_wer | validation_fleurs_cer | validation_fleurs_exact_wer | validation_fleurs_exact_cer | validation_stortinget_loss | validation_stortinget_wer | validation_stortinget_cer | validation_stortinget_exact_wer | validation_stortinget_exact_cer | validation_nrk_tv_loss | validation_nrk_tv_wer | validation_nrk_tv_cer | validation_nrk_tv_exact_wer | validation_nrk_tv_exact_cer |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1.2013 | 2.7117 | 32.3914 | 9.8343 | 35.7228 | 10.9398 | 1.4988 | 44.0673 | 22.9444 | 48.2612 | 24.2595 | 1.8165 | 79.9390 | 54.6020 | 89.7612 | 56.8482 |
| 1000 | 0.5796 | 1.0147 | 16.1214 | 5.2624 | 19.9821 | 6.2962 | 0.4822 | 22.0502 | 13.3652 | 25.7586 | 14.0827 | 1.0170 | 51.9187 | 37.4011 | 59.7853 | 39.0187 |
| 2000 | 0.4483 | 0.8851 | 12.4628 | 4.6064 | 16.2485 | 5.6101 | 0.3988 | 18.2903 | 11.9625 | 21.9050 | 12.6098 | 0.9032 | 46.8241 | 34.8122 | 55.1298 | 36.2314 |
| 3000 | 0.4130 | 0.8246 | 11.6002 | 4.7445 | 15.4122 | 5.7357 | 0.3602 | 16.9068 | 11.3683 | 20.4599 | 11.9897 | 0.8434 | 46.9972 | 35.4892 | 54.8885 | 36.8431 |
| 4000 | 0.3946 | 0.7897 | 10.2617 | 4.2365 | 14.4564 | 5.1703 | 0.3359 | 16.1132 | 11.0146 | 19.5868 | 11.6112 | 0.8112 | 44.8580 | 33.8810 | 52.6086 | 35.2519 |
| 5000 | 0.4532 | 0.7438 | 10.3807 | 4.2809 | 14.1876 | 5.2090 | 0.3295 | 15.7676 | 10.8729 | 19.2134 | 11.4603 | 0.8051 | 44.2068 | 33.3323 | 51.3438 | 34.6898 |
| 6000 | 0.4496 | 0.7275 | 10.1725 | 4.1182 | 13.9785 | 5.1075 | 0.3247 | 15.3487 | 10.6600 | 18.8008 | 11.2647 | 0.8003 | 43.8399 | 33.1808 | 51.5810 | 34.5430 |
| 7000 | 0.4061 | 0.7164 | 10.0535 | 4.4190 | 13.8292 | 5.3829 | 0.3183 | 15.0975 | 10.5465 | 18.5450 | 11.1334 | 0.7788 | 43.4813 | 33.2975 | 51.5227 | 34.6075 |
| 8000 | 0.3531 | 0.7066 | 9.4587 | 4.0590 | 13.2616 | 4.9915 | 0.3088 | 15.0711 | 10.5598 | 18.4922 | 11.1406 | 0.7575 | 43.4318 | 33.3995 | 51.1192 | 34.7187 |
| 9000 | 0.3529 | 0.6867 | 10.0833 | 4.2612 | 14.2174 | 5.3684 | 0.3107 | 14.8659 | 10.4674 | 18.3762 | 11.0681 | 0.7651 | 41.5811 | 31.9552 | 49.5507 | 33.2483 |
| 10000 | 0.4180 | 0.6707 | 9.3099 | 4.2711 | 13.0526 | 5.2621 | 0.3090 | 14.9093 | 10.4711 | 18.3745 | 11.0540 | 0.7626 | 42.2530 | 32.3733 | 49.9251 | 33.5916 |
| 11000 | 0.3910 | 0.6874 | 9.7561 | 4.4881 | 13.9188 | 5.5859 | 0.3046 | 15.1792 | 10.7022 | 18.5484 | 11.2682 | 0.7605 | 42.7847 | 32.9575 | 50.4368 | 34.1589 |
| 12000 | 0.4032 | 0.6411 | 9.9048 | 4.4239 | 13.5006 | 5.4409 | 0.3052 | 14.7986 | 10.3955 | 18.1349 | 10.9770 | 0.7578 | 42.5663 | 32.5781 | 50.1914 | 33.8385 |
| 13000 | 0.3947 | 0.6516 | 8.9827 | 4.1132 | 13.2915 | 5.2283 | 0.3060 | 14.7458 | 10.3560 | 18.1673 | 10.9388 | 0.7544 | 42.6569 | 32.8756 | 50.4493 | 34.1344 |
| 14000 | 0.3708 | 0.6618 | 9.6669 | 4.5177 | 13.4110 | 5.5762 | 0.3027 | 14.6471 | 10.4162 | 18.0343 | 10.9920 | 0.7516 | 42.3519 | 32.5672 | 49.7296 | 33.8119 |
| 15000 | 0.3782 | 0.6338 | 9.1612 | 4.0787 | 12.8734 | 5.0302 | 0.3026 | 14.6079 | 10.3933 | 17.9934 | 10.9722 | 0.7488 | 41.9933 | 32.2473 | 49.6214 | 33.4455 |
| 16000 | 0.4045 | 0.6512 | 9.3694 | 4.0047 | 13.0526 | 4.9915 | 0.3029 | 14.7007 | 10.4022 | 18.1273 | 10.9836 | 0.7533 | 42.2118 | 32.2921 | 50.0582 | 33.7370 |
| 17000 | 0.3916 | 0.6043 | 9.6669 | 4.4535 | 13.7097 | 5.4603 | 0.3020 | 14.6692 | 10.3537 | 18.0318 | 10.9227 | 0.7495 | 41.5935 | 31.9892 | 49.3177 | 33.2683 |
| 18000 | 0.3801 | 0.6113 | 10.2023 | 4.4634 | 13.8889 | 5.4699 | 0.3018 | 14.6496 | 10.3509 | 18.1085 | 10.9501 | 0.7493 | 41.8037 | 32.1082 | 49.6131 | 33.3899 |
| 19000 | 0.3911 | 0.6281 | 9.9048 | 4.4486 | 13.7097 | 5.5037 | 0.3000 | 14.6471 | 10.4582 | 18.0505 | 11.0356 | 0.7500 | 42.3684 | 32.4992 | 50.2829 | 33.9090 |
| 20000 | 0.4000 | 0.6169 | 9.3694 | 4.1034 | 13.2915 | 5.1510 | 0.3027 | 14.5424 | 10.3978 | 17.9056 | 10.9631 | 0.7474 | 41.9603 | 32.3076 | 49.7712 | 33.5486 |
| 21000 | 0.3887 | 0.6214 | 9.3397 | 4.3450 | 13.0227 | 5.3491 | 0.3013 | 14.7092 | 10.4834 | 18.1537 | 11.0670 | 0.7455 | 41.9274 | 32.4389 | 49.5299 | 33.6591 |
| 22000 | 0.4062 | 0.6133 | 9.7858 | 4.4387 | 13.5305 | 5.4506 | 0.3024 | 14.7263 | 10.5176 | 18.1980 | 11.1049 | 0.7493 | 41.7089 | 31.9536 | 49.5299 | 33.2446 |
| 23000 | 0.4156 | 0.5867 | 9.5479 | 4.3549 | 13.2019 | 5.3926 | 0.3017 | 14.7246 | 10.5767 | 18.1358 | 11.1493 | 0.7478 | 41.7831 | 32.1577 | 49.6713 | 33.4337 |
| 24000 | 0.4056 | 0.5913 | 9.7264 | 4.4486 | 13.2915 | 5.4651 | 0.3000 | 14.7016 | 10.5556 | 18.1068 | 11.1303 | 0.7458 | 42.1335 | 32.2488 | 49.7213 | 33.4604 |
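
WER/CER figures like those above can be computed with the evaluate library. This is a minimal sketch with placeholder predictions and references; the exact text normalization behind the "exact" variants is not documented here, so treat the snippet as illustrative rather than a reproduction recipe.

```python
# Illustrative WER/CER computation; the strings below are placeholders,
# not data from this model card.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["dette er en test"]
references = ["dette er en test av modellen"]

# Metrics are reported as percentages in the table above, hence the * 100.
print(100 * wer_metric.compute(predictions=predictions, references=references))
print(100 * cer_metric.compute(predictions=predictions, references=references))
```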

Framework versions

  • Transformers 4.31.0.dev0
  • Datasets 2.13.0
  • Tokenizers 0.13.3
