wav2vec2-large-robust-paper

This model is a fine-tuned version of facebook/wav2vec2-large-robust on the HTS98/ORIGINAL_VER1.2 - NA dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.8696
  • WER: 0.4572
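
For illustration, here is a minimal transcription sketch. It assumes the checkpoint is published on the Hub under a placeholder repo id like `your-username/wav2vec2-large-robust-paper` and that the model expects 16 kHz mono audio, as wav2vec2 models typically do; the audio file name is also hypothetical.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id -- substitute the actual Hub path of this model.
model_id = "your-username/wav2vec2-large-robust-paper"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file (hypothetical) and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("sample.wav")
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# Use the first channel; the processor expects a 1-D array of samples.
inputs = processor(
    waveform[0].numpy(),
    sampling_rate=16_000,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the argmax token at each frame, then collapse.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```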

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 10
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 420
  • num_epochs: 50.0
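
As a rough illustration only, these settings map onto Hugging Face `TrainingArguments` as sketched below. The `output_dir`, `evaluation_strategy`, and anything else not listed above are assumptions, not taken from the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-robust-paper",  # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=420,
    num_train_epochs=50.0,
    evaluation_strategy="epoch",   # assumed from the per-epoch eval log below
)
```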

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| No log        | 1.0   | 335   | 3.9163          | 1.0    |
| 7.1369        | 2.0   | 670   | 3.3422          | 1.0    |
| 3.3448        | 3.0   | 1005  | 3.3355          | 1.0    |
| 3.3448        | 4.0   | 1340  | 3.3263          | 1.0    |
| 3.3277        | 5.0   | 1675  | 2.8928          | 1.0079 |
| 2.6655        | 6.0   | 2010  | 1.7822          | 0.8788 |
| 2.6655        | 7.0   | 2345  | 1.3193          | 0.7055 |
| 1.4617        | 8.0   | 2680  | 1.1408          | 0.6070 |
| 1.0805        | 9.0   | 3015  | 1.0108          | 0.5422 |
| 1.0805        | 10.0  | 3350  | 0.9517          | 0.5154 |
| 0.8759        | 11.0  | 3685  | 0.9082          | 0.4902 |
| 0.7462        | 12.0  | 4020  | 0.8758          | 0.4706 |
| 0.7462        | 13.0  | 4355  | 0.8696          | 0.4572 |
| 0.6429        | 14.0  | 4690  | 0.8731          | 0.4535 |
| 0.5672        | 15.0  | 5025  | 0.8749          | 0.4508 |
| 0.5672        | 16.0  | 5360  | 0.8753          | 0.4512 |
| 0.4959        | 17.0  | 5695  | 0.9039          | 0.4487 |
| 0.4456        | 18.0  | 6030  | 0.9161          | 0.4433 |
| 0.4456        | 19.0  | 6365  | 0.9506          | 0.4430 |
| 0.392         | 20.0  | 6700  | 0.9412          | 0.4439 |
| 0.3594        | 21.0  | 7035  | 0.9884          | 0.4416 |
| 0.3594        | 22.0  | 7370  | 1.0222          | 0.4510 |
| 0.3175        | 23.0  | 7705  | 1.0345          | 0.4439 |
| 0.2947        | 24.0  | 8040  | 1.0849          | 0.4465 |
| 0.2947        | 25.0  | 8375  | 1.0879          | 0.4472 |
| 0.2674        | 26.0  | 8710  | 1.1071          | 0.4512 |
| 0.2521        | 27.0  | 9045  | 1.1147          | 0.4494 |
| 0.2521        | 28.0  | 9380  | 1.1426          | 0.4525 |
| 0.2321        | 29.0  | 9715  | 1.1592          | 0.4440 |
| 0.2235        | 30.0  | 10050 | 1.1782          | 0.4450 |
| 0.2235        | 31.0  | 10385 | 1.2050          | 0.4437 |
| 0.2071        | 32.0  | 10720 | 1.2224          | 0.4400 |
| 0.1951        | 33.0  | 11055 | 1.2270          | 0.4471 |
| 0.1951        | 34.0  | 11390 | 1.2466          | 0.4483 |
| 0.1892        | 35.0  | 11725 | 1.2325          | 0.4429 |
| 0.1809        | 36.0  | 12060 | 1.2755          | 0.4427 |
| 0.1809        | 37.0  | 12395 | 1.2675          | 0.4422 |
| 0.1746        | 38.0  | 12730 | 1.3022          | 0.4418 |
| 0.1656        | 39.0  | 13065 | 1.3179          | 0.4408 |
| 0.1656        | 40.0  | 13400 | 1.2934          | 0.4425 |
| 0.1614        | 41.0  | 13735 | 1.3304          | 0.4426 |
| 0.1564        | 42.0  | 14070 | 1.3148          | 0.4420 |
| 0.1564        | 43.0  | 14405 | 1.3267          | 0.4433 |
| 0.1546        | 44.0  | 14740 | 1.3331          | 0.4413 |
| 0.1515        | 45.0  | 15075 | 1.3445          | 0.4388 |
| 0.1515        | 46.0  | 15410 | 1.3530          | 0.4372 |
| 0.147         | 47.0  | 15745 | 1.3443          | 0.4385 |
| 0.1447        | 48.0  | 16080 | 1.3503          | 0.4369 |
| 0.1447        | 49.0  | 16415 | 1.3590          | 0.4393 |
| 0.1437        | 50.0  | 16750 | 1.3668          | 0.4372 |
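
The reported evaluation numbers (loss 0.8696, WER 0.4572) match the epoch-13 row; validation loss climbs after that point while WER plateaus around 0.44, which suggests the earlier checkpoint was the one kept. For reference, the WER column is word error rate, which can be computed with the `evaluate` library; a minimal sketch with made-up strings:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Made-up example pair; in practice predictions come from the model
# and references from the dataset's transcripts.
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 1 substitution over 6 reference words -> 0.1667
```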

Framework versions

  • Transformers 4.31.0.dev0
  • Pytorch 2.0.0+cu117
  • Datasets 2.7.0
  • Tokenizers 0.13.2
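
To compare a local environment against these versions, a quick check (note that 4.31.0.dev0 is a development build installed from source, not a PyPI release):

```python
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # 4.31.0.dev0 at training time
print("PyTorch:", torch.__version__)              # 2.0.0+cu117
print("Datasets:", datasets.__version__)          # 2.7.0
print("Tokenizers:", tokenizers.__version__)      # 0.13.2
```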