
wav2vec2-xls-r-300m-lg-CV-414hrs-v10

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on a Luganda (lg) Common Voice dataset (approximately 414 hours, per the model name). It achieves the following results on the evaluation set:

  • Loss: 0.3742
  • WER: 0.1098
  • CER: 0.0343
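
WER (word error rate) and CER (character error rate) are reported as fractions; lower is better. These values can be recomputed with the Hugging Face evaluate library; a minimal sketch, where the transcript pairs are hypothetical placeholders, not data from this model's evaluation set:

```python
# Minimal sketch for recomputing WER/CER with the `evaluate` library.
# The transcript pairs below are hypothetical placeholders, not data
# from this model's actual evaluation set.
import evaluate

wer_metric = evaluate.load("wer")  # requires `pip install evaluate jiwer`
cer_metric = evaluate.load("cer")

references = ["webale nnyo ssebo"]   # ground-truth transcripts (placeholder)
predictions = ["webale nyo ssebo"]   # model outputs (placeholder)

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```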

Model description

More information needed

Intended uses & limitations

More information needed
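
No usage guidance is documented, but the model can presumably be used like any other fine-tuned Wav2Vec2 CTC checkpoint, for example through the transformers ASR pipeline. A minimal sketch; the audio file name is a placeholder, and input audio should be 16 kHz mono, as expected by XLS-R models:

```python
# Minimal inference sketch, assuming the standard Wav2Vec2 CTC interface.
# "audio.wav" is a placeholder path; the pipeline resamples input audio
# to the 16 kHz rate expected by XLS-R models.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/wav2vec2-xls-r-300m-lg-CV-414hrs-v10",
)

result = asr("audio.wav")
print(result["text"])
```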

Training and evaluation data

Not documented in detail. Per the model name, training presumably used roughly 414 hours of Luganda (lg) Common Voice speech; the exact splits and preprocessing are not specified.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 70
  • mixed_precision_training: Native AMP
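
As a reproducibility aid, these values map onto a transformers TrainingArguments object roughly as follows. This is a sketch of an assumed setup, not the authors' actual training script; output_dir is a placeholder, and fp16=True is an assumption standing in for "Native AMP":

```python
# Sketch of TrainingArguments matching the hyperparameters above.
# Assumed reconstruction, not the authors' actual script; output_dir
# is a placeholder and fp16=True stands in for "Native AMP".
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-lg-CV-414hrs-v10",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 4 * 2 = 8
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=70,
    fp16=True,                       # mixed precision ("Native AMP")
)
```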

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|---------------|-------|------|-----------------|-----|-----|
| 0.4484 | 1.0 | 32306 | 0.3944 | 0.3772 | 0.0986 |
| 0.2653 | 2.0 | 64612 | 0.3131 | 0.3193 | 0.0836 |
| 0.226 | 3.0 | 96918 | 0.3028 | 0.2965 | 0.0784 |
| 0.2019 | 4.0 | 129224 | 0.2732 | 0.2778 | 0.0721 |
| 0.1849 | 5.0 | 161530 | 0.2768 | 0.2626 | 0.0685 |
| 0.1712 | 6.0 | 193836 | 0.2824 | 0.2594 | 0.0694 |
| 0.1589 | 7.0 | 226142 | 0.2490 | 0.2504 | 0.0654 |
| 0.1496 | 8.0 | 258448 | 0.2502 | 0.2323 | 0.0621 |
| 0.1407 | 9.0 | 290754 | 0.2438 | 0.2273 | 0.0598 |
| 0.1333 | 10.0 | 323060 | 0.2510 | 0.2239 | 0.0613 |
| 0.1265 | 11.0 | 355366 | 0.2317 | 0.2227 | 0.0594 |
| 0.12 | 12.0 | 387672 | 0.2839 | 0.2147 | 0.0579 |
| 0.1137 | 13.0 | 419978 | 0.2698 | 0.2073 | 0.0570 |
| 0.108 | 14.0 | 452284 | 0.2360 | 0.2055 | 0.0554 |
| 0.1026 | 15.0 | 484590 | 0.2390 | 0.1995 | 0.0552 |
| 0.0975 | 16.0 | 516896 | 0.2494 | 0.1951 | 0.0530 |
| 0.0925 | 17.0 | 549202 | 0.2465 | 0.1925 | 0.0542 |
| 0.088 | 18.0 | 581508 | 0.2313 | 0.1870 | 0.0530 |
| 0.0832 | 19.0 | 613814 | 0.2468 | 0.1856 | 0.0513 |
| 0.0794 | 20.0 | 646120 | 0.2410 | 0.1905 | 0.0513 |
| 0.0753 | 21.0 | 678426 | 0.2430 | 0.1787 | 0.0492 |
| 0.0721 | 22.0 | 710732 | 0.2373 | 0.1774 | 0.0498 |
| 0.0686 | 23.0 | 743038 | 0.2618 | 0.1774 | 0.0501 |
| 0.0651 | 24.0 | 775344 | 0.2468 | 0.1698 | 0.0487 |
| 0.0618 | 25.0 | 807650 | 0.2813 | 0.1660 | 0.0477 |
| 0.0594 | 26.0 | 839956 | 0.2716 | 0.1674 | 0.0470 |
| 0.0565 | 27.0 | 872262 | 0.2525 | 0.1584 | 0.0460 |
| 0.0539 | 28.0 | 904568 | 0.2778 | 0.1596 | 0.0466 |
| 0.0523 | 29.0 | 936874 | 0.2628 | 0.1588 | 0.0450 |
| 0.0502 | 30.0 | 969180 | 0.2699 | 0.1543 | 0.0456 |
| 0.0481 | 31.0 | 1001486 | 0.2680 | 0.1517 | 0.0446 |
| 0.0461 | 32.0 | 1033792 | 0.3047 | 0.1581 | 0.0460 |
| 0.0444 | 33.0 | 1066098 | 0.2902 | 0.1553 | 0.0444 |
| 0.0427 | 34.0 | 1098404 | 0.2601 | 0.1506 | 0.0432 |
| 0.0412 | 35.0 | 1130710 | 0.2839 | 0.1483 | 0.0428 |
| 0.0397 | 36.0 | 1163016 | 0.2759 | 0.1387 | 0.0418 |
| 0.0383 | 37.0 | 1195322 | 0.2770 | 0.1501 | 0.0427 |
| 0.0371 | 38.0 | 1227628 | 0.2885 | 0.1424 | 0.0423 |
| 0.036 | 39.0 | 1259934 | 0.2883 | 0.1421 | 0.0418 |
| 0.0346 | 40.0 | 1292240 | 0.2915 | 0.1383 | 0.0414 |
| 0.0337 | 41.0 | 1324546 | 0.2842 | 0.1390 | 0.0408 |
| 0.0324 | 42.0 | 1356852 | 0.3078 | 0.1356 | 0.0412 |
| 0.0313 | 43.0 | 1389158 | 0.3268 | 0.1375 | 0.0401 |
| 0.0306 | 44.0 | 1421464 | 0.3019 | 0.1364 | 0.0411 |
| 0.0295 | 45.0 | 1453770 | 0.3077 | 0.1351 | 0.0401 |
| 0.0284 | 46.0 | 1486076 | 0.3226 | 0.1352 | 0.0404 |
| 0.0274 | 47.0 | 1518382 | 0.2993 | 0.1335 | 0.0401 |
| 0.0264 | 48.0 | 1550688 | 0.3139 | 0.1302 | 0.0384 |
| 0.0256 | 49.0 | 1582994 | 0.3338 | 0.1331 | 0.0397 |
| 0.025 | 50.0 | 1615300 | 0.3323 | 0.1260 | 0.0388 |
| 0.024 | 51.0 | 1647606 | 0.3184 | 0.1291 | 0.0390 |
| 0.0233 | 52.0 | 1679912 | 0.3239 | 0.1308 | 0.0385 |
| 0.0226 | 53.0 | 1712218 | 0.3308 | 0.1250 | 0.0377 |
| 0.0218 | 54.0 | 1744524 | 0.3498 | 0.1296 | 0.0384 |
| 0.0212 | 55.0 | 1776830 | 0.3390 | 0.1246 | 0.0374 |
| 0.0204 | 56.0 | 1809136 | 0.3439 | 0.1259 | 0.0379 |
| 0.0197 | 57.0 | 1841442 | 0.3490 | 0.1227 | 0.0374 |
| 0.019 | 58.0 | 1873748 | 0.3406 | 0.1228 | 0.0382 |
| 0.0184 | 59.0 | 1906054 | 0.3401 | 0.1213 | 0.0374 |
| 0.0179 | 60.0 | 1938360 | 0.3523 | 0.1206 | 0.0366 |
| 0.017 | 61.0 | 1970666 | 0.3844 | 0.1200 | 0.0360 |
| 0.0165 | 62.0 | 2002972 | 0.3582 | 0.1183 | 0.0356 |
| 0.0162 | 63.0 | 2035278 | 0.3759 | 0.1163 | 0.0356 |
| 0.0155 | 64.0 | 2067584 | 0.3641 | 0.1162 | 0.0363 |
| 0.015 | 65.0 | 2099890 | 0.3634 | 0.1168 | 0.0345 |
| 0.0145 | 66.0 | 2132196 | 0.3739 | 0.1140 | 0.0348 |
| 0.014 | 67.0 | 2164502 | 0.3708 | 0.1137 | 0.0347 |
| 0.0137 | 68.0 | 2196808 | 0.3715 | 0.1139 | 0.0347 |
| 0.0136 | 69.0 | 2229114 | 0.3787 | 0.1114 | 0.0345 |
| 0.0131 | 70.0 | 2261420 | 0.3742 | 0.1098 | 0.0343 |

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
Model size: 315M parameters (F32, safetensors)

Model tree for asr-africa/wav2vec2-xls-r-300m-lg-CV-414hrs-v10

Finetuned from facebook/wav2vec2-xls-r-300m