wav2vec2-xls-r-300m-Fleurs_AMMI_AFRIVOICE_LRSC-ln-1hrs-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The training script did not record a dataset name; judging from the model name, it was fine-tuned on roughly 1 hour of Lingala (ln) speech drawn from the FLEURS, AMMI, AFRIVOICE, and LRSC corpora. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 1.8424
  • WER: 0.7133
  • CER: 0.2385
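
You can transcribe audio with this checkpoint directly through Transformers. Below is a minimal inference sketch, assuming the repository includes the processor/tokenizer files and that `example.wav` is a placeholder path to your own 16 kHz-compatible audio:

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "asr-africa/wav2vec2-xls-r-300m-Fleurs_AMMI_AFRIVOICE_LRSC-ln-1hrs-v1"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate XLS-R was pretrained on.
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)  # placeholder file

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the best token per frame; batch_decode
# collapses repeats and strips blank tokens.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```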

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
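
A hedged sketch of how these values map onto `transformers.TrainingArguments` (argument names as of Transformers 4.48; the output directory is a placeholder, and model/data wiring is omitted):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-ln-1hrs-v1",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,  # effective train batch size: 4 * 2 = 8
    seed=42,
    optim="adamw_torch",            # AdamW; betas=(0.9, 0.999), eps=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # native AMP mixed-precision training
)
```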

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 7.9233 | 1.0 | 39 | 2.9633 | 1.0 | 1.0 |
| 2.8667 | 2.0 | 78 | 2.8788 | 1.0 | 1.0 |
| 2.9406 | 3.0 | 117 | 2.8635 | 1.0 | 1.0 |
| 2.8363 | 4.0 | 156 | 2.8408 | 1.0 | 1.0 |
| 2.8262 | 5.0 | 195 | 2.8456 | 1.0 | 1.0 |
| 2.84 | 6.0 | 234 | 2.8458 | 1.0 | 1.0 |
| 2.8195 | 7.0 | 273 | 2.8430 | 1.0 | 1.0 |
| 2.8204 | 8.0 | 312 | 2.8129 | 1.0 | 1.0 |
| 2.804 | 9.0 | 351 | 2.8006 | 1.0 | 1.0 |
| 2.7869 | 10.0 | 390 | 2.8017 | 1.0 | 1.0 |
| 2.7856 | 11.0 | 429 | 2.7912 | 1.0 | 1.0 |
| 2.765 | 12.0 | 468 | 2.7441 | 1.0 | 1.0 |
| 2.7333 | 13.0 | 507 | 2.6953 | 1.0 | 1.0 |
| 2.6093 | 14.0 | 546 | 2.5490 | 1.0 | 1.0 |
| 2.4857 | 15.0 | 585 | 2.3470 | 1.0 | 1.0 |
| 2.2251 | 16.0 | 624 | 2.0507 | 1.0 | 0.6812 |
| 1.9855 | 17.0 | 663 | 1.8561 | 1.0 | 0.6490 |
| 1.7317 | 18.0 | 702 | 1.6075 | 1.0 | 0.5509 |
| 1.5049 | 19.0 | 741 | 1.5265 | 0.9993 | 0.5068 |
| 1.4006 | 20.0 | 780 | 1.4114 | 0.9426 | 0.4247 |
| 1.2416 | 21.0 | 819 | 1.3780 | 0.9512 | 0.4098 |
| 1.1441 | 22.0 | 858 | 1.3123 | 0.8908 | 0.3474 |
| 1.0138 | 23.0 | 897 | 1.2678 | 0.8808 | 0.3405 |
| 0.9221 | 24.0 | 936 | 1.2555 | 0.8739 | 0.3365 |
| 0.8665 | 25.0 | 975 | 1.2717 | 0.8634 | 0.3380 |
| 0.8189 | 26.0 | 1014 | 1.1927 | 0.8557 | 0.3150 |
| 0.712 | 27.0 | 1053 | 1.2620 | 0.8260 | 0.3013 |
| 0.6586 | 28.0 | 1092 | 1.2498 | 0.8221 | 0.3015 |
| 0.6092 | 29.0 | 1131 | 1.1750 | 0.8127 | 0.2934 |
| 0.5484 | 30.0 | 1170 | 1.2835 | 0.8137 | 0.2887 |
| 0.504 | 31.0 | 1209 | 1.2157 | 0.8175 | 0.2967 |
| 0.5021 | 32.0 | 1248 | 1.2737 | 0.8099 | 0.2937 |
| 0.4412 | 33.0 | 1287 | 1.2819 | 0.7946 | 0.2831 |
| 0.3965 | 34.0 | 1326 | 1.3634 | 0.8043 | 0.2827 |
| 0.4001 | 35.0 | 1365 | 1.3748 | 0.8146 | 0.2922 |
| 0.3519 | 36.0 | 1404 | 1.3540 | 0.8053 | 0.2777 |
| 0.36 | 37.0 | 1443 | 1.3906 | 0.7995 | 0.2891 |
| 0.3035 | 38.0 | 1482 | 1.4942 | 0.8291 | 0.2811 |
| 0.311 | 39.0 | 1521 | 1.4371 | 0.7835 | 0.2798 |
| 0.2896 | 40.0 | 1560 | 1.4310 | 0.7827 | 0.2698 |
| 0.2588 | 41.0 | 1599 | 1.5232 | 0.7824 | 0.2760 |
| 0.2495 | 42.0 | 1638 | 1.4698 | 0.7726 | 0.2663 |
| 0.2527 | 43.0 | 1677 | 1.4726 | 0.7628 | 0.2705 |
| 0.221 | 44.0 | 1716 | 1.5491 | 0.7570 | 0.2659 |
| 0.2203 | 45.0 | 1755 | 1.5355 | 0.7699 | 0.2687 |
| 0.2133 | 46.0 | 1794 | 1.5278 | 0.8099 | 0.2722 |
| 0.2062 | 47.0 | 1833 | 1.5152 | 0.7555 | 0.2605 |
| 0.1936 | 48.0 | 1872 | 1.5899 | 0.7680 | 0.2680 |
| 0.1865 | 49.0 | 1911 | 1.5933 | 0.7626 | 0.2610 |
| 0.2022 | 50.0 | 1950 | 1.6041 | 0.7534 | 0.2579 |
| 0.1913 | 51.0 | 1989 | 1.5845 | 0.7476 | 0.2597 |
| 0.1784 | 52.0 | 2028 | 1.6421 | 0.7620 | 0.2628 |
| 0.1696 | 53.0 | 2067 | 1.7008 | 0.7569 | 0.2619 |
| 0.156 | 54.0 | 2106 | 1.6896 | 0.7616 | 0.2593 |
| 0.1424 | 55.0 | 2145 | 1.6266 | 0.7510 | 0.2597 |
| 0.1474 | 56.0 | 2184 | 1.7046 | 0.7394 | 0.2545 |
| 0.147 | 57.0 | 2223 | 1.7005 | 0.7419 | 0.2566 |
| 0.1752 | 58.0 | 2262 | 1.6940 | 0.7476 | 0.2533 |
| 0.1338 | 59.0 | 2301 | 1.7020 | 0.7447 | 0.2529 |
| 0.1352 | 60.0 | 2340 | 1.6386 | 0.7406 | 0.2528 |
| 0.1164 | 61.0 | 2379 | 1.7665 | 0.7637 | 0.2544 |
| 0.1307 | 62.0 | 2418 | 1.7711 | 0.7396 | 0.2513 |
| 0.117 | 63.0 | 2457 | 1.7694 | 0.7296 | 0.2479 |
| 0.1196 | 64.0 | 2496 | 1.7590 | 0.7368 | 0.2506 |
| 0.1125 | 65.0 | 2535 | 1.7932 | 0.7405 | 0.2493 |
| 0.1155 | 66.0 | 2574 | 1.7361 | 0.7390 | 0.2490 |
| 0.123 | 67.0 | 2613 | 1.7788 | 0.7343 | 0.2498 |
| 0.1163 | 68.0 | 2652 | 1.7992 | 0.7275 | 0.2476 |
| 0.1075 | 69.0 | 2691 | 1.7817 | 0.7366 | 0.2483 |
| 0.1007 | 70.0 | 2730 | 1.7929 | 0.7397 | 0.2504 |
| 0.0987 | 71.0 | 2769 | 1.8354 | 0.7320 | 0.2491 |
| 0.1026 | 72.0 | 2808 | 1.8047 | 0.7266 | 0.2463 |
| 0.1038 | 73.0 | 2847 | 1.8150 | 0.7319 | 0.2482 |
| 0.1004 | 74.0 | 2886 | 1.8303 | 0.7244 | 0.2446 |
| 0.0983 | 75.0 | 2925 | 1.7693 | 0.7256 | 0.2456 |
| 0.0961 | 76.0 | 2964 | 1.7943 | 0.7177 | 0.2432 |
| 0.1034 | 77.0 | 3003 | 1.8095 | 0.7270 | 0.2450 |
| 0.1 | 78.0 | 3042 | 1.8075 | 0.7348 | 0.2444 |
| 0.0988 | 79.0 | 3081 | 1.7991 | 0.7226 | 0.2431 |
| 0.0867 | 80.0 | 3120 | 1.8438 | 0.7316 | 0.2441 |
| 0.0951 | 81.0 | 3159 | 1.8312 | 0.7284 | 0.2430 |
| 0.103 | 82.0 | 3198 | 1.8002 | 0.7208 | 0.2419 |
| 0.104 | 83.0 | 3237 | 1.8522 | 0.7229 | 0.2444 |
| 0.0874 | 84.0 | 3276 | 1.8207 | 0.7222 | 0.2416 |
| 0.087 | 85.0 | 3315 | 1.8349 | 0.7176 | 0.2400 |
| 0.0861 | 86.0 | 3354 | 1.8445 | 0.7229 | 0.2405 |
| 0.0845 | 87.0 | 3393 | 1.8139 | 0.7173 | 0.2393 |
| 0.0841 | 88.0 | 3432 | 1.8183 | 0.7160 | 0.2406 |
| 0.0723 | 89.0 | 3471 | 1.8424 | 0.7151 | 0.2406 |
| 0.0791 | 90.0 | 3510 | 1.8458 | 0.7187 | 0.2410 |
| 0.0802 | 91.0 | 3549 | 1.8563 | 0.7205 | 0.2403 |
| 0.0871 | 92.0 | 3588 | 1.8370 | 0.7124 | 0.2391 |
| 0.0852 | 93.0 | 3627 | 1.8476 | 0.7163 | 0.2389 |
| 0.0821 | 94.0 | 3666 | 1.8523 | 0.7130 | 0.2387 |
| 0.0841 | 95.0 | 3705 | 1.8473 | 0.7139 | 0.2385 |
| 0.0746 | 96.0 | 3744 | 1.8415 | 0.7130 | 0.2384 |
| 0.0888 | 97.0 | 3783 | 1.8423 | 0.7133 | 0.2383 |
| 0.124 | 97.4416 | 3800 | 1.8424 | 0.7133 | 0.2385 |
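
The WER and CER columns can be reproduced with the `evaluate` library. A minimal sketch; the transcripts below are hypothetical stand-ins for model outputs and references collected from the evaluation set:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["mbote na bino"]        # hypothetical model output
references = ["mbote na bino nyonso"]  # hypothetical reference transcript

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```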

Framework versions

  • Transformers 4.48.2
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0