
Wav2Vec2-BERT Yoruba - Alvin Nahabwe

This model is a fine-tuned version of facebook/w2v-bert-2.0 on the NaijaVoices dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4907
  • WER: 0.0874
  • CER: 0.0340

Model description

More information needed

Intended uses & limitations

More information needed
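
The model can be used for Yoruba speech-to-text transcription. Below is a minimal inference sketch, assuming the standard Transformers CTC API for Wav2Vec2-BERT; the audio path is a placeholder, and the model expects 16 kHz mono input. Since the repository is gated, `from_pretrained` may need an access token once the access conditions have been accepted.

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "Alvin-Nahabwe/asr-africa-w2v-bert-2.0-naijavoices-yoruba-500hr-v0"
processor = AutoProcessor.from_pretrained(model_id)  # add token=... if the repo is gated
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# "sample.wav" is a placeholder path; resample to the 16 kHz mono
# input the feature extractor expects.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse
# repeats and blanks inside batch_decode.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```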

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments configuration follows the list):

  • learning_rate: 9e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • distributed_type: multi-GPU
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.025
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
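
For reference, these settings map onto the standard Trainer API roughly as follows. This is a sketch, not the author's actual training script; output_dir is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters expressed as TrainingArguments.
training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-naijavoices-yoruba-500hr",  # hypothetical
    learning_rate=9e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,  # 64 x 2 = 128 effective train batch
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.025,
    num_train_epochs=100.0,
    fp16=True,  # "Native AMP" mixed precision
)
```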

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 0.5689 | 0.9999 | 4112 | 0.4763 | 0.3693 | 0.1388 |
| 0.4554 | 2.0 | 8225 | 0.4470 | 0.3544 | 0.1344 |
| 0.4257 | 2.9999 | 12337 | 0.4219 | 0.3409 | 0.1282 |
| 0.3978 | 4.0 | 16450 | 0.4085 | 0.3316 | 0.1242 |
| 0.3781 | 4.9999 | 20562 | 0.4062 | 0.3308 | 0.1236 |
| 0.3565 | 6.0 | 24675 | 0.3751 | 0.3043 | 0.1123 |
| 0.3373 | 6.9999 | 28787 | 0.3612 | 0.2949 | 0.1089 |
| 0.3145 | 8.0 | 32900 | 0.3491 | 0.2856 | 0.1043 |
| 0.291 | 8.9999 | 37012 | 0.3338 | 0.2744 | 0.0994 |
| 0.2678 | 10.0 | 41125 | 0.3186 | 0.2648 | 0.0955 |
| 0.2431 | 10.9999 | 45237 | 0.3028 | 0.2540 | 0.0916 |
| 0.218 | 12.0 | 49350 | 0.2954 | 0.2417 | 0.0869 |
| 0.1971 | 12.9999 | 53462 | 0.2869 | 0.2361 | 0.0838 |
| 0.1783 | 14.0 | 57575 | 0.2731 | 0.2191 | 0.0780 |
| 0.1584 | 14.9999 | 61687 | 0.2688 | 0.2102 | 0.0746 |
| 0.1433 | 16.0 | 65800 | 0.2651 | 0.2063 | 0.0731 |
| 0.1332 | 16.9999 | 69912 | 0.2635 | 0.1966 | 0.0701 |
| 0.1198 | 18.0 | 74025 | 0.2621 | 0.1890 | 0.0676 |
| 0.1095 | 18.9999 | 78137 | 0.2527 | 0.1833 | 0.0653 |
| 0.1 | 20.0 | 82250 | 0.2574 | 0.1741 | 0.0621 |
| 0.0907 | 20.9999 | 86362 | 0.2468 | 0.1681 | 0.0599 |
| 0.0839 | 22.0 | 90475 | 0.2538 | 0.1658 | 0.0596 |
| 0.0773 | 22.9999 | 94587 | 0.2583 | 0.1610 | 0.0577 |
| 0.0707 | 24.0 | 98700 | 0.2665 | 0.1593 | 0.0574 |
| 0.0653 | 24.9999 | 102812 | 0.2603 | 0.1520 | 0.0547 |
| 0.0608 | 26.0 | 106925 | 0.2552 | 0.1539 | 0.0554 |
| 0.0568 | 26.9999 | 111037 | 0.2542 | 0.1482 | 0.0530 |
| 0.0524 | 28.0 | 115150 | 0.2657 | 0.1473 | 0.0534 |
| 0.0492 | 28.9999 | 119262 | 0.2602 | 0.1412 | 0.0511 |
| 0.0466 | 30.0 | 123375 | 0.2683 | 0.1395 | 0.0510 |
| 0.0444 | 30.9999 | 127487 | 0.2655 | 0.1376 | 0.0499 |
| 0.042 | 31.9999 | 131584 | 0.2597 | 0.1357 | 0.0496 |
| 0.0391 | 33.0 | 135697 | 0.2659 | 0.1313 | 0.0480 |
| 0.0368 | 33.9999 | 139809 | 0.2773 | 0.1309 | 0.0477 |
| 0.0351 | 35.0 | 143922 | 0.2765 | 0.1347 | 0.0493 |
| 0.0328 | 35.9999 | 148034 | 0.2719 | 0.1283 | 0.0468 |
| 0.0313 | 37.0 | 152147 | 0.2771 | 0.1300 | 0.0476 |
| 0.0299 | 37.9999 | 156259 | 0.2929 | 0.1257 | 0.0460 |
| 0.0297 | 39.0 | 160372 | 0.2875 | 0.1227 | 0.0452 |
| 0.0272 | 39.9999 | 164484 | 0.2868 | 0.1275 | 0.0469 |
| 0.0261 | 41.0 | 168597 | 0.2815 | 0.1236 | 0.0456 |
| 0.0243 | 41.9999 | 172709 | 0.2978 | 0.1189 | 0.0440 |
| 0.0235 | 43.0 | 176822 | 0.2861 | 0.1199 | 0.0446 |
| 0.0217 | 43.9999 | 180934 | 0.3017 | 0.1181 | 0.0441 |
| 0.022 | 45.0 | 185047 | 0.2931 | 0.1264 | 0.0464 |
| 0.0203 | 45.9999 | 189159 | 0.2927 | 0.1154 | 0.0429 |
| 0.0195 | 47.0 | 193272 | 0.3027 | 0.1164 | 0.0433 |
| 0.0188 | 47.9999 | 197384 | 0.2897 | 0.1166 | 0.0433 |
| 0.0176 | 49.0 | 201497 | 0.2972 | 0.1150 | 0.0429 |
| 0.017 | 49.9999 | 205609 | 0.3118 | 0.1120 | 0.0420 |
| 0.0163 | 51.0 | 209722 | 0.3039 | 0.1106 | 0.0413 |
| 0.0153 | 51.9999 | 213834 | 0.3159 | 0.1109 | 0.0417 |
| 0.0144 | 53.0 | 217947 | 0.3212 | 0.1094 | 0.0412 |
| 0.0137 | 53.9999 | 222059 | 0.3149 | 0.1092 | 0.0411 |
| 0.0134 | 55.0 | 226172 | 0.3175 | 0.1092 | 0.0409 |
| 0.0129 | 55.9999 | 230284 | 0.3109 | 0.1090 | 0.0409 |
| 0.0123 | 57.0 | 234397 | 0.3288 | 0.1092 | 0.0411 |
| 0.0117 | 57.9999 | 238509 | 0.3146 | 0.1087 | 0.0409 |
| 0.0112 | 59.0 | 242622 | 0.3229 | 0.1066 | 0.0403 |
| 0.0105 | 59.9999 | 246734 | 0.3361 | 0.1056 | 0.0398 |
| 0.0103 | 61.0 | 250847 | 0.3275 | 0.1048 | 0.0398 |
| 0.0094 | 61.9999 | 254959 | 0.3288 | 0.1045 | 0.0396 |
| 0.009 | 63.0 | 259072 | 0.3374 | 0.1031 | 0.0390 |
| 0.0088 | 63.9999 | 263184 | 0.3377 | 0.1022 | 0.0387 |
| 0.0083 | 65.0 | 267297 | 0.3361 | 0.1028 | 0.0391 |
| 0.008 | 65.9999 | 271409 | 0.3426 | 0.1014 | 0.0385 |
| 0.0075 | 67.0 | 275522 | 0.3559 | 0.1017 | 0.0385 |
| 0.0072 | 67.9999 | 279634 | 0.3510 | 0.1002 | 0.0380 |
| 0.0069 | 69.0 | 283747 | 0.3526 | 0.1001 | 0.0382 |
| 0.0062 | 69.9999 | 287859 | 0.3483 | 0.0992 | 0.0378 |
| 0.0061 | 71.0 | 291972 | 0.3532 | 0.0982 | 0.0376 |
| 0.0056 | 71.9999 | 296084 | 0.3558 | 0.1004 | 0.0382 |
| 0.0053 | 73.0 | 300197 | 0.3585 | 0.0973 | 0.0373 |
| 0.0052 | 73.9999 | 304309 | 0.3628 | 0.0984 | 0.0377 |
| 0.0047 | 75.0 | 308422 | 0.3711 | 0.0965 | 0.0370 |
| 0.0048 | 75.9999 | 312534 | 0.3699 | 0.0993 | 0.0378 |
| 0.0044 | 77.0 | 316647 | 0.3777 | 0.0952 | 0.0367 |
| 0.0038 | 77.9999 | 320759 | 0.3789 | 0.0957 | 0.0369 |
| 0.0037 | 79.0 | 324872 | 0.3963 | 0.0963 | 0.0370 |
| 0.0036 | 79.9999 | 328984 | 0.3818 | 0.0950 | 0.0364 |
| 0.0036 | 81.0 | 333097 | 0.3865 | 0.0949 | 0.0365 |
| 0.003 | 81.9999 | 337209 | 0.4016 | 0.0932 | 0.0361 |
| 0.0029 | 83.0 | 341322 | 0.4033 | 0.0929 | 0.0360 |
| 0.0025 | 83.9999 | 345434 | 0.4045 | 0.0933 | 0.0362 |
| 0.0024 | 85.0 | 349547 | 0.4054 | 0.0927 | 0.0358 |
| 0.0022 | 85.9999 | 353659 | 0.4087 | 0.0919 | 0.0356 |
| 0.0019 | 87.0 | 357772 | 0.4182 | 0.0914 | 0.0353 |
| 0.0018 | 87.9999 | 361884 | 0.4122 | 0.0912 | 0.0353 |
| 0.0016 | 89.0 | 365997 | 0.4184 | 0.0910 | 0.0353 |
| 0.0015 | 89.9999 | 370109 | 0.4297 | 0.0910 | 0.0353 |
| 0.0021 | 91.0 | 374222 | 0.4329 | 0.0893 | 0.0348 |
| 0.0011 | 91.9999 | 378334 | 0.4480 | 0.0900 | 0.0350 |
| 0.0009 | 93.0 | 382447 | 0.4460 | 0.0896 | 0.0351 |
| 0.0009 | 93.9999 | 386559 | 0.4454 | 0.0891 | 0.0348 |
| 0.0007 | 95.0 | 390672 | 0.4578 | 0.0885 | 0.0345 |
| 0.0006 | 95.9999 | 394784 | 0.4609 | 0.0889 | 0.0347 |
| 0.0005 | 97.0 | 398897 | 0.4765 | 0.0883 | 0.0345 |
| 0.0004 | 97.9999 | 403009 | 0.4804 | 0.0879 | 0.0343 |
| 0.0003 | 99.0 | 407122 | 0.4870 | 0.0877 | 0.0341 |
| 0.0002 | 99.9916 | 411200 | 0.4907 | 0.0874 | 0.0340 |
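
WER (word error rate) and CER (character error rate) above are edit-distance error rates over words and characters respectively, so lower is better. A minimal sketch of computing them with the Hugging Face evaluate library; the prediction/reference strings are hypothetical:

```python
import evaluate  # both metrics rely on jiwer being installed

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical prediction/reference pair, for illustration only.
predictions = ["bawo ni o se wa"]
references = ["bawo ni o ṣe wa"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```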

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu121
  • Datasets 3.0.0
  • Tokenizers 0.19.1
Model size: 0.6B params (Safetensors, F16 tensors)
