# test-wav2vec2-darija-marocain-aya-29000
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 54.1290
- Wer: 1.0
- Cer: 1.0
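The card does not yet include a usage example. The sketch below shows one way to run greedy CTC transcription with this checkpoint, assuming the repository bundles a `Wav2Vec2Processor` and the input audio is (or is resampled to) 16 kHz mono; the audio path and the use of `librosa` for loading are placeholders, not part of the released card.

```python
# Minimal transcription sketch (assumptions: the checkpoint ships a Wav2Vec2Processor,
# and the audio is resampled to the 16 kHz rate expected by wav2vec2-xls-r-300m).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "Datasmartly/test-wav2vec2-darija-marocain-aya-29000"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

speech, _ = librosa.load("audio.wav", sr=16_000)  # "audio.wav" is a placeholder path
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```

Given the high error rates reported in this card, transcriptions from this checkpoint should be treated as rough drafts rather than reliable output.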
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3
- mixed_precision_training: Native AMP
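For reference, the sketch below restates the hyperparameters above as a hypothetical `TrainingArguments` configuration (Transformers 4.41.0). The output directory, checkpointing cadence, and evaluation cadence are assumptions inferred from the card; dataset preparation, the CTC data collator, and the `Trainer` wiring are omitted.

```python
# Hypothetical TrainingArguments mirroring the hyperparameter list above;
# not the authors' actual training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="test-wav2vec2-darija-marocain-aya-29000",  # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=3,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="steps",  # the results table reports an eval every 500 steps
    eval_steps=500,
    save_steps=500,               # assumption; the model name references step 29000
    logging_steps=500,
)
```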
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|---|---|---|---|---|---|
| 27.3086 | 0.0364 | 500 | 54.1290 | 1.0 | 1.0 |
| 10.8536 | 0.0728 | 1000 | 25.7240 | 1.0 | 1.0 |
| 5.6544 | 0.1091 | 1500 | 8.7983 | 1.0 | 1.0 |
| 4.0069 | 0.1455 | 2000 | 4.8197 | 1.0 | 1.0 |
| 3.6432 | 0.1819 | 2500 | 4.4943 | 1.0 | 1.0 |
| 3.477 | 0.2183 | 3000 | 4.4012 | 1.0 | 1.0 |
| 3.4495 | 0.2546 | 3500 | 3.9120 | 1.0 | 1.0 |
| 3.3688 | 0.2910 | 4000 | 3.7199 | 1.0 | 1.0 |
| 3.2459 | 0.3274 | 4500 | 3.5998 | 1.0 | 1.0 |
| 3.2401 | 0.3638 | 5000 | 3.5128 | 1.0 | 1.0 |
| 3.4504 | 0.4001 | 5500 | 3.3765 | 1.0 | 1.0 |
| 3.142 | 0.4365 | 6000 | 3.1360 | 1.0 | 1.0 |
| 3.0648 | 0.4729 | 6500 | 2.8708 | 1.0 | 0.9982 |
| 2.8215 | 0.5093 | 7000 | 2.5534 | 1.0 | 0.9350 |
| 2.5345 | 0.5457 | 7500 | 2.2860 | 0.9995 | 0.8187 |
| 2.5151 | 0.5820 | 8000 | 2.0754 | 0.9968 | 0.6671 |
| 2.3124 | 0.6184 | 8500 | 1.8855 | 0.9969 | 0.6009 |
| 2.0244 | 0.6548 | 9000 | 1.7425 | 0.9939 | 0.5501 |
| 2.1518 | 0.6912 | 9500 | 1.6193 | 0.9880 | 0.5155 |
| 2.1723 | 0.7275 | 10000 | 1.5097 | 0.9844 | 0.4980 |
| 1.9636 | 0.7639 | 10500 | 1.4213 | 0.9727 | 0.4651 |
| 1.7568 | 0.8003 | 11000 | 1.3623 | 0.9549 | 0.4281 |
| 1.7105 | 0.8367 | 11500 | 1.3058 | 0.9296 | 0.4012 |
| 1.739 | 0.8730 | 12000 | 1.2470 | 0.9163 | 0.3894 |
| 1.6762 | 0.9094 | 12500 | 1.2088 | 0.9095 | 0.3864 |
| 1.5635 | 0.9458 | 13000 | 1.1740 | 0.9033 | 0.3727 |
| 1.3595 | 0.9822 | 13500 | 1.1522 | 0.8795 | 0.3645 |
| 1.8905 | 1.0186 | 14000 | 1.1483 | 0.8725 | 0.3605 |
| 1.5877 | 1.0549 | 14500 | 1.0925 | 0.8755 | 0.3568 |
| 1.8459 | 1.0913 | 15000 | 1.0628 | 0.8630 | 0.3521 |
| 1.4473 | 1.1277 | 15500 | 1.1005 | 0.8560 | 0.3515 |
| 1.7997 | 1.1641 | 16000 | 1.0273 | 0.8666 | 0.3528 |
| 1.5638 | 1.2004 | 16500 | 1.0178 | 0.8514 | 0.3440 |
| 1.6806 | 1.2368 | 17000 | 1.0114 | 0.8373 | 0.3387 |
| 1.5214 | 1.2732 | 17500 | 1.0049 | 0.8307 | 0.3361 |
| 1.409 | 1.3096 | 18000 | 0.9952 | 0.8325 | 0.3355 |
| 1.4618 | 1.3459 | 18500 | 0.9835 | 0.8305 | 0.3318 |
| 1.5613 | 1.3823 | 19000 | 0.9587 | 0.8306 | 0.3355 |
| 1.8607 | 1.4187 | 19500 | 0.9556 | 0.8257 | 0.3328 |
| 1.8506 | 1.4551 | 20000 | 0.9594 | 0.8187 | 0.3279 |
| 1.4933 | 1.4915 | 20500 | 0.9329 | 0.8134 | 0.3262 |
| 1.7531 | 1.5278 | 21000 | 0.9182 | 0.8087 | 0.3232 |
| 1.2786 | 1.5642 | 21500 | 0.9241 | 0.8100 | 0.3218 |
| 1.5976 | 1.6006 | 22000 | 0.9095 | 0.8075 | 0.3213 |
| 1.5346 | 1.6370 | 22500 | 0.9155 | 0.8010 | 0.3188 |
| 1.4204 | 1.6733 | 23000 | 0.9010 | 0.8070 | 0.3185 |
| 1.509 | 1.7097 | 23500 | 0.8933 | 0.8010 | 0.3168 |
| 1.1191 | 1.7461 | 24000 | 0.8943 | 0.7944 | 0.3156 |
| 1.7361 | 1.7825 | 24500 | 0.8802 | 0.8008 | 0.3178 |
| 1.3272 | 1.8188 | 25000 | 0.8761 | 0.7988 | 0.3156 |
| 1.4057 | 1.8552 | 25500 | 0.8705 | 0.7906 | 0.3130 |
| 1.5288 | 1.8916 | 26000 | 0.8595 | 0.7869 | 0.3116 |
| 1.6068 | 1.9280 | 26500 | 0.8633 | 0.7906 | 0.3118 |
| 1.9591 | 1.9644 | 27000 | 0.8562 | 0.7847 | 0.3112 |
| 1.6623 | 2.0007 | 27500 | 0.8536 | 0.7818 | 0.3094 |
| 2.2612 | 2.0371 | 28000 | 0.8462 | 0.7820 | 0.3074 |
| 1.0876 | 2.0735 | 28500 | 0.8511 | 0.7757 | 0.3068 |
| 1.3043 | 2.1099 | 29000 | 0.8464 | 0.7847 | 0.3102 |
| 1.1624 | 2.1462 | 29500 | 0.8520 | 0.7746 | 0.3064 |
| 2.6973 | 2.1826 | 30000 | 0.8365 | 0.7752 | 0.3069 |
| 1.3618 | 2.2190 | 30500 | 0.8316 | 0.7729 | 0.3053 |
| 1.384 | 2.2554 | 31000 | 0.8355 | 0.7670 | 0.3029 |
| 1.4903 | 2.2917 | 31500 | 0.8334 | 0.7691 | 0.3035 |
| 1.6345 | 2.3281 | 32000 | 0.8350 | 0.7715 | 0.3040 |
| 1.0043 | 2.3645 | 32500 | 0.8278 | 0.7659 | 0.3021 |
| 1.3774 | 2.4009 | 33000 | 0.8278 | 0.7569 | 0.2989 |
| 1.2622 | 2.4372 | 33500 | 0.8215 | 0.7606 | 0.3004 |
| 1.4779 | 2.4736 | 34000 | 0.8180 | 0.7602 | 0.2988 |
| 1.2156 | 2.5100 | 34500 | 0.8157 | 0.7583 | 0.2985 |
| 1.2055 | 2.5464 | 35000 | 0.8189 | 0.7565 | 0.2975 |
| 0.9888 | 2.5828 | 35500 | 0.8204 | 0.7502 | 0.2957 |
| 1.2773 | 2.6191 | 36000 | 0.8126 | 0.7545 | 0.2968 |
| 1.5654 | 2.6555 | 36500 | 0.8133 | 0.7544 | 0.2972 |
| 1.1398 | 2.6919 | 37000 | 0.8137 | 0.7520 | 0.2959 |
| 1.1038 | 2.7283 | 37500 | 0.8075 | 0.7579 | 0.2969 |
| 1.3804 | 2.7646 | 38000 | 0.8101 | 0.7542 | 0.2958 |
| 1.281 | 2.8010 | 38500 | 0.8084 | 0.7536 | 0.2958 |
| 1.2316 | 2.8374 | 39000 | 0.8087 | 0.7551 | 0.2959 |
| 1.5158 | 2.8738 | 39500 | 0.8071 | 0.7555 | 0.2963 |
| 1.0682 | 2.9101 | 40000 | 0.8074 | 0.7542 | 0.2954 |
| 1.5609 | 2.9465 | 40500 | 0.8055 | 0.7539 | 0.2953 |
| 1.193 | 2.9829 | 41000 | 0.8058 | 0.7539 | 0.2956 |
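The Wer and Cer columns are word and character error rates on the validation set (lower is better). The snippet below illustrates how such metrics are typically computed with the `evaluate` package; that package is not listed under the framework versions, so treating it as the authors' tooling is an assumption, and the strings are placeholders rather than dataset samples.

```python
# Illustrative WER/CER computation; the `evaluate` package and the example strings
# are assumptions for demonstration only (both metrics require `jiwer`).
import evaluate

wer_metric = evaluate.load("wer")  # word error rate
cer_metric = evaluate.load("cer")  # character error rate

references = ["this is a placeholder transcript"]
predictions = ["this is a placeholder transcripts"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```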
### Framework versions
- Transformers 4.41.0
- Pytorch 2.4.1+cu124
- Datasets 2.18.0
- Tokenizers 0.19.1