# xtreme_s_xlsr_300m_fleurs_langid
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the GOOGLE/XTREME_S - FLEURS.ALL dataset. It achieves the following results on the evaluation set:
- Accuracy: 0.7271
- Accuracy Af Za: 0.3865
- Accuracy Am Et: 0.8818
- Accuracy Ar Eg: 0.9977
- Accuracy As In: 0.9858
- Accuracy Ast Es: 0.8362
- Accuracy Az Az: 0.8386
- Accuracy Be By: 0.4085
- Accuracy Bn In: 0.9989
- Accuracy Bs Ba: 0.2508
- Accuracy Ca Es: 0.6947
- Accuracy Ceb Ph: 0.9852
- Accuracy Cmn Hans Cn: 0.9799
- Accuracy Cs Cz: 0.5353
- Accuracy Cy Gb: 0.9716
- Accuracy Da Dk: 0.6688
- Accuracy De De: 0.7807
- Accuracy El Gr: 0.7692
- Accuracy En Us: 0.9815
- Accuracy Es 419: 0.9846
- Accuracy Et Ee: 0.5230
- Accuracy Fa Ir: 0.8462
- Accuracy Ff Sn: 0.2348
- Accuracy Fi Fi: 0.9978
- Accuracy Fil Ph: 0.9564
- Accuracy Fr Fr: 0.9852
- Accuracy Ga Ie: 0.8468
- Accuracy Gl Es: 0.5016
- Accuracy Gu In: 0.973
- Accuracy Ha Ng: 0.9163
- Accuracy He Il: 0.8043
- Accuracy Hi In: 0.9354
- Accuracy Hr Hr: 0.3654
- Accuracy Hu Hu: 0.8044
- Accuracy Hy Am: 0.9914
- Accuracy Id Id: 0.9869
- Accuracy Ig Ng: 0.9360
- Accuracy Is Is: 0.0217
- Accuracy It It: 0.8
- Accuracy Ja Jp: 0.7385
- Accuracy Jv Id: 0.5824
- Accuracy Ka Ge: 0.8611
- Accuracy Kam Ke: 0.4184
- Accuracy Kea Cv: 0.8692
- Accuracy Kk Kz: 0.8727
- Accuracy Km Kh: 0.7030
- Accuracy Kn In: 0.9630
- Accuracy Ko Kr: 0.9843
- Accuracy Ku Arab Iq: 0.9577
- Accuracy Ky Kg: 0.8936
- Accuracy Lb Lu: 0.8897
- Accuracy Lg Ug: 0.9253
- Accuracy Ln Cd: 0.9644
- Accuracy Lo La: 0.1580
- Accuracy Lt Lt: 0.4686
- Accuracy Luo Ke: 0.9922
- Accuracy Lv Lv: 0.6498
- Accuracy Mi Nz: 0.9613
- Accuracy Mk Mk: 0.7636
- Accuracy Ml In: 0.6962
- Accuracy Mn Mn: 0.8462
- Accuracy Mr In: 0.3911
- Accuracy Ms My: 0.3632
- Accuracy Mt Mt: 0.6188
- Accuracy My Mm: 0.9705
- Accuracy Nb No: 0.6891
- Accuracy Ne Np: 0.8994
- Accuracy Nl Nl: 0.9093
- Accuracy Nso Za: 0.8873
- Accuracy Ny Mw: 0.4691
- Accuracy Oci Fr: 0.1533
- Accuracy Om Et: 0.9512
- Accuracy Or In: 0.5447
- Accuracy Pa In: 0.8153
- Accuracy Pl Pl: 0.7757
- Accuracy Ps Af: 0.8105
- Accuracy Pt Br: 0.7715
- Accuracy Ro Ro: 0.4122
- Accuracy Ru Ru: 0.9794
- Accuracy Rup Bg: 0.9468
- Accuracy Sd Arab In: 0.5245
- Accuracy Sk Sk: 0.8624
- Accuracy Sl Si: 0.0300
- Accuracy Sn Zw: 0.8843
- Accuracy So So: 0.8803
- Accuracy Sr Rs: 0.0257
- Accuracy Sv Se: 0.0145
- Accuracy Sw Ke: 0.9199
- Accuracy Ta In: 0.9526
- Accuracy Te In: 0.9788
- Accuracy Tg Tj: 0.9883
- Accuracy Th Th: 0.9912
- Accuracy Tr Tr: 0.7887
- Accuracy Uk Ua: 0.0627
- Accuracy Umb Ao: 0.7863
- Accuracy Ur Pk: 0.0134
- Accuracy Uz Uz: 0.4014
- Accuracy Vi Vn: 0.7246
- Accuracy Wo Sn: 0.4555
- Accuracy Xh Za: 1.0
- Accuracy Yo Ng: 0.7353
- Accuracy Yue Hant Hk: 0.7985
- Accuracy Zu Za: 0.4696
- Loss: 1.3789
- Loss Af Za: 2.6778
- Loss Am Et: 0.4615
- Loss Ar Eg: 0.0149
- Loss As In: 0.0764
- Loss Ast Es: 0.4560
- Loss Az Az: 0.5677
- Loss Be By: 1.9231
- Loss Bn In: 0.0024
- Loss Bs Ba: 2.4954
- Loss Ca Es: 1.2632
- Loss Ceb Ph: 0.0426
- Loss Cmn Hans Cn: 0.0650
- Loss Cs Cz: 1.9334
- Loss Cy Gb: 0.1274
- Loss Da Dk: 1.4990
- Loss De De: 0.8820
- Loss El Gr: 0.9839
- Loss En Us: 0.0827
- Loss Es 419: 0.0516
- Loss Et Ee: 1.9264
- Loss Fa Ir: 0.6520
- Loss Ff Sn: 5.4283
- Loss Fi Fi: 0.0109
- Loss Fil Ph: 0.1706
- Loss Fr Fr: 0.0591
- Loss Ga Ie: 0.5174
- Loss Gl Es: 1.2657
- Loss Gu In: 0.0850
- Loss Ha Ng: 0.3234
- Loss He Il: 0.8299
- Loss Hi In: 0.4190
- Loss Hr Hr: 2.9754
- Loss Hu Hu: 0.8345
- Loss Hy Am: 0.0329
- Loss Id Id: 0.0529
- Loss Ig Ng: 0.2523
- Loss Is Is: 6.5153
- Loss It It: 0.8113
- Loss Ja Jp: 1.3968
- Loss Jv Id: 2.0009
- Loss Ka Ge: 0.6162
- Loss Kam Ke: 2.2192
- Loss Kea Cv: 0.5567
- Loss Kk Kz: 0.5592
- Loss Km Kh: 1.7358
- Loss Kn In: 0.1063
- Loss Ko Kr: 0.1519
- Loss Ku Arab Iq: 0.2075
- Loss Ky Kg: 0.4639
- Loss Lb Lu: 0.4454
- Loss Lg Ug: 0.3764
- Loss Ln Cd: 0.1844
- Loss Lo La: 3.8051
- Loss Lt Lt: 2.5054
- Loss Luo Ke: 0.0479
- Loss Lv Lv: 1.3713
- Loss Mi Nz: 0.1390
- Loss Mk Mk: 0.7952
- Loss Ml In: 1.2999
- Loss Mn Mn: 0.7621
- Loss Mr In: 3.7056
- Loss Ms My: 3.0192
- Loss Mt Mt: 1.5520
- Loss My Mm: 0.1514
- Loss Nb No: 1.1194
- Loss Ne Np: 0.4231
- Loss Nl Nl: 0.3291
- Loss Nso Za: 0.5106
- Loss Ny Mw: 2.7346
- Loss Oci Fr: 5.0983
- Loss Om Et: 0.2297
- Loss Or In: 2.5432
- Loss Pa In: 0.7753
- Loss Pl Pl: 0.7309
- Loss Ps Af: 1.0454
- Loss Pt Br: 0.9782
- Loss Ro Ro: 3.5829
- Loss Ru Ru: 0.0598
- Loss Rup Bg: 0.1695
- Loss Sd Arab In: 2.6198
- Loss Sk Sk: 0.5583
- Loss Sl Si: 6.0923
- Loss Sn Zw: 0.4465
- Loss So So: 0.4492
- Loss Sr Rs: 4.7575
- Loss Sv Se: 6.5858
- Loss Sw Ke: 0.4235
- Loss Ta In: 0.1818
- Loss Te In: 0.0808
- Loss Tg Tj: 0.0912
- Loss Th Th: 0.0462
- Loss Tr Tr: 0.7340
- Loss Uk Ua: 4.6777
- Loss Umb Ao: 1.4021
- Loss Ur Pk: 8.4067
- Loss Uz Uz: 4.3297
- Loss Vi Vn: 1.1304
- Loss Wo Sn: 2.2281
- Loss Xh Za: 0.0009
- Loss Yo Ng: 1.3345
- Loss Yue Hant Hk: 1.0728
- Loss Zu Za: 3.7279
- Predict Samples: 77960
## Model description
More information needed
## Intended uses & limitations
More information needed
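For reference, the checkpoint can be loaded with the standard audio-classification classes from Hugging Face Transformers. The sketch below is illustrative, not the authors' script: the model id and the audio file name are placeholders, and it assumes 16 kHz mono speech input as expected by XLS-R.

```python
import torch
import soundfile as sf
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

# Placeholder id -- substitute the actual Hub repo id or a local checkpoint path.
model_id = "xtreme_s_xlsr_300m_fleurs_langid"

feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# XLS-R expects 16 kHz mono speech; "sample.wav" is a hypothetical input file.
speech, sampling_rate = sf.read("sample.wav")
inputs = feature_extractor(speech, sampling_rate=sampling_rate, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The highest-scoring class id maps to a FLEURS language code via the config.
predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```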
## Training and evaluation data
More information needed
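The card names the GOOGLE/XTREME_S FLEURS.ALL configuration. A minimal sketch of loading it with the `datasets` library is below; the `fleurs.all` configuration id and the `lang_id` column follow the `google/xtreme_s` loading script, so treat both as assumptions to verify against the dataset card.

```python
from datasets import load_dataset

# "fleurs.all" is the XTREME-S language-identification configuration that
# pools all FLEURS languages (assumption -- check the google/xtreme_s card).
fleurs = load_dataset("google/xtreme_s", "fleurs.all", split="validation")

sample = fleurs[0]
waveform = sample["audio"]["array"]  # raw 16 kHz waveform
label = sample["lang_id"]            # integer language label (assumed column name)
```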
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 64
- total_eval_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 5.0
- mixed_precision_training: Native AMP
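
As a rough illustration, these settings map onto a `TrainingArguments` configuration along the following lines. The original training script is not reproduced in this card, so `output_dir` is a placeholder; the Adam betas and epsilon listed above are the library defaults and need not be set explicitly.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above. With 8 GPUs, per-device batch
# sizes of 8 (train) and 1 (eval) yield the reported totals of 64 and 8.
training_args = TrainingArguments(
    output_dir="./xtreme_s_xlsr_300m_fleurs_langid",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=5.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```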
### Training results
| Training Loss | Epoch | Step  | Accuracy | Validation Loss |
|:-------------:|:-----:|:-----:|:--------:|:---------------:|
| 0.5296        | 0.26  | 1000  | 0.4016   | 2.6633          |
| 0.4252        | 0.52  | 2000  | 0.5751   | 1.8582          |
| 0.2989        | 0.78  | 3000  | 0.6332   | 1.6780          |
| 0.3563        | 1.04  | 4000  | 0.6799   | 1.4479          |
| 0.1617        | 1.3   | 5000  | 0.6679   | 1.5066          |
| 0.1409        | 1.56  | 6000  | 0.6992   | 1.4082          |
| 0.01          | 1.82  | 7000  | 0.7071   | 1.2448          |
| 0.0018        | 2.08  | 8000  | 0.7148   | 1.1996          |
| 0.0014        | 2.34  | 9000  | 0.6410   | 1.6505          |
| 0.0188        | 2.6   | 10000 | 0.6840   | 1.4050          |
| 0.0007        | 2.86  | 11000 | 0.6621   | 1.5831          |
| 0.1038        | 3.12  | 12000 | 0.6829   | 1.5441          |
| 0.0003        | 3.38  | 13000 | 0.6900   | 1.3483          |
| 0.0004        | 3.64  | 14000 | 0.6414   | 1.7070          |
| 0.0003        | 3.9   | 15000 | 0.7075   | 1.3198          |
| 0.0002        | 4.16  | 16000 | 0.7105   | 1.3118          |
| 0.0001        | 4.42  | 17000 | 0.7029   | 1.4099          |
| 0.0           | 4.68  | 18000 | 0.7180   | 1.3658          |
| 0.0001        | 4.93  | 19000 | 0.7236   | 1.3514          |
### Framework versions
- Transformers 4.18.0.dev0
- Pytorch 1.10.1+cu111
- Datasets 1.18.4.dev0
- Tokenizers 0.11.6