DeBERTa committed on
Commit
4d9496a
1 Parent(s): 4d6e70c

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -25,10 +25,10 @@ The mDeBERTa V3 base model comes with 12 layers and a hidden size of 768. Its to
 
 We present the dev results on XNLI with zero-shot crosslingual transfer setting, i.e. training with english data only, test with other languages.
 
-| Model | en | fr| es | de | el | bg | ru |tr |ar |vi | th | zh | hi | sw | ur | avg |
-|--------------|----|----|---- |-- |-- |-- | -- |-- |-- |-- | -- | -- | -- | -- | -- | ----|
-| XLM-R-base |85.8|79.7|80.7 |78.7 |77.5 |79.6 |78.1 |74.2 |73.8 |76.5 |74.6 |76.7| 72.4| 66.5| 68.3|75.6 |
-| mDeBERTa-base|88.2|82.6|84.4 |82.7 |82.3 |82.4 |80.8 |79.5 |78.5 |78.1 |76.4 |79.5| 75.9| 73.9| 72.4|**79.8**+/-0.2|
+| Model |avg | en | fr| es | de | el | bg | ru |tr |ar |vi | th | zh | hi | sw | ur |
+|--------------| ----|----|----|---- |-- |-- |-- | -- |-- |-- |-- | -- | -- | -- | -- | -- |
+| XLM-R-base |75.6 |85.8|79.7|80.7 |78.7 |77.5 |79.6 |78.1 |74.2 |73.8 |76.5 |74.6 |76.7| 72.4| 66.5| 68.3|
+| mDeBERTa-base|**79.8**+/-0.2|88.2|82.6|84.4 |82.7 |82.3 |82.4 |80.8 |79.5 |78.5 |78.1 |76.4 |79.5| 75.9| 73.9| 72.4|
 
 #### Fine-tuning with HF transformers
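The commit above does one mechanical thing: it moves the `avg` column of the XNLI results table from the last position to the first data column. A minimal sketch of that reordering on a single markdown table row (the helper name `move_avg_first` is hypothetical, introduced here only for illustration):

```python
def move_avg_first(row: str) -> str:
    """Reorder a markdown table row so the trailing 'avg' cell
    comes right after the model-name cell, as in the diff above."""
    # Drop the outer pipes, then split into cells and trim whitespace.
    cells = [c.strip() for c in row.strip().strip("|").split("|")]
    # cells[0] is the model name, cells[-1] is the 'avg' value;
    # put 'avg' second and keep the per-language scores in order.
    return "| " + " | ".join([cells[0], cells[-1]] + cells[1:-1]) + " |"

old = ("| XLM-R-base |85.8|79.7|80.7 |78.7 |77.5 |79.6 |78.1 |74.2 "
       "|73.8 |76.5 |74.6 |76.7| 72.4| 66.5| 68.3|75.6 |")
print(move_avg_first(old))
# The avg (75.6) now appears directly after the model name.
```

Applying the same transform to the header, separator, and both data rows yields exactly the `+` lines of the diff (modulo cell padding).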