phmartins committed on
Commit
a43dcb9
1 Parent(s): 43d767a

Update README.md

Files changed (1)
  1. README.md +4 -11
README.md CHANGED
@@ -107,7 +107,7 @@ The results show that EuroLLM-1.7B is substantially better than Gemma-2B in Machine Translation
  #### Flores-200
  | Model | AVG | AVG en-xx | AVG xx-en | en-ar | en-bg | en-ca | en-cs | en-da | en-de | en-el | en-es-latam | en-et | en-fi | en-fr | en-ga | en-gl | en-hi | en-hr | en-hu | en-it | en-ja | en-ko | en-lt | en-lv | en-mt | en-nl | en-no | en-pl | en-pt-br | en-ro | en-ru | en-sk | en-sl | en-sv | en-tr | en-uk | en-zh-cn | ar-en | bg-en | ca-en | cs-en | da-en | de-en | el-en | es-latam-en | et-en | fi-en | fr-en | ga-en | gl-en | hi-en | hr-en | hu-en | it-en | ja-en | ko-en | lt-en | lv-en | mt-en | nl-en | no-en | pl-en | pt-br-en | ro-en | ru-en | sk-en | sl-en | sv-en | tr-en | uk-en | zh-cn-en |
  |--------------------------------|------|-----------|-----------|-------|-------|-------|-------|-------|-------|-------|--------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|----------|-------|-------|-------|-------|-------|-------|-------|----------|-------|-------|-------|-------|-------|-------|-------|--------------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|----------|-------|-------|-------|-------|-------|-------|-------|----------|
- | EuroLLM-1.7B-Instruct | 86.10| 85.53 | 86.67 | 83.87 | 88.36 | 84.42 | 88.34 | 88.77 | 86.63 | 86.71 | 85.99 | 86.98 | 87.13 | 87.21 | 72.25 | 85.97 | 74.78 | 82.96 | 85.51 | 87.77 | 89.26 | 86.27 | 86.31 | 86.22 | 67.38 | 86.95 | 88.68 | 87.38 | 89.13 | 88.39 | 87.47 | 87.51 | 85.32 | 89.20 | 86.24 | 86.33 | 86.17 | 85.80 | 87.20 | 87.53 | 87.53 | 89.26 | 88.71 | 86.49 | 86.55 | 87.60 | 88.17 | 88.90 | 79.89 | 87.59 | 87.53 | 86.10 | 86.34 | 87.54 | 86.25 | 86.08 | 85.03 | 85.60 | 78.16 | 86.80 | 89.96 | 85.24 | 88.85 | 88.42 | 85.86 | 87.17 | 86.36 | 89.48 | 86.76 | 86.06 | 85.88 |
+ | EuroLLM-1.7B-Instruct | 86.75| 86.49| 87.01| 85.00| 89.37| 84.65| 89.23| 89.54| 87.00| 87.68| 86.43| 88.66| 89.18| 87.65| 74.66| 86.68| 76.94| 84.86| 86.43| 88.19| 89.45| 87.33| 87.73| 87.74| 67.66| 87.16| 90.08| 88.10| 89.39| 89.39| 88.11| 88.13| 87.08| 89.67| 87.21| 87.76| 86.45| 86.15| 87.46| 87.49| 87.97| 89.65| 88.80| 86.88| 86.70| 88.19| 88.66| 88.93| 81.95| 87.38| 87.97| 86.69| 87.15| 87.69| 86.78| 86.76| 85.39| 86.23| 76.93| 87.02| 90.24| 85.54| 89.22| 88.81| 86.02| 87.24| 86.81| 89.55| 87.76| 86.37| 86.00|
  | Gemma-2B-EuroBlocks | 81.56| 78.93 | 84.18 | 75.25 | 82.46 | 83.17 | 82.17 | 84.40 | 83.20 | 79.63 | 84.15 | 72.63 | 81.00 | 85.12 | 38.79 | 82.00 | 67.00 | 81.18 | 78.24 | 84.80 | 87.08 | 82.04 | 73.02 | 68.41 | 56.67 | 83.30 | 86.69 | 83.07 | 86.82 | 84.00 | 84.55 | 77.93 | 76.19 | 80.77 | 79.76 | 84.19 | 84.10 | 83.67 | 85.73 | 86.89 | 86.38 | 88.39 | 88.11 | 84.68 | 86.11 | 83.45 | 86.45 | 88.22 | 50.88 | 86.44 | 85.87 | 85.33 | 85.16 | 86.75 | 85.62 | 85.00 | 81.55 | 81.45 | 67.90 | 85.95 | 89.05 | 84.18 | 88.27 | 87.38 | 85.13 | 85.22 | 83.86 | 87.83 | 84.96 | 85.15 | 85.10 |
  | Gemma-7B-EuroBlocks | 86.16| 85.49 | 86.82 | 83.39 | 88.32 | 85.82 | 88.88 | 89.01 | 86.96 | 86.62 | 86.31 | 84.42 | 88.11 | 87.46 | 61.85 | 86.10 | 77.91 | 87.01 | 85.81 | 87.57 | 89.88 | 87.24 | 84.47 | 83.15 | 67.13 | 86.50 | 90.44 | 87.57 | 89.22 | 89.13 | 88.58 | 86.73 | 84.68 | 88.16 | 86.87 | 88.40 | 87.11 | 86.65 | 87.25 | 88.17 | 87.47 | 89.59 | 88.44 | 86.76 | 86.66 | 87.55 | 88.88 | 88.86 | 73.46 | 87.63 | 88.43 | 87.12 | 87.31 | 87.49 | 87.20 | 87.15 | 85.16 | 85.96 | 78.39 | 86.73 | 90.52 | 85.38 | 89.17 | 88.75 | 86.35 | 86.82 | 86.21 | 89.39 | 88.20 | 86.45 | 86.28 |

@@ -115,7 +115,7 @@ The results show that EuroLLM-1.7B is substantially better than Gemma-2B in Machine Translation
  #### WMT-23
  | Model | AVG | AVG en-xx | AVG xx-en | AVG xx-xx | en-de | en-cs | en-uk | en-ru | en-zh-cn | de-en | uk-en | ru-en | zh-cn-en | cs-uk |
  |--------------------------------|------|-----------|-----------|-----------|-------|-------|-------|-------|----------|-------|-------|-------|----------|-------|
- | EuroLLM-1.7B-Instruct | 82.56| 82.30 | 82.07 | 85.81 | 80.99 | 84.42 | 80.74 | 81.94 | 83.42 | 83.74 | 85.06 | 81.00 | 78.49 | 85.81 |
+ | EuroLLM-1.7B-Instruct | 83.13 | 82.91 | 82.48 | 86.87| 81.33| 85.42| 81.61| 82.57| 83.62| 84.24| 85.36| 81.56| 78.76| 86.87 |
  | Gemma-2B-EuroBlocks | 79.86| 78.35 | 81.32 | 81.56 | 76.54 | 76.35 | 77.62 | 78.88 | 82.36 | 82.85 | 83.83 | 80.17 | 78.42 | 81.56 |
  | Gemma-7B-EuroBlocks | 83.90| 83.70 | 83.21 | 87.61 | 82.15 | 84.68 | 83.05 | 83.85 | 84.79 | 84.40 | 85.86 | 82.55 | 80.01 | 87.61 |

@@ -123,12 +123,12 @@ The results show that EuroLLM-1.7B is substantially better than Gemma-2B in Machine Translation
  #### WMT-24
  | Model | AVG | AVG en-xx | AVG xx-xx | en-de | en-es-latam | en-cs | en-ru | en-uk | en-ja | en-zh-cn | en-hi | cs-uk | ja-zh-cn |
  |---------|------|------|-------|-------|-------|-------|-------|--------|--------|-------|-------|-------|-----|
- | EuroLLM-1.7B-Instruct| 78.45|78.65|77.67|79.05|80.93|80.33|78.05|78.72|81.87|80.15|70.10|82.65|72.69|
+ | EuroLLM-1.7B-Instruct|79.35|79.45|78.96|79.20|81.17|80.82|79.00|80.54|82.39|80.80|71.69|83.16|74.76|
  |Gemma-2B-EuroBlocks | 74.71|74.25|76.57|75.21|78.84|70.40|74.44|75.55|78.32|78.70|62.51|79.97|73.17|
  |Gemma-7B-EuroBlocks | 80.88|80.45|82.60|80.43|81.91|80.14|80.32|82.17|84.08|81.86|72.71|85.55|79.65|

  ### General Benchmarks
- We also compare EuroLLM-1.7B with [TinyLlama-1.1-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T) and [Gemma-2B](https://huggingface.co/google/gemma-2b) on 3 general benchmarks: Arc Challenge, Hellaswag, and MMLU.
+ We also compare EuroLLM-1.7B with [TinyLlama-1.1-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T) and [Gemma-2B](https://huggingface.co/google/gemma-2b) on 2 general benchmarks: Arc Challenge and Hellaswag.
  For the non-English languages, we use the [Okapi](https://aclanthology.org/2023.emnlp-demo.28.pdf) datasets.
  Results show that EuroLLM-1.7B is superior to TinyLlama-1.1-3T and similar to Gemma-2B on Hellaswag but worse on Arc Challenge and MMLU. This may be due to EuroLLM-1.7B's lower parameter count (1.133B non-embedding parameters against 1.981B).

@@ -145,10 +145,3 @@ Results show that EuroLLM-1.7B is superior to TinyLlama-1.1-3T and similar to Gemma-2B
  | EuroLLM-1.7B | 0.4653 | 0.6199 | 0.4653 | 0.5187 | 0.5173 | 0.5024 | 0.5116 | 0.4582 | 0.4821 | 0.3939 | 0.4722 | 0.3505 | 0.3970 | 0.4441 | 0.4224 | 0.4556 | 0.4329 |
  | TinyLlama-1.1-3T | 0.3710 | 0.6027 | 0.3652 | 0.4136 | 0.4104 | 0.3780 | 0.4008 | 0.3544 | 0.3637 | 0.2981 | 0.3569 | 0.2904 | 0.3147 | 0.3337 | 0.3440 | 0.3464 | 0.3628 |
  | Gemma-2B | 0.4666 | 0.7165 | 0.4756 | 0.5414 | 0.5180 | 0.4841 | 0.5081 | 0.4664 | 0.4655 | 0.3868 | 0.4383 | 0.3413 | 0.3710 | 0.4316 | 0.4291 | 0.4471 | 0.4448 |
-
- #### MMLU
- | Model | Average | English | German | Spanish | French | Italian | Portuguese | Chinese | Russian | Dutch | Arabic | Swedish | Hindi | Hungarian | Romanian | Ukrainian | Danish | Catalan |
- |--------------------|---------|---------|--------|---------|--------|---------|------------|---------|---------|--------|--------|---------|--------|-----------|----------|-----------|--------|---------|
- | EuroLLM-1.7B | 0.2631 | 0.2553 | 0.2626 | 0.2653 | 0.2589 | 0.2628 | 0.2634 | 0.2546 | 0.2626 | 0.2677 | 0.2608 | 0.2656 | 0.2690 | 0.2551 | 0.2677 | 0.2655 | 0.2675 | 0.2689 |
- | TinyLlama-1.1-3T | 0.2546 | 0.2604 | 0.2498 | 0.2528 | 0.2535 | 0.2531 | 0.2511 | 0.2629 | 0.2541 | 0.2521 | 0.2591 | 0.2528 | 0.2550 | 0.2566 | 0.2548 | 0.2651 | 0.2419 | 0.2528 |
- | Gemma-2B | 0.3356 | 0.4168 | 0.3519 | 0.3475 | 0.3463 | 0.3433 | 0.3383 | 0.3345 | 0.3261 | 0.3429 | 0.3158 | 0.3318 | 0.2842 | 0.3185 | 0.3243 | 0.3152 | 0.3377 | 0.3307 |
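The AVG columns in the translation tables appear to be plain unweighted means over the per-pair scores. A minimal sketch checking this against the Gemma-7B-EuroBlocks WMT-24 row (the split into en-xx and xx-xx groups follows the table's column order; everything else here is illustrative):

```python
from statistics import mean

# Per-pair scores for Gemma-7B-EuroBlocks on WMT-24, read off the table.
en_xx = [80.43, 81.91, 80.14, 80.32, 82.17, 84.08, 81.86, 72.71]  # en->xx pairs
xx_xx = [85.55, 79.65]                                            # cs-uk, ja-zh-cn

avg_en_xx = mean(en_xx)          # ~80.45, matching the "AVG en-xx" column
avg_xx_xx = mean(xx_xx)          # ~82.60, matching the "AVG xx-xx" column
avg_all   = mean(en_xx + xx_xx)  # ~80.88, matching the overall "AVG" column
```

Since all three computed means match the table to two decimals, the headline averages can be reproduced from the per-pair columns alone.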