Add Hermes3, Llama3.1 70B (#6)
- Add Hermes3, Llama3.1 70B (3c02b900a5bad747e422f01b9a3d0fca3e267c00)
Co-authored-by: Sinisa Stanivuk <Stopwolf@users.noreply.huggingface.co>
README.md CHANGED
@@ -75,20 +75,23 @@ accelerate launch lighteval/run_evals_accelerate.py \
 | Model |Size|Accuracy| |Stderr|
 |-------|---:|-------:|--|-----:|
 |GPT-4-0125-preview|_???_|0.9199|±|0.002|
-|GPT-4o-2024-05-13|
-|GPT-3.5-turbo-0125|
+|GPT-4o-2024-05-13|_???_|0.9196|±|0.0017|
+|GPT-3.5-turbo-0125|_???_|0.8245|±|0.0016|
+|[Llama3.1-70B-Instruct \[4bit\]](https://huggingface.co/unsloth/Meta-Llama-3.1-70B-bnb-4bit)|70B|0.8185|±| 0.0122|
 |GPT-4o-mini-2024-07-18|_???_|0.7971|±|0.0005|
 |[Mustra-7B-Instruct-v0.2](https://huggingface.co/Stopwolf/Mustra-7B-Instruct-v0.2)|7B|0.7388|± |0.0098|
 |[Tito-7B-slerp](https://huggingface.co/Stopwolf/Tito-7B-slerp)|7B|0.7099|±|0.0101|
 |[Yugo55A-GPT](https://huggingface.co/datatab/Yugo55A-GPT)|7B|0.6889|± |0.0103|
 |[Zamfir-7B-slerp](https://huggingface.co/Stopwolf/Zamfir-7B-slerp)|7B|0.6849|± |0.0104|
 |[Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407)|12.2B|0.6839|± |0.0104|
+|[Llama-3.1-SauerkrautLM-8b-Instruct](https://huggingface.co/VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct)|8B|0.679|±|0.0147|
 [Qwen2-7B-instruct](https://huggingface.co/Qwen/Qwen2-7B-Instruct)|7B|0.6730|±|0.0105|
 |[Llama-3-SauerkrautLM-8b-Instruct](https://huggingface.co/VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct)|8B|0.661|± |0.0106|
 |[Yugo60-GPT](https://huggingface.co/datatab/Yugo60-GPT)|7B|0.6411|±|0.0107|
 |[DeepSeek-V2-Lite-Chat](https://huggingface.co/deepseek-ai/DeepSeek-V2-Lite-Chat)|15.7B|0.6047|±|0.0109|
 |[Llama3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct)|8B|0.5972|±| 0.0155|
-|[Llama3-70B-Instruct
+|[Llama3-70B-Instruct \[4bit\]](https://huggingface.co/unsloth/llama-3-70b-Instruct-bnb-4bit)|70B|0.5942|±| 0.011|
+|[Hermes-3-Theta-Llama-3-8B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B)|8B|0.5932|±|0.0155|
 |[Hermes-2-Theta-Llama-3-8B](https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B)|8B|0.5852|±|0.011|
 |[Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3)|7B|0.5753|±| 0.011|
 |[openchat-3.6-8b-20240522](https://huggingface.co/openchat/openchat-3.6-8b-20240522)|8B|0.5513|±|0.0111|
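For context on the Stderr column: below is a minimal sketch (not this repo's evaluation code) of how an "Accuracy ± Stderr" pair like the rows above can be derived from per-question results, assuming the usual binomial standard error sqrt(p·(1−p)/n). The exact question count and aggregation behind this table are not shown in the diff, so the sample data in the snippet is purely illustrative.

```python
# Minimal sketch: compute accuracy and its binomial standard error
# from a list of per-question correctness flags.
import math

def accuracy_with_stderr(results: list[bool]) -> tuple[float, float]:
    n = len(results)
    acc = sum(results) / n                    # fraction of questions answered correctly
    stderr = math.sqrt(acc * (1 - acc) / n)   # binomial standard error of the mean
    return acc, stderr

# Illustrative only: 2,000 dummy outcomes, 80% of them correct.
results = [i % 5 != 0 for i in range(2000)]
acc, se = accuracy_with_stderr(results)
print(f"{acc:.4f} ± {se:.4f}")   # -> 0.8000 ± 0.0089
```

Under this formula, standard errors around 0.0098–0.011 at accuracies of 0.6–0.74 correspond to a question count on the order of 2,000, which is consistent with most rows in the table, though the benchmark's exact size is not stated in this excerpt.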