Update README.md
README.md CHANGED

````diff
@@ -337,9 +337,7 @@ output = generate(
 print(output[0]["generated_text"])
 ```
 
-##
-
-Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1)
+## Old Open LLM Leaderboard Evaluation Results
 
 | Metric                          |Value|
 |---------------------------------|----:|
@@ -351,7 +349,7 @@ Detailed results can be found [here](https://huggingface.co/datasets/open-llm-le
 |Winogrande (5-shot)              |51.30|
 |GSM8k (5-shot)                   | 0.00|
 
-
+## [New Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard)
 Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Llama-160M-Chat-v1)
 
 | Metric                          |Value|
````