---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---
# zen
zen is a Mixture of Experts (MoE) merge of [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210) and [berkeley-nest/Starling-LM-7B-alpha](https://huggingface.co/berkeley-nest/Starling-LM-7B-alpha), created with mergekit-moe.
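
The exact merge configuration is not included in this card; the sketch below only illustrates the general shape of a two-expert mergekit-moe config and a typical way to run it. The `positive_prompts`, output path, and CLI invocation are placeholders/assumptions and should be checked against the mergekit documentation.

```python
# Illustrative sketch only -- NOT the exact configuration used for zen.
# Field names follow the mergekit-moe YAML schema; prompts and paths are placeholders.
import subprocess
from pathlib import Path

config = """\
base_model: openchat/openchat-3.5-1210
gate_mode: hidden            # initialise the router from hidden-state representations of the prompts below
dtype: bfloat16
experts:
  - source_model: openchat/openchat-3.5-1210
    positive_prompts: ["general chat", "coding help"]        # placeholder routing prompts
  - source_model: berkeley-nest/Starling-LM-7B-alpha
    positive_prompts: ["helpful reasoning", "instructions"]   # placeholder routing prompts
"""
Path("zen-moe.yml").write_text(config)

# Assumed invocation shape: `mergekit-moe <config> <output-dir>`; verify the flags for your mergekit version.
subprocess.run(["mergekit-moe", "zen-moe.yml", "./zen"], check=True)
```
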
You can prompt the model using the ChatML format.
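
A minimal generation sketch with 🤗 Transformers is shown below. The repo id is a placeholder (this card does not state the final model path), and the snippet assumes the tokenizer ships a ChatML chat template; if it does not, the prompt can be written out by hand in the ChatML form shown in the comments.

```python
# Minimal sketch, assuming the merged model is published on the Hub under a placeholder repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/zen"  # placeholder -- replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# ChatML turns look like:
# <|im_start|>user
# Explain mixture-of-experts routing in one paragraph.<|im_end|>
# <|im_start|>assistant
messages = [
    {"role": "user", "content": "Explain mixture-of-experts routing in one paragraph."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # append the opening assistant tag so the model continues from it
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
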
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Detailed results: coming soon.

| Metric               | Value       |
|----------------------|-------------|
| Avg.                 | Coming soon |
| ARC (25-shot)        | Coming soon |
| HellaSwag (10-shot)  | Coming soon |
| MMLU (5-shot)        | Coming soon |
| TruthfulQA (0-shot)  | Coming soon |
| Winogrande (5-shot)  | Coming soon |
| GSM8K (5-shot)       | Coming soon |