satyamt committed
Commit 823b4fc
1 Parent(s): d2154a7

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -33,6 +33,8 @@ Medorca-2x7b is a Mixure of Experts (MoE) made with the following models:
  | HellaSwag | 76.04 | **76.19** | | | |
  | Winogrande | **74.51** | 73.48 | | | |

+ More details on the Open LLM Leaderboard evaluation results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medorca-2x7b).
+

  ## 🧩 Configuration

  ```yaml