Update README.md
README.md (changed)
@@ -3,15 +3,13 @@ license: apache-2.0
 tags:
 - moe
 - merge
-- mergekit
-- lazymergekit
 - epfl-llm/meditron-7b
 - microsoft/Orca-2-7b
 ---
 
 # Medstral-7B
 
-Medstral-7B is a Mixure of Experts (MoE) made with the following models
+Medstral-7B is a Mixure of Experts (MoE) made with the following models:
 * [epfl-llm/meditron-7b](https://huggingface.co/epfl-llm/meditron-7b)
 * [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b)
 
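Since the card describes a merged MoE built from meditron-7b and Orca-2-7b, such model cards are usually paired with a usage snippet. Below is a minimal, hypothetical sketch of loading a model like this with the Hugging Face transformers library; the repository id "Medstral-7B" is a placeholder, as the full hub path is not shown in this commit.

```python
# Hypothetical usage sketch for a merged MoE model card such as Medstral-7B.
# The repo id below is a placeholder, not confirmed by the commit above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Medstral-7B"  # assumption: replace with the actual hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Run a short generation to sanity-check the merged model.
prompt = "What are the first-line treatments for hypertension?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```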