changed license
see discussion with @mrfakename: https://huggingface.co/mlabonne/Beyonder-4x7B-v2/discussions/1#65986b41c6457161ca53eb00
README.md CHANGED

@@ -1,7 +1,13 @@
 ---
-license:
+license: other
+license_name: microsoft-research-license
+license_link: https://huggingface.co/WizardLM/WizardMath-7B-V1.1/resolve/main/LICENSE
 tags:
 - moe
+- openchat/openchat-3.5-1210
+- beowolx/CodeNinja-1.0-OpenChat-7B
+- maywell/PiVoT-0.1-Starling-LM-RP
+- WizardLM/WizardMath-7B-V1.1
 ---

 ![](https://i.imgur.com/vq1QHEA.jpg)
@@ -18,9 +24,9 @@ This model is a Mixture of Experts (MoE) made with [mergekit](https://github.com
 
 Thanks to TheBloke for the quantized models:
 
-* GGUF
-* AWQ
-* GPTQ
+* **GGUF**: https://huggingface.co/TheBloke/Beyonder-4x7B-v2-GGUF
+* **AWQ**: https://huggingface.co/TheBloke/Beyonder-4x7B-v2-AWQ
+* **GPTQ**: https://huggingface.co/TheBloke/Beyonder-4x7B-v2-GPTQ
 
 ## 🏆 Evaluation
 
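The diff above adds direct links to TheBloke's quantized builds. As a convenience, here is a minimal sketch of how one of those GGUF quants could be downloaded and run locally with `huggingface_hub` and `llama-cpp-python`; the exact `.gguf` filename is an assumption based on TheBloke's usual naming scheme, so check the repo's file listing before running.

```python
# Minimal sketch: fetch a GGUF quant of Beyonder-4x7B-v2 and run it locally.
# The filename below is assumed (TheBloke repos typically use
# <model-name>.<quant>.gguf); verify it against the repo's file list.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="TheBloke/Beyonder-4x7B-v2-GGUF",
    filename="beyonder-4x7b-v2.Q4_K_M.gguf",  # assumed quant filename
)

# Load the quantized model and run a short completion.
llm = Llama(model_path=model_path, n_ctx=4096)
output = llm("What is a Mixture of Experts model?", max_tokens=128)
print(output["choices"][0]["text"])
```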