Update README.md
README.md
CHANGED
@@ -15,7 +15,7 @@ tags:
 # Qwen1.5-MoE-2x7B

 ## Description

-This model is created using MoE (Mixture of Experts) through mergekit based on [Qwen/Qwen1.5-7B-Chat](https://huggingface.co/Qwen/Qwen1.5-7B-Chat) and [abacusai/Liberated-Qwen1.5-7B](https://huggingface.co/abacusai/Liberated-Qwen1.5-7B).
+This model is created using MoE (Mixture of Experts) through mergekit based on [Qwen/Qwen1.5-7B-Chat](https://huggingface.co/Qwen/Qwen1.5-7B-Chat) and [abacusai/Liberated-Qwen1.5-7B](https://huggingface.co/abacusai/Liberated-Qwen1.5-7B) without further fine-tuning (FT).

 It utilizes a customized script for MoE via mergekit, which is available [here](https://github.com/Aratako/mergekit-qwen2).
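For reference, a merge produced this way should load like any other Hugging Face causal LM. The sketch below is a minimal, hypothetical usage example: the repository id `Aratako/Qwen1.5-MoE-2x7B` is an assumed placeholder, the chat template is assumed to carry over from Qwen/Qwen1.5-7B-Chat, and depending on how the merged MoE architecture is serialized the load may additionally require `trust_remote_code=True`.

```python
# Minimal usage sketch (assumptions: repo id is a placeholder, chat template
# inherited from Qwen/Qwen1.5-7B-Chat).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aratako/Qwen1.5-MoE-2x7B"  # placeholder: substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # requires `accelerate`; spreads layers over available devices
)

# Qwen1.5-Chat style conversation formatting (assumed to carry over to the merge).
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what a Mixture of Experts model is in two sentences."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```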