Update README.md
README.md CHANGED

@@ -109,7 +109,7 @@ model-index:
 
 # MixTAO-7Bx2-MoE
 
-MixTAO-7Bx2-MoE is a
+MixTAO-7Bx2-MoE is a Mixture of Experts (MoE).
 This model is mainly used for large model technology experiments, and increasingly perfect iterations will eventually create high-level large language models.
 
 ### 🦒 Colab
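The new intro line identifies the model as a Mixture of Experts (MoE), published as a Hugging Face model card. For context, here is a minimal sketch of loading such a model with the `transformers` library; the repository id used below is an assumption based on the model name, not something stated in this diff.

```python
# Minimal sketch: loading a causal LM such as MixTAO-7Bx2-MoE from the Hub.
# NOTE: the repo id is assumed for illustration; check the actual model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zhengr/MixTAO-7Bx2-MoE-v8.1"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" (requires accelerate) spreads the MoE layers across
# available devices, since a 7Bx2 mixture is large for a single GPU.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("What is a Mixture of Experts model?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```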