xxx777xxxASD committed
Commit: 1f483f3
Parent(s): 204dac5
Update README.md
README.md CHANGED
@@ -9,7 +9,7 @@ language:
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f5e51289c121cb864ba464/m5urYkrpE5amrwHyaVwFM.png)
 
 > [!IMPORTANT]
-> [GGUF / Exl2](https://huggingface.co/collections/xxx777xxxASD/chaoticsoliloquy-v15-4x8b-6633f96430c0652a8ad527a3)
+> [GGUF / Exl2 quants](https://huggingface.co/collections/xxx777xxxASD/chaoticsoliloquy-v15-4x8b-6633f96430c0652a8ad527a3)
 
 Experimental RP-oriented MoE, the idea was to get a model that would be equal to or better than the Mixtral 8x7B and it's finetunes in RP/ERP tasks.
 Im not sure but it should be better than the [first version](https://huggingface.co/xxx777xxxASD/ChaoticSoliloquy-4x8B)