Update README.md
README.md CHANGED
@@ -7,7 +7,10 @@ Roleplaying focused MoE Mistral model.
 
 One expert is a merge of mostly RP models, the other is a merge of mostly storywriting models. So it should be good at both. The base model is SanjiWatsuki/Kunoichi-DPO-v2-7B.
 
-
+- Expert 1 is a merge of LimaRP, Limamono, Noromaid 0.4 DPO and good-robot.
+- Expert 2 is a merge of Erebus, Holodeck, Dans-AdventurousWinds-Mk2, Opus, Ashhwriter and good-robot.
+
+## Prompt template (LimaRP):
 
 ```
 ### Instruction:
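
For anyone who wants to try the prompt format this change documents, here is a minimal, hypothetical sketch of loading the model with Hugging Face Transformers and generating from an instruction-style prompt. The repository id is a placeholder (the excerpt above does not name the repo), and the prompt string only starts the template, since the diff shows just its first line.

```python
# Minimal usage sketch, not from the model card: repo id is a placeholder, and the
# prompt must be completed following the LimaRP template documented in the README.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/your-moe-rp-model"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

# Start of a LimaRP-style prompt; fill in the rest per the template in the README.
prompt = "### Instruction:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)

# Decode and print only the newly generated tokens.
new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```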