This is my first Llama 3 MoE model, built with the following config:
```yaml
base_model: Llama-3-RPMerge-8B-SLERP
experts:
  - source_model: Llama-3-RPMerge-8B-SLERP
  - source_model: WesPro_Daring_Llama
  - source_model: Chaos_RP_l3_8B
  - source_model: llama-3-stinky-8B
```
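A config like the one above can be built into a model with mergekit's MoE script. This is a minimal sketch, assuming mergekit is installed and the config is saved as `config.yml` (both the filename and output directory are placeholders):

```shell
# Install mergekit (provides the mergekit-moe entry point)
pip install mergekit

# Build the MoE model from the config; writes the merged
# weights to ./output-model-directory
mergekit-moe config.yml ./output-model-directory
```

Note that mergekit-moe configs typically also specify a `gate_mode` and per-expert `positive_prompts` to route tokens; those are omitted here because they aren't shown in the config above.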
It's meant for RP and does pretty well at it, but I haven't tested it extensively yet.