---
tags:
- merge
- mergekit
- mistral
- fhai50032/RolePlayLake-7B
base_model:
- fhai50032/RolePlayLake-7B
---
# Mistral-4B
## 🧩 Configuration
```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 8]
    model: fhai50032/RolePlayLake-7B
- sources:
  - layer_range: [11, 12]
    model: fhai50032/RolePlayLake-7B
- sources:
  - layer_range: [15, 16]
    model: fhai50032/RolePlayLake-7B
- sources:
  - layer_range: [19, 20]
    model: fhai50032/RolePlayLake-7B
- sources:
  - layer_range: [24, 25]
    model: fhai50032/RolePlayLake-7B
- sources:
  - layer_range: [28, 32]
    model: fhai50032/RolePlayLake-7B
```
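
The passthrough merge above keeps only a subset of RolePlayLake-7B's 32 decoder layers, which is what shrinks the 7B donor down toward the ~4B range. As a quick sanity check (a minimal sketch, assuming mergekit's usual half-open `[start, end)` interpretation of `layer_range`), the merged model's layer count can be computed directly from the slice ranges:

```python
# Slice ranges copied from the merge config above.
# With half-open ranges, [0, 8] contributes layers 0..7 (8 layers).
slices = [(0, 8), (11, 12), (15, 16), (19, 20), (24, 25), (28, 32)]

# Total decoder layers in the merged model.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 16 of the donor model's 32 layers
```

Half of the original depth is retained, concentrated in the early (0-7) and late (28-31) layers, with single bridging layers sampled in between.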