FallenMerick committed
Commit d7a3d04
1 Parent(s): c4a0867

Update README.md

Files changed (1)
  1. README.md +73 -73
README.md CHANGED
@@ -1,73 +1,73 @@
- ---
- license: cc-by-4.0
- language:
- - en
- base_model:
- - SanjiWatsuki/Kunoichi-7B
- - SanjiWatsuki/Silicon-Maid-7B
- - KatyTheCutie/LemonadeRP-4.5.3
- - Sao10K/Fimbulvetr-11B-v2
- library_name: transformers
- tags:
- - mergekit
- - merge
- - mistral
- - text-generation
- - roleplay
-
- ---
-
- ![cute](https://huggingface.co/FallenMerick/Chunky-Lemon-Cookie-11B/resolve/main/Chewy-Lemon-Cookie.png)
-
- # Chewy-Lemon-Cookie-11B
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the following methods:
- * passthrough
- * [task arithmetic](https://arxiv.org/abs/2212.04089)
-
- ### Models Merged
-
- The following models were included in the merge:
- * [SanjiWatsuki/Kunoichi-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-7B)
- * [SanjiWatsuki/Silicon-Maid-7B](https://huggingface.co/SanjiWatsuki/Silicon-Maid-7B)
- * [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3)
- * [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
-
- ### Configuration
-
- The following YAML configurations were used to produce this model:
-
- ```yaml
- slices:
-   - sources:
-       - model: SanjiWatsuki/Kunoichi-7B
-         layer_range: [0, 24]
-   - sources:
-       - model: SanjiWatsuki/Silicon-Maid-7B
-         layer_range: [8, 24]
-   - sources:
-       - model: KatyTheCutie/LemonadeRP-4.5.3
-         layer_range: [24, 32]
- merge_method: passthrough
- dtype: bfloat16
- name: Big-Lemon-Cookie-11B-BF16
-
- ---
-
- models:
-   - model: Big-Lemon-Cookie-11B-BF16
-     parameters:
-       weight: 0.85
-   - model: Sao10K/Fimbulvetr-11B-v2
-     parameters:
-       weight: 0.15
- merge_method: task_arithmetic
- base_model: Big-Lemon-Cookie-11B-BF16
- dtype: bfloat16
- name: Chewy-Lemon-Cookie-11B
- ```
 
+ ---
+ license: cc-by-4.0
+ language:
+ - en
+ base_model:
+ - SanjiWatsuki/Kunoichi-7B
+ - SanjiWatsuki/Silicon-Maid-7B
+ - KatyTheCutie/LemonadeRP-4.5.3
+ - Sao10K/Fimbulvetr-11B-v2
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+ - mistral
+ - text-generation
+ - roleplay
+
+ ---
+
+ ![cute](https://huggingface.co/FallenMerick/Chewy-Lemon-Cookie-11B/resolve/main/Chewy-Lemon-Cookie.png)
+
+ # Chewy-Lemon-Cookie-11B
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the following methods:
+ * passthrough
+ * [task arithmetic](https://arxiv.org/abs/2212.04089)
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [SanjiWatsuki/Kunoichi-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-7B)
+ * [SanjiWatsuki/Silicon-Maid-7B](https://huggingface.co/SanjiWatsuki/Silicon-Maid-7B)
+ * [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3)
+ * [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
+
+ ### Configuration
+
+ The following YAML configurations were used to produce this model:
+
+ ```yaml
+ slices:
+   - sources:
+       - model: SanjiWatsuki/Kunoichi-7B
+         layer_range: [0, 24]
+   - sources:
+       - model: SanjiWatsuki/Silicon-Maid-7B
+         layer_range: [8, 24]
+   - sources:
+       - model: KatyTheCutie/LemonadeRP-4.5.3
+         layer_range: [24, 32]
+ merge_method: passthrough
+ dtype: bfloat16
+ name: Big-Lemon-Cookie-11B-BF16
+
+ ---
+
+ models:
+   - model: Big-Lemon-Cookie-11B-BF16
+     parameters:
+       weight: 0.85
+   - model: Sao10K/Fimbulvetr-11B-v2
+     parameters:
+       weight: 0.15
+ merge_method: task_arithmetic
+ base_model: Big-Lemon-Cookie-11B-BF16
+ dtype: bfloat16
+ name: Chewy-Lemon-Cookie-11B
+ ```
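
For readers who want to see what the two-stage recipe in the diff above actually computes, here is a minimal illustrative sketch. It is not mergekit's implementation: it ignores per-tensor bookkeeping, tokenizer handling, and mergekit's optional weight normalization, and the function and variable names are made up for the example. The passthrough stage stacks 24 + 16 + 8 = 48 decoder layers from the three 7B donors, which is roughly where the 11B frankenmerge size comes from; the task-arithmetic stage then combines each parameter as base + sum over models of weight * (model - base), per the linked paper.

```python
# Illustrative sketch of the task_arithmetic step -- not mergekit's code.
# Each model is stood in for by a flat list of floats (one toy "tensor").
# Names (task_arithmetic, big_lemon_cookie, fimbulvetr) are hypothetical.

def task_arithmetic(base, weighted_models):
    """merged[i] = base[i] + sum_k weight_k * (model_k[i] - base[i])"""
    merged = []
    for i, b in enumerate(base):
        delta = sum(w * (m[i] - b) for m, w in weighted_models)
        merged.append(b + delta)
    return merged

# Toy parameter values; in the real merge these are bfloat16 tensors.
big_lemon_cookie = [0.10, -0.20, 0.30]   # also the base_model, weight 0.85
fimbulvetr       = [0.05, -0.10, 0.40]   # weight 0.15

merged = task_arithmetic(
    base=big_lemon_cookie,
    weighted_models=[(big_lemon_cookie, 0.85), (fimbulvetr, 0.15)],
)
print(merged)  # each value = base + 0.15 * (fimbulvetr - base),
               # since the base model's own task vector is zero
```

Because Big-Lemon-Cookie-11B-BF16 is both the base model and the 0.85-weight entry, its task vector is zero, so the merge effectively nudges the frankenmerge a fraction of the way toward Fimbulvetr-11B-v2 (before any normalization mergekit may apply).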