Burn Oil committed on
Commit 18905b4 • 1 parent: b6348ae

Update README.md

Files changed (1): README.md (+44, −41)
README.md CHANGED
@@ -1,41 +1,44 @@
- ---
- base_model:
- - Sao10K/Fimbulvetr-11B-v2
- - Undi95/Mistral-11B-CC-Air-RP
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # merge
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the passthrough merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
- * [Undi95/Mistral-11B-CC-Air-RP](https://huggingface.co/Undi95/Mistral-11B-CC-Air-RP)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
- - sources:
-   - model: Sao10K/Fimbulvetr-11B-v2
-     layer_range: [0, 40]
- - sources:
-   - model: Undi95/Mistral-11B-CC-Air-RP
-     layer_range: [8, 48]
- merge_method: passthrough
- dtype: bfloat16
-
- ```
+ ---
+ base_model:
+ - Sao10K/Fimbulvetr-11B-v2
+ - Undi95/Mistral-11B-CC-Air-RP
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+ - 👍
+ ---
+ # Fimbul-Airo-18B 👍
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). 👍
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method. Taking a buncha models and smashing 'em all together 👍
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2) 👍
+ * [Undi95/Mistral-11B-CC-Air-RP](https://huggingface.co/Undi95/Mistral-11B-CC-Air-RP) 👍
+
+ ### The Sauce
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ slices:
+ - sources:
+   - model: Sao10K/Fimbulvetr-11B-v2
+     layer_range: [0, 40]
+ - sources:
+   - model: Undi95/Mistral-11B-CC-Air-RP
+     layer_range: [8, 48]
+ merge_method: passthrough
+ dtype: bfloat16
+
+ ```
+ 👍
+
+ Don't forget to take care of yourself and have a wonderful day!
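
For intuition, the slice arithmetic in the config can be sketched in plain Python. This is a toy illustration, not mergekit's actual implementation: each tuple merely labels which source model supplies a given decoder layer, showing that the two 40-layer slices stack into an 80-layer model (consistent with the ~18B size in the merge's name).

```python
# Toy sketch of mergekit's passthrough method (illustration only, not the
# real implementation): each slice contributes its layer_range, and the
# slices are concatenated in the order they appear in the config.
def passthrough_stack(slices):
    """Return (model, layer_index) labels for the merged layer stack."""
    merged = []
    for model, (start, end) in slices:
        merged.extend((model, i) for i in range(start, end))
    return merged

# The slices from the YAML config above.
merged = passthrough_stack([
    ("Sao10K/Fimbulvetr-11B-v2", (0, 40)),
    ("Undi95/Mistral-11B-CC-Air-RP", (8, 48)),
])

print(len(merged))  # 80 layers in the merged stack
print(merged[0], merged[40])
```

Note that layers 8–39 of each base appear twice in the stack (once from each slice), which is how passthrough merges grow parameter count rather than averaging weights.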