automerger committed
Commit: ea04667
Parent(s): 071a1e7

Upload folder using huggingface_hub

Files changed (1): README.md (+6 −15)
README.md CHANGED
@@ -5,31 +5,22 @@ tags:
 - mergekit
 - lazymergekit
 - automerger
-base_model:
-- AurelPx/Percival_01-7b-slerp
 ---
 
 # Experiment26Percival_01-7B
 
 Experiment26Percival_01-7B is an automated merge created by [Maxime Labonne](https://huggingface.co/mlabonne) using the following configuration.
-* [AurelPx/Percival_01-7b-slerp](https://huggingface.co/AurelPx/Percival_01-7b-slerp)
 
 ## 🧩 Configuration
 
 ```yaml
 models:
-  - model: yam-peleg/Experiment26-7B
-    # No parameters necessary for base model
-  - model: AurelPx/Percival_01-7b-slerp
-    parameters:
-      density: 0.53
-      weight: 0.6
-merge_method: dare_ties
-base_model: yam-peleg/Experiment26-7B
-parameters:
-  int8_mask: true
-dtype: bfloat16
-random_seed: 0
+  - model: mistralai/Mistral-7B-v0.1
+  - model: yam-peleg/Experiment26-7B
+  - model: AurelPx/Percival_01-7b-slerp
+merge_method: model_stock
+base_model: mistralai/Mistral-7B-v0.1
+dtype: bfloat16
 ```
 
 ## 💻 Usage
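For context on what the updated configuration computes: `model_stock` interpolates between the average of the fine-tuned models' weights and the base model's weights, with a ratio derived from the angle between the models' task vectors. The sketch below is a loose illustration of that idea in plain Python, not mergekit's implementation; the helper name `model_stock_merge` and the flattened-list weight representation are assumptions for the example, and the ratio formula follows the closed form from the Model Stock paper.

```python
def model_stock_merge(base, finetuned, cos_theta):
    """Loose sketch of the model_stock idea (hypothetical helper, NOT
    mergekit's API): blend the average of the fine-tuned weights with
    the base weights.

    base      -- base-model weights as a flat list of floats
    finetuned -- list of fine-tuned models' weights (same length as base)
    cos_theta -- average cosine of the angle between task vectors
    """
    n = len(finetuned)
    # Element-wise average of the fine-tuned models' weights.
    avg = [sum(ws) / n for ws in zip(*finetuned)]
    # Interpolation ratio t from the Model Stock closed form:
    # t = N*cos(theta) / (1 + (N-1)*cos(theta)).
    t = n * cos_theta / (1 + (n - 1) * cos_theta)
    # Merged weights: t toward the fine-tuned average, (1-t) toward base.
    return [t * a + (1 - t) * b for a, b in zip(avg, base)]
```

In practice a configuration like the one in this commit is applied with mergekit's `mergekit-yaml` CLI rather than hand-rolled code; the sketch only shows why the base model (`mistralai/Mistral-7B-v0.1` here) matters to the result.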