I'm going to compare DARE merges using this (mostly vanilla, alpaca-tinted) 20b model vs using Harmonia.


slices:
  - sources:
      - model: athirdpath/alpaca-2-13b-english_full-model
        layer_range: [0, 16]
  - sources:
      - model: TheBloke/Llama-2-13B-fp16
        layer_range: [8, 24]
  - sources:
      - model: athirdpath/alpaca-2-13b-english_full-model
        layer_range: [17, 32]
  - sources:
      - model: TheBloke/Llama-2-13B-fp16
        layer_range: [25, 40]
merge_method: passthrough

dtype: float16
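
As a rough sanity check on the "20b" label, here is a small sketch that counts the layers in the passthrough stack and scales Llama-2-13B's parameter count accordingly. It assumes mergekit's half-open `layer_range` convention (`[0, 16]` means layers 0 through 15) and treats the 13B base as approximately 13.0B parameters across 40 layers; both figures are approximations, not exact values.

```python
# Count layers in the passthrough stack and estimate the merged model's size.
BASE_LAYERS = 40       # Llama-2-13B hidden layers
BASE_PARAMS = 13.0e9   # approximate parameter count of the 13B base

# The four layer_range slices from the config above (half-open intervals).
slices = [(0, 16), (8, 24), (17, 32), (25, 40)]

merged_layers = sum(end - start for start, end in slices)
approx_params = BASE_PARAMS * merged_layers / BASE_LAYERS

print(merged_layers)   # 62 layers in the merged model
print(approx_params)   # roughly 20e9, hence the "20b" label
```

The overlapping ranges (layers 8–15 and 25–31 appear twice) are what inflate the 40-layer base to 62 layers.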