---
base_model: []
tags:
- mergekit
- merge

---
# Steelskull/Etheria-55b-v0.1

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/RAhrbktyyVQxOR1np-9L2.png)

## Merge Details

An attempt to make a functional Goliath-style merge: an Etheria 55b-200k built from two Yi-34B-200K models.

Due to the merge, it should 'theoretically' retain a 200k context window, but I recommend starting at 32k and moving up,
as it is unknown (at this time) what the merge has done to the effective context length. A conservative starting point is sketched below.
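
For example, a minimal transformers snippet that caps the prompt at a conservative 32k tokens; the repo id is assumed to match this card, and `device_map`/`torch_dtype` choices depend on your hardware:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Steelskull/Etheria-55b-v0.1"  # assumed to match this repo
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

prompt = "Write a short story about two models merging into one."
# Hold the input to a conservative 32k-token window until longer
# contexts have been verified to work on this merge.
inputs = tokenizer(
    prompt, return_tensors="pt", truncation=True, max_length=32768
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```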

This is a merge of both VerA and VerB of Etheria-55b (their numbers were surprisingly good). I then created a sacrificial 55B out of the most performant Yi-34B-200K model
and performed a DARE-TIES merge, equalizing the model into its current state.

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) merge method using Merged-Etheria-55b as a base.
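
For intuition, here is a minimal numpy sketch of what DARE-TIES does per tensor. It is a toy illustration of the two papers' ideas, not mergekit's actual implementation, and every name and value in it is made up:

```python
import numpy as np

def dare_ties(base, deltas, weights, density, seed=0):
    """Toy per-tensor DARE-TIES on flat numpy arrays (illustration only)."""
    rng = np.random.default_rng(seed)
    pruned = []
    for delta in deltas:
        # DARE: randomly drop (1 - density) of each task vector, then
        # rescale the survivors by 1/density to preserve expected magnitude.
        mask = rng.random(delta.shape) < density
        pruned.append(np.where(mask, delta / density, 0.0))
    # TIES sign election: take the majority sign of the weighted deltas.
    stacked = np.stack([w * d for w, d in zip(weights, pruned)])
    elected = np.sign(stacked.sum(axis=0))
    # Keep only components that agree with the elected sign, then average.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged = (stacked * agree).sum(axis=0) / counts
    return base + merged

# Two toy "task vectors" merged into a base tensor at this card's density.
base = np.zeros(4)
d1 = np.array([0.20, -0.10, 0.30, 0.00])
d2 = np.array([0.10, 0.20, -0.30, 0.40])
print(dare_ties(base, [d1, d2], weights=[0.5, 0.5], density=0.61))
```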

### Configuration

The following YAML configuration was used to produce this model:

```yaml

base_model: Merged-Etheria-55b
models:
  - model: Sacr-Etheria-55b
    parameters:
      weight: [0.22, 0.113, 0.113, 0.113, 0.113, 0.113]
      density: 0.61
  - model: Merged-Etheria-55b
    parameters:
      weight: [0.22, 0.113, 0.113, 0.113, 0.113, 0.113]
      density: 0.61
merge_method: dare_ties
tokenizer_source: union
parameters:
  int8_mask: true
dtype: bfloat16

```
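
To reproduce a merge of this shape, a config like the one above can be fed to mergekit's `mergekit-yaml` entry point, e.g. `mergekit-yaml config.yml ./output-model-dir --cuda`; paths and flags here are illustrative, so check the mergekit docs for your version.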