---
license: llama2
language:
- en
pipeline_tag: conversational
tags:
- aurelian
- WinterGoddess
- frankenmerge
- 120b
- 32k
---
# BigAurelian v0.5 120b 32k

A Goliath-120b-style frankenmerge of aurelian-v0.5-70b-32K and WinterGoddess-1.4x-70b. The goal is performance similar to Goliath-120b, but with an extended 32k context. **Important:** Use a positional embeddings compression factor (**compress_pos_emb**) of **8** when loading this model.
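If you load the model with Hugging Face transformers instead of a UI that exposes **compress_pos_emb**, the equivalent setting is linear RoPE scaling with a factor of 8. A minimal sketch, assuming the transformers backend; the repo id below is a placeholder and the dtype/device settings are illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BigAurelian-v0.5-120b-32k"  # placeholder; substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)

# compress_pos_emb=8 corresponds to linear RoPE scaling with factor 8.0,
# stretching the Llama2 4k positional range to cover 32k tokens.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    rope_scaling={"type": "linear", "factor": 8.0},
    torch_dtype="auto",
    device_map="auto",
)
```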
# Prompting Format

Llama2-chat and Alpaca prompt formats are both supported; a minimal sketch of each follows.
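The system prompt and example instruction below are placeholders, not requirements of the model:

```python
# Llama2 chat format (single turn, with optional system prompt)
llama2_prompt = (
    "[INST] <<SYS>>\nYou are a helpful assistant.\n<</SYS>>\n\n"
    "Summarize the plot of Moby-Dick in two sentences. [/INST]"
)

# Alpaca instruction format
alpaca_prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "Summarize the plot of Moby-Dick in two sentences.\n\n"
    "### Response:\n"
)
```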
# Merge Process

The models used in the merge are [aurelian-v0.5-70b-32K](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K-fp16) and [WinterGoddess-1.4x-70b](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2).
The layer mix:

```yaml
- range 0, 16
  aurelian
- range 8, 24
  WinterGoddess
- range 17, 32
  aurelian
- range 25, 40
  WinterGoddess
- range 33, 48
  aurelian
- range 41, 56
  WinterGoddess
- range 49, 64
  aurelian
- range 57, 72
  WinterGoddess
- range 65, 80
  aurelian
```
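For anyone reproducing the merge, the mix above corresponds roughly to a mergekit passthrough configuration like the sketch below. This is not the author's original config file: mergekit's `layer_range` is half-open, so the exact boundaries may need adjusting, and the dtype is an assumption.

```yaml
# Approximate Goliath-style passthrough merge (sketch, boundaries assumed)
slices:
- sources:
  - model: grimulkan/aurelian-v0.5-70b-rope8-32K-fp16
    layer_range: [0, 16]
- sources:
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
    layer_range: [8, 24]
- sources:
  - model: grimulkan/aurelian-v0.5-70b-rope8-32K-fp16
    layer_range: [17, 32]
- sources:
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
    layer_range: [25, 40]
- sources:
  - model: grimulkan/aurelian-v0.5-70b-rope8-32K-fp16
    layer_range: [33, 48]
- sources:
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
    layer_range: [41, 56]
- sources:
  - model: grimulkan/aurelian-v0.5-70b-rope8-32K-fp16
    layer_range: [49, 64]
- sources:
  - model: Sao10K/WinterGoddess-1.4x-70B-L2
    layer_range: [57, 72]
- sources:
  - model: grimulkan/aurelian-v0.5-70b-rope8-32K-fp16
    layer_range: [65, 80]
merge_method: passthrough
dtype: float16
```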
# Acknowledgements

[@grimulkan](https://huggingface.co/grimulkan) for creating aurelian-v0.5-70b-32K.

[@Sao10K](https://huggingface.co/Sao10K) for creating WinterGoddess.

[@alpindale](https://huggingface.co/alpindale) for creating the original Goliath.

[@chargoddard](https://huggingface.co/chargoddard) for developing [mergekit](https://github.com/cg123/mergekit).