---
license: llama2
language:
- en
pipeline_tag: conversational
tags:
- aurelian
- WinterGoddess
- frankenmerge
- 120b
- 32k
---
# BigAurelian v0.5 120b 32k

A Goliath-120b-style frankenmerge of aurelian-v0.5-70b-32K and WinterGoddess-1.4x-70b. The goal is similar performance to Goliath, but with an extended 32k context.

**Important:** use a positional-embeddings compression factor (`compress_pos_emb`) of 8 when loading this model.
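The compression factor works by dividing position indices before the rotary embedding is computed, so a model trained on a 4k schedule can address 32k positions. A minimal sketch of that idea (function and parameter names here are illustrative, not this model's actual code):

```python
import math

def rope_angles(position, head_dim=8, base=10000.0, compress_factor=1.0):
    """Rotary-embedding angles for one token position.

    compress_factor divides the position index, which is the effect of
    compress_pos_emb: a factor of 8 stretches the base model's rope
    schedule across an 8x longer context.
    """
    pos = position / compress_factor
    return [pos * base ** (-2 * i / head_dim) for i in range(head_dim // 2)]

# With a compression factor of 8, position 8000 produces the same
# angles the base schedule assigns to position 1000.
assert rope_angles(8000, compress_factor=8.0) == rope_angles(1000)
```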
## Prompting Format

Llama2 and Alpaca.
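For reference, the two formats look roughly like this. This is a sketch: the system/instruction text is up to you, and the template strings below are the commonly used forms, not ones verified against this model's tokenizer config.

```python
# Llama2-chat style: system prompt wrapped in <<SYS>> inside [INST] tags.
LLAMA2_TEMPLATE = "[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{prompt} [/INST]"

# Alpaca style: instruction/response sections separated by ### headers.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

print(ALPACA_TEMPLATE.format(instruction="Summarize the merge process."))
```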
## Merge process

The models used in the merge are aurelian-v0.5-70b-32K and WinterGoddess-1.4x-70b.
The layer mix:

- range [0, 16]: aurelian
- range [8, 24]: WinterGoddess
- range [17, 32]: aurelian
- range [25, 40]: WinterGoddess
- range [33, 48]: aurelian
- range [41, 56]: WinterGoddess
- range [49, 64]: aurelian
- range [57, 72]: WinterGoddess
- range [65, 80]: aurelian
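Expressed as a mergekit recipe, the mix above would look roughly like this (a sketch assuming mergekit's passthrough method, which Goliath-style merges use; the model paths are placeholders, not verified repo IDs):

```yaml
# Hypothetical mergekit config for the layer mix above.
slices:
  - sources:
      - model: aurelian-v0.5-70b-32K       # placeholder path
        layer_range: [0, 16]
  - sources:
      - model: WinterGoddess-1.4x-70b      # placeholder path
        layer_range: [8, 24]
  - sources:
      - model: aurelian-v0.5-70b-32K
        layer_range: [17, 32]
  - sources:
      - model: WinterGoddess-1.4x-70b
        layer_range: [25, 40]
  - sources:
      - model: aurelian-v0.5-70b-32K
        layer_range: [33, 48]
  - sources:
      - model: WinterGoddess-1.4x-70b
        layer_range: [41, 56]
  - sources:
      - model: aurelian-v0.5-70b-32K
        layer_range: [49, 64]
  - sources:
      - model: WinterGoddess-1.4x-70b
        layer_range: [57, 72]
  - sources:
      - model: aurelian-v0.5-70b-32K
        layer_range: [65, 80]
merge_method: passthrough
dtype: float16
```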
## Acknowledgements

- @grimulkan for creating Aurelian
- @Sao10K for creating WinterGoddess
- @alpindale for creating the original Goliath
- @chargoddard for developing mergekit