BigWeave Viable
Collection
All viable models from the BigWeave series, excluding quantized versions
A Goliath-120b style frankenmerge of Xwin-LM-70b-v0.1 and Euryale-1.3-70b. The goal is to find other merge combinations that work well.
The version number is only for me to keep track of the merges; only results that seem to work reasonably well are kept/published.
Prompting format: Vicuna and Alpaca.
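As a rough illustration, the two formats can be built like this. This is a sketch of the common community conventions for Vicuna- and Alpaca-style prompts; the exact system prompts are assumptions and are not specified by this card.

```python
# Hedged sketch of typical Vicuna and Alpaca prompt templates.
# The system-prompt wording below is the common community default,
# not something this model card confirms.

def vicuna_prompt(user_message: str) -> str:
    system = ("A chat between a curious user and an artificial intelligence "
              "assistant. The assistant gives helpful, detailed, and polite "
              "answers to the user's questions.")
    return f"{system} USER: {user_message} ASSISTANT:"

def alpaca_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )

print(vicuna_prompt("Write a haiku about merging."))
print(alpaca_prompt("Write a haiku about merging."))
```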
The models used in the merge are Xwin-LM-70b-v0.1 and Euryale-1.3-70b.
The layer mix:
- range 0, 12: Xwin
- range 9, 14: Euryale
- range 12, 62: Xwin
- range 54, 71: Euryale
- range 62, 80: Xwin
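For reference, a passthrough merge with this layer layout could be expressed as a mergekit config roughly like the following. This is a sketch, not the config actually used; the repository IDs (`Xwin-LM/Xwin-LM-70B-V0.1`, `Sao10K/Euryale-1.3-L2-70B`) and the `dtype` are assumptions.

```yaml
# Hypothetical mergekit config reproducing the layer ranges above.
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1   # assumed repo ID
        layer_range: [0, 12]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B  # assumed repo ID
        layer_range: [9, 14]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [12, 62]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [54, 71]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [62, 80]
merge_method: passthrough
dtype: float16  # assumption
```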
- @Xwin-LM for creating Xwin
- @Sao10K for creating Euryale
- @alpindale for creating the original Goliath
- @chargoddard for developing mergekit
Detailed results can be found here.
| Metric | Value |
|---|---|
| Avg. | 67.47 |
| AI2 Reasoning Challenge (25-Shot) | 65.36 |
| HellaSwag (10-Shot) | 87.21 |
| MMLU (5-Shot) | 68.04 |
| TruthfulQA (0-shot) | 57.96 |
| Winogrande (5-shot) | 81.69 |
| GSM8k (5-shot) | 44.58 |