---
license: llama2
tags:
- merge
- mergekit
---
# BETTER THAN GOLIATH?!
I've merged the [Euryale LoRA that I made](https://huggingface.co/ChuckMcSneed/Euryale-1.3-L2-70B-LORA) into [Xwin](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1), then merged the result with itself in a [goliath-style merge](/config.yml) using [mergekit](https://github.com/arcee-ai/mergekit). The resulting model beats [goliath](https://huggingface.co/alpindale/goliath-120b) on my tests (note: performance on tests does not necessarily translate to performance in practice). Test it and have fun with it. This is a sister model of [Premerge-EX-EX-123B](https://huggingface.co/ChuckMcSneed/Premerge-EX-EX-123B).
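The actual slice boundaries are in [config.yml](/config.yml); as a rough sketch, a goliath-style self-merge in mergekit uses the `passthrough` method to stack overlapping layer ranges of the same model. The layer ranges below are illustrative only, not the ones actually used:

```yaml
# Hypothetical goliath-style passthrough self-merge (illustrative layer ranges).
# Llama-2-70B has 80 transformer layers; overlapping slices inflate it past 120B.
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1+euryale-lora
        layer_range: [0, 40]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1+euryale-lora
        layer_range: [20, 60]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1+euryale-lora
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```

The `base+lora` path syntax tells mergekit to apply the LoRA to the base model before slicing, which is how the Euryale LoRA ends up baked into every copy of the stacked layers.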
# Ideas behind it
|Sao10K/Euryale-1.3-L2-70B            |Q6_K |70B |0 |2 |0 |3  |5   |10   |2 |8    |
|Sao10K/Euryale-1.3-L2-70B+xwin-lora  |Q6_K |70B |2 |2 |1 |5.5|5.5 |16   |5 |11   |
|Xwin-LM/Xwin-LM-70B-V0.1             |Q6_K |70B |0 |1 |2 |5.5|5.25|13.75|3 |10.75|
|Xwin-LM/Xwin-LM-70B-V0.1+euryale-lora|Q6_K |70B |3 |2 |2 |6  |5   |18   |7 |11   |