Update README.md
README.md
---
library_name: transformers
language:
- ko
base_model:
- meta-llama/Llama-3.1-8B
- NCSOFT/Llama-VARCO-8B-Instruct
- akjindal53244/Llama-3.1-Storm-8B
pipeline_tag: text-generation
---

# 🤖 LLM Evolutionary Merge

🤗 [Model](https://huggingface.co/fiveflow/LLMEvoLLaMA-3.1-8B-v0.1) | 📂 [GitHub](https://github.com/kwon13/LLM-Evo-Merge) | ✍️ Blog (in progress) | 💡 [Inspired by Sakana AI](https://github.com/SakanaAI/evolutionary-model-merge)

![robot](./assets/robot.jpeg)

This project takes a novel approach to optimizing model merges: it integrates an LLM into the evolutionary strategy itself (sketched below). Instead of using [CMA-ES](https://en.wikipedia.org/wiki/CMA-ES), it [leverages the search capabilities of LLMs](https://arxiv.org/abs/2402.18381) to explore the parameter space more efficiently, narrowing the search around high-performing solutions.

Currently the project supports optimization only in the Parameter Space, but I plan to extend it to merging and optimization in the Data Flow Space as well, which should further improve merges by optimizing the interaction between data flow and parameters.

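To make the loop concrete, here is a minimal, self-contained sketch of LLM-guided evolutionary search over per-slice merge weights. Everything in it is hypothetical: a toy objective stands in for the real merge-and-benchmark fitness, and a crossover-plus-noise stub stands in for the LLM proposal call (the actual implementation lives in the GitHub repo).

```python
# Hypothetical sketch of LLM-guided evolutionary search over merge weights.
import random

POP_SIZE, N_GENERATIONS, N_SLICES = 8, 10, 16  # 16 two-layer slices, as in the recipe below

def evaluate(weights):
    """Placeholder fitness. In the real loop this would merge the models
    with these per-slice weights and score the merge on a benchmark."""
    return -sum((w - 0.5) ** 2 for w in weights)  # toy objective

def llm_propose(elites):
    """Stand-in for the LLM step: the top-scoring recipes (and their scores)
    would be shown to an LLM, which proposes a new candidate near them.
    Here simple crossover plus Gaussian noise substitutes for that call."""
    a, b = random.sample(elites, 2)
    child = [random.choice(pair) + random.gauss(0.0, 0.05) for pair in zip(a, b)]
    return [min(1.0, max(0.0, w)) for w in child]  # clamp weights to [0, 1]

population = [[random.random() for _ in range(N_SLICES)] for _ in range(POP_SIZE)]
for _ in range(N_GENERATIONS):
    population.sort(key=evaluate, reverse=True)
    elites = population[: POP_SIZE // 2]  # keep the best recipes
    population = elites + [llm_propose(elites) for _ in range(POP_SIZE - len(elites))]

print([round(w, 3) for w in max(population, key=evaluate)])
```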
## Performance

I focused on creating a high-performing Korean model solely through merging, without additional model training.

<details>
<summary>Merging Recipe</summary>

```YAML
base_model: meta-llama/Llama-3.1-8B
dtype: bfloat16
merge_method: task_arithmetic
allow_negative_weights: true
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 2]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 1
  - layer_range: [0, 2]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.3475802891062396
  - layer_range: [0, 2]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [2, 4]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.8971381657317269
  - layer_range: [2, 4]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.45369921781118544
  - layer_range: [2, 4]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [4, 6]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.5430828084884667
  - layer_range: [4, 6]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.2834723715836387
  - layer_range: [4, 6]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [6, 8]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.419043948030593
  - layer_range: [6, 8]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.3705268601566145
  - layer_range: [6, 8]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [8, 10]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.3813333860404775
  - layer_range: [8, 10]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.7634501436288518
  - layer_range: [8, 10]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [10, 12]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.49134830660275863
  - layer_range: [10, 12]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.7211994938499454
  - layer_range: [10, 12]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [12, 14]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.9218963071448836
  - layer_range: [12, 14]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.5117022419864319
  - layer_range: [12, 14]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [14, 16]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.8238938467581831
  - layer_range: [14, 16]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.851712316016478
  - layer_range: [14, 16]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [16, 18]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.3543028846914006
  - layer_range: [16, 18]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.6864368345788241
  - layer_range: [16, 18]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [18, 20]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.9189961100847883
  - layer_range: [18, 20]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.5800251781306379
  - layer_range: [18, 20]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [20, 22]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.9281691677008521
  - layer_range: [20, 22]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.5356892784211416
  - layer_range: [20, 22]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [22, 24]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.839268407952539
  - layer_range: [22, 24]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.5082186376599986
  - layer_range: [22, 24]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [24, 26]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.6241902192095534
  - layer_range: [24, 26]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.2945221540685877
  - layer_range: [24, 26]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [26, 28]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.7030728026501202
  - layer_range: [26, 28]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.2350478509634181
  - layer_range: [26, 28]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [28, 30]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 0.2590342230366074
  - layer_range: [28, 30]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.006083182855312869
  - layer_range: [28, 30]
    model: meta-llama/Llama-3.1-8B

- sources:
  - layer_range: [30, 32]
    model: NCSOFT/Llama-VARCO-8B-Instruct
    parameters:
      weight: 1
  - layer_range: [30, 32]
    model: akjindal53244/Llama-3.1-Storm-8B
    parameters:
      weight: 0.234650395825126
  - layer_range: [30, 32]
    model: meta-llama/Llama-3.1-8B
```
</details>
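The recipe above follows [mergekit](https://github.com/arcee-ai/mergekit)'s configuration format (`task_arithmetic` over two-layer slices). Assuming a recent mergekit, something like the sketch below should apply it; treat this as an illustration rather than the project's exact invocation, since the Python API has shifted between versions (the CLI equivalent is `mergekit-yaml recipe.yaml ./merged`).

```python
# Sketch: applying the recipe with mergekit's Python API (file names are hypothetical).
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("recipe.yaml", encoding="utf-8") as f:
    config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    config,
    out_path="./merged",  # output directory for the merged checkpoint
    options=MergeOptions(cuda=True, copy_tokenizer=True),
)
```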

The models used for merging are listed below.
```
Base Model: meta-llama/Llama-3.1-8B
Model 1: NCSOFT/Llama-VARCO-8B-Instruct
Model 2: akjindal53244/Llama-3.1-Storm-8B
```
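Since the card declares `library_name: transformers`, the merged checkpoint loads through the standard text-generation API. A minimal usage sketch (the prompt is just an example):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fiveflow/LLMEvoLLaMA-3.1-8B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the recipe merges in bfloat16
    device_map="auto",           # requires the `accelerate` package
)

prompt = "한국의 수도는 어디인가요?"  # "What is the capital of Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```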
### Comparing LLMEvoLLaMA with the Source Models on Korean Benchmarks

![korean_performance](./assets/output.png)

- [LogicKor](https://lk.instruct.kr/): a benchmark that evaluates a broad range of abilities in Korean, including math, writing, coding, comprehension, grammar, and reasoning.

- [KoBEST](https://arxiv.org/abs/2204.04541): a benchmark of five natural language understanding tasks designed to test advanced Korean language comprehension.

### Comparing LLMEvoLLaMA with the Source Models on English Benchmarks and the Overall Average
| Model | truthfulqa_mc2 (0-shot acc) | arc_challenge (0-shot acc) | Korean + English performance (avg) |
|-------|-----------------------------|----------------------------|------------------------------------|
| [VARCO](https://huggingface.co/NCSOFT/Llama-VARCO-8B-Instruct) | 0.53 | 0.47 | 0.68 |
| [Llama-Instruct](https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct) | 0.53 | 0.52 | 0.66 |
| [Llama-Storm](https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B) | 0.59 | 0.52 | 0.67 |
| [LLMEvoLLaMA](https://huggingface.co/fiveflow/LLMEvoLLaMA-3.1-8B-v0.1) | 0.57 | 0.50 | **0.71** |
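
`truthfulqa_mc2` and `arc_challenge` are task names from EleutherAI's [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness), so the 0-shot English scores should be reproducible along the lines below; the settings are my assumption, not the project's published command.

```python
# Sketch: scoring the merge with lm-evaluation-harness (assumes lm-eval >= 0.4).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=fiveflow/LLMEvoLLaMA-3.1-8B-v0.1,dtype=bfloat16",
    tasks=["truthfulqa_mc2", "arc_challenge"],
    num_fewshot=0,
)
print(results["results"])  # per-task accuracy metrics
```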