strongpear committed on
Commit
f343e8f
1 Parent(s): 6219957

strongpear/llama3.1-8B_finetune_genQA_wiki_r64

Files changed (2)
  1. README.md +411 -0
  2. adapter_model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,411 @@
+ ---
+ base_model: meta-llama/Llama-3.1-8B
+ library_name: peft
+ license: llama3.1
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: llama3.1-8B_finetune_genQA_wiki_r64_v2
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # llama3.1-8B_finetune_genQA_wiki_r64_v2
+
+ This model is a fine-tuned version of [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.3744
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 3.6e-05
+ - train_batch_size: 4
+ - eval_batch_size: 4
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 1
+ - mixed_precision_training: Native AMP
+
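The `lr_scheduler_type: linear` entry above means the learning rate decays linearly from `learning_rate` down to zero over training. A minimal sketch of that schedule, assuming no warmup (the warmup setting is not listed) and a total step count of roughly 70,500, estimated from the training-results log (70,400 logged steps at epoch 0.9986):

```python
def linear_lr(step: int, base_lr: float = 3.6e-5,
              total_steps: int = 70_500, warmup_steps: int = 0) -> float:
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0.

    total_steps is an estimate from the training log (~70,400 logged steps
    over ~1 epoch); warmup_steps=0 is an assumption, not a logged value.
    """
    if step < warmup_steps:
        # Linear warmup from 0 to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr at the end of warmup to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

This mirrors the shape of the Hugging Face "linear" scheduler, not its exact implementation.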
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:-----:|:---------------:|
+ | 0.5552 | 0.0028 | 200 | 0.4789 |
+ | 0.3971 | 0.0057 | 400 | 0.4757 |
+ | 0.576 | 0.0085 | 600 | 0.4745 |
+ | 0.5104 | 0.0113 | 800 | 0.4738 |
+ | 0.6768 | 0.0142 | 1000 | 0.4725 |
+ | 0.4755 | 0.0170 | 1200 | 0.4715 |
+ | 0.304 | 0.0199 | 1400 | 0.4706 |
+ | 0.5975 | 0.0227 | 1600 | 0.4703 |
+ | 0.4878 | 0.0255 | 1800 | 0.4692 |
+ | 0.3835 | 0.0284 | 2000 | 0.4684 |
+ | 0.5211 | 0.0312 | 2200 | 0.4683 |
+ | 0.4565 | 0.0340 | 2400 | 0.4673 |
+ | 0.4741 | 0.0369 | 2600 | 0.4671 |
+ | 0.4993 | 0.0397 | 2800 | 0.4664 |
+ | 0.2234 | 0.0426 | 3000 | 0.4664 |
+ | 0.5535 | 0.0454 | 3200 | 0.4654 |
+ | 0.4021 | 0.0482 | 3400 | 0.4649 |
+ | 0.5119 | 0.0511 | 3600 | 0.4640 |
+ | 0.4791 | 0.0539 | 3800 | 0.4634 |
+ | 0.3927 | 0.0567 | 4000 | 0.4619 |
+ | 0.5627 | 0.0596 | 4200 | 0.4614 |
+ | 0.3563 | 0.0624 | 4400 | 0.4608 |
+ | 0.4719 | 0.0652 | 4600 | 0.4604 |
+ | 0.562 | 0.0681 | 4800 | 0.4593 |
+ | 0.5258 | 0.0709 | 5000 | 0.4591 |
+ | 0.4483 | 0.0738 | 5200 | 0.4583 |
+ | 0.4827 | 0.0766 | 5400 | 0.4572 |
+ | 0.3712 | 0.0794 | 5600 | 0.4577 |
+ | 0.3198 | 0.0823 | 5800 | 0.4565 |
+ | 0.4168 | 0.0851 | 6000 | 0.4567 |
+ | 0.4166 | 0.0879 | 6200 | 0.4566 |
+ | 0.3014 | 0.0908 | 6400 | 0.4557 |
+ | 0.397 | 0.0936 | 6600 | 0.4550 |
+ | 0.4803 | 0.0965 | 6800 | 0.4540 |
+ | 0.4119 | 0.0993 | 7000 | 0.4540 |
+ | 0.5685 | 0.1021 | 7200 | 0.4532 |
+ | 0.4883 | 0.1050 | 7400 | 0.4532 |
+ | 0.5795 | 0.1078 | 7600 | 0.4531 |
+ | 0.5066 | 0.1106 | 7800 | 0.4524 |
+ | 0.4414 | 0.1135 | 8000 | 0.4522 |
+ | 0.3382 | 0.1163 | 8200 | 0.4520 |
+ | 0.5088 | 0.1191 | 8400 | 0.4521 |
+ | 0.4083 | 0.1220 | 8600 | 0.4523 |
+ | 0.437 | 0.1248 | 8800 | 0.4518 |
+ | 0.3769 | 0.1277 | 9000 | 0.4513 |
+ | 0.3828 | 0.1305 | 9200 | 0.4513 |
+ | 0.5162 | 0.1333 | 9400 | 0.4506 |
+ | 0.3603 | 0.1362 | 9600 | 0.4501 |
+ | 0.6039 | 0.1390 | 9800 | 0.4500 |
+ | 0.3339 | 0.1418 | 10000 | 0.4493 |
+ | 0.3821 | 0.1447 | 10200 | 0.4491 |
+ | 0.478 | 0.1475 | 10400 | 0.4488 |
+ | 0.3994 | 0.1504 | 10600 | 0.4481 |
+ | 0.4471 | 0.1532 | 10800 | 0.4474 |
+ | 0.4102 | 0.1560 | 11000 | 0.4471 |
+ | 0.3164 | 0.1589 | 11200 | 0.4464 |
+ | 0.3487 | 0.1617 | 11400 | 0.4458 |
+ | 0.4359 | 0.1645 | 11600 | 0.4455 |
+ | 0.5473 | 0.1674 | 11800 | 0.4450 |
+ | 0.5629 | 0.1702 | 12000 | 0.4452 |
+ | 0.5864 | 0.1730 | 12200 | 0.4446 |
+ | 0.4339 | 0.1759 | 12400 | 0.4442 |
+ | 0.3727 | 0.1787 | 12600 | 0.4447 |
+ | 0.4229 | 0.1816 | 12800 | 0.4439 |
+ | 0.4888 | 0.1844 | 13000 | 0.4437 |
+ | 0.4252 | 0.1872 | 13200 | 0.4439 |
+ | 0.5414 | 0.1901 | 13400 | 0.4432 |
+ | 0.4577 | 0.1929 | 13600 | 0.4429 |
+ | 0.4698 | 0.1957 | 13800 | 0.4425 |
+ | 0.488 | 0.1986 | 14000 | 0.4423 |
+ | 0.3836 | 0.2014 | 14200 | 0.4419 |
+ | 0.4724 | 0.2043 | 14400 | 0.4415 |
+ | 0.4451 | 0.2071 | 14600 | 0.4412 |
+ | 0.4409 | 0.2099 | 14800 | 0.4408 |
+ | 0.4377 | 0.2128 | 15000 | 0.4410 |
+ | 0.4688 | 0.2156 | 15200 | 0.4408 |
+ | 0.4975 | 0.2184 | 15400 | 0.4406 |
+ | 0.5752 | 0.2213 | 15600 | 0.4405 |
+ | 0.4824 | 0.2241 | 15800 | 0.4399 |
+ | 0.379 | 0.2270 | 16000 | 0.4396 |
+ | 0.5237 | 0.2298 | 16200 | 0.4396 |
+ | 0.491 | 0.2326 | 16400 | 0.4389 |
+ | 0.4132 | 0.2355 | 16600 | 0.4388 |
+ | 0.4417 | 0.2383 | 16800 | 0.4385 |
+ | 0.4859 | 0.2411 | 17000 | 0.4380 |
+ | 0.4579 | 0.2440 | 17200 | 0.4377 |
+ | 0.5186 | 0.2468 | 17400 | 0.4366 |
+ | 0.4055 | 0.2496 | 17600 | 0.4362 |
+ | 0.5011 | 0.2525 | 17800 | 0.4359 |
+ | 0.3842 | 0.2553 | 18000 | 0.4350 |
+ | 0.5457 | 0.2582 | 18200 | 0.4352 |
+ | 0.5447 | 0.2610 | 18400 | 0.4350 |
+ | 0.4722 | 0.2638 | 18600 | 0.4349 |
+ | 0.4229 | 0.2667 | 18800 | 0.4340 |
+ | 0.5047 | 0.2695 | 19000 | 0.4338 |
+ | 0.2641 | 0.2723 | 19200 | 0.4340 |
+ | 0.3937 | 0.2752 | 19400 | 0.4333 |
+ | 0.4082 | 0.2780 | 19600 | 0.4332 |
+ | 0.3958 | 0.2809 | 19800 | 0.4317 |
+ | 0.4929 | 0.2837 | 20000 | 0.4315 |
+ | 0.4739 | 0.2865 | 20200 | 0.4313 |
+ | 0.5014 | 0.2894 | 20400 | 0.4316 |
+ | 0.3236 | 0.2922 | 20600 | 0.4308 |
+ | 0.3615 | 0.2950 | 20800 | 0.4299 |
+ | 0.4235 | 0.2979 | 21000 | 0.4297 |
+ | 0.2811 | 0.3007 | 21200 | 0.4295 |
+ | 0.2743 | 0.3035 | 21400 | 0.4287 |
+ | 0.4718 | 0.3064 | 21600 | 0.4290 |
+ | 0.5184 | 0.3092 | 21800 | 0.4289 |
+ | 0.3954 | 0.3121 | 22000 | 0.4289 |
+ | 0.4478 | 0.3149 | 22200 | 0.4288 |
+ | 0.4001 | 0.3177 | 22400 | 0.4288 |
+ | 0.4768 | 0.3206 | 22600 | 0.4288 |
+ | 0.3908 | 0.3234 | 22800 | 0.4284 |
+ | 0.4068 | 0.3262 | 23000 | 0.4280 |
+ | 0.3719 | 0.3291 | 23200 | 0.4270 |
+ | 0.3995 | 0.3319 | 23400 | 0.4266 |
+ | 0.4877 | 0.3348 | 23600 | 0.4262 |
+ | 0.5297 | 0.3376 | 23800 | 0.4262 |
+ | 0.4355 | 0.3404 | 24000 | 0.4259 |
+ | 0.4551 | 0.3433 | 24200 | 0.4255 |
+ | 0.4737 | 0.3461 | 24400 | 0.4253 |
+ | 0.3971 | 0.3489 | 24600 | 0.4252 |
+ | 0.3989 | 0.3518 | 24800 | 0.4247 |
+ | 0.5214 | 0.3546 | 25000 | 0.4246 |
+ | 0.4378 | 0.3574 | 25200 | 0.4244 |
+ | 0.5113 | 0.3603 | 25400 | 0.4246 |
+ | 0.4544 | 0.3631 | 25600 | 0.4246 |
+ | 0.4151 | 0.3660 | 25800 | 0.4244 |
+ | 0.3019 | 0.3688 | 26000 | 0.4243 |
+ | 0.4397 | 0.3716 | 26200 | 0.4242 |
+ | 0.5191 | 0.3745 | 26400 | 0.4238 |
+ | 0.3183 | 0.3773 | 26600 | 0.4237 |
+ | 0.203 | 0.3801 | 26800 | 0.4236 |
+ | 0.522 | 0.3830 | 27000 | 0.4231 |
+ | 0.4159 | 0.3858 | 27200 | 0.4232 |
+ | 0.348 | 0.3887 | 27400 | 0.4231 |
+ | 0.5155 | 0.3915 | 27600 | 0.4229 |
+ | 0.5407 | 0.3943 | 27800 | 0.4226 |
+ | 0.4152 | 0.3972 | 28000 | 0.4219 |
+ | 0.4148 | 0.4 | 28200 | 0.4213 |
+ | 0.5514 | 0.4028 | 28400 | 0.4206 |
+ | 0.4132 | 0.4057 | 28600 | 0.4201 |
+ | 0.4249 | 0.4085 | 28800 | 0.4199 |
+ | 0.4003 | 0.4113 | 29000 | 0.4199 |
+ | 0.3864 | 0.4142 | 29200 | 0.4194 |
+ | 0.3573 | 0.4170 | 29400 | 0.4192 |
+ | 0.4098 | 0.4199 | 29600 | 0.4185 |
+ | 0.3843 | 0.4227 | 29800 | 0.4187 |
+ | 0.5306 | 0.4255 | 30000 | 0.4187 |
+ | 0.2888 | 0.4284 | 30200 | 0.4183 |
+ | 0.2458 | 0.4312 | 30400 | 0.4176 |
+ | 0.3408 | 0.4340 | 30600 | 0.4169 |
+ | 0.4499 | 0.4369 | 30800 | 0.4162 |
+ | 0.4447 | 0.4397 | 31000 | 0.4157 |
+ | 0.523 | 0.4426 | 31200 | 0.4160 |
+ | 0.47 | 0.4454 | 31400 | 0.4160 |
+ | 0.3046 | 0.4482 | 31600 | 0.4155 |
+ | 0.4464 | 0.4511 | 31800 | 0.4156 |
+ | 0.4347 | 0.4539 | 32000 | 0.4152 |
+ | 0.5663 | 0.4567 | 32200 | 0.4152 |
+ | 0.4236 | 0.4596 | 32400 | 0.4150 |
+ | 0.4682 | 0.4624 | 32600 | 0.4143 |
+ | 0.2078 | 0.4652 | 32800 | 0.4138 |
+ | 0.4289 | 0.4681 | 33000 | 0.4135 |
+ | 0.4123 | 0.4709 | 33200 | 0.4134 |
+ | 0.2913 | 0.4738 | 33400 | 0.4132 |
+ | 0.3788 | 0.4766 | 33600 | 0.4131 |
+ | 0.3111 | 0.4794 | 33800 | 0.4132 |
+ | 0.5047 | 0.4823 | 34000 | 0.4124 |
+ | 0.3096 | 0.4851 | 34200 | 0.4124 |
+ | 0.5126 | 0.4879 | 34400 | 0.4121 |
+ | 0.5116 | 0.4908 | 34600 | 0.4117 |
+ | 0.3009 | 0.4936 | 34800 | 0.4115 |
+ | 0.5036 | 0.4965 | 35000 | 0.4108 |
+ | 0.4221 | 0.4993 | 35200 | 0.4109 |
+ | 0.5021 | 0.5021 | 35400 | 0.4109 |
+ | 0.2946 | 0.5050 | 35600 | 0.4108 |
+ | 0.4487 | 0.5078 | 35800 | 0.4104 |
+ | 0.3863 | 0.5106 | 36000 | 0.4103 |
+ | 0.3043 | 0.5135 | 36200 | 0.4097 |
+ | 0.5039 | 0.5163 | 36400 | 0.4092 |
+ | 0.537 | 0.5191 | 36600 | 0.4087 |
+ | 0.3525 | 0.5220 | 36800 | 0.4082 |
+ | 0.3099 | 0.5248 | 37000 | 0.4081 |
+ | 0.4568 | 0.5277 | 37200 | 0.4080 |
+ | 0.1907 | 0.5305 | 37400 | 0.4079 |
+ | 0.5096 | 0.5333 | 37600 | 0.4074 |
+ | 0.4411 | 0.5362 | 37800 | 0.4068 |
+ | 0.4075 | 0.5390 | 38000 | 0.4070 |
+ | 0.5252 | 0.5418 | 38200 | 0.4067 |
+ | 0.2606 | 0.5447 | 38400 | 0.4060 |
+ | 0.4556 | 0.5475 | 38600 | 0.4060 |
+ | 0.3456 | 0.5504 | 38800 | 0.4056 |
+ | 0.2922 | 0.5532 | 39000 | 0.4050 |
+ | 0.4582 | 0.5560 | 39200 | 0.4043 |
+ | 0.3284 | 0.5589 | 39400 | 0.4039 |
+ | 0.511 | 0.5617 | 39600 | 0.4035 |
+ | 0.4445 | 0.5645 | 39800 | 0.4035 |
+ | 0.2857 | 0.5674 | 40000 | 0.4035 |
+ | 0.4778 | 0.5702 | 40200 | 0.4030 |
+ | 0.4949 | 0.5730 | 40400 | 0.4025 |
+ | 0.4691 | 0.5759 | 40600 | 0.4021 |
+ | 0.6142 | 0.5787 | 40800 | 0.4022 |
+ | 0.2999 | 0.5816 | 41000 | 0.4011 |
+ | 0.4709 | 0.5844 | 41200 | 0.4010 |
+ | 0.3927 | 0.5872 | 41400 | 0.4004 |
+ | 0.4305 | 0.5901 | 41600 | 0.4001 |
+ | 0.4798 | 0.5929 | 41800 | 0.3988 |
+ | 0.5321 | 0.5957 | 42000 | 0.3988 |
+ | 0.5152 | 0.5986 | 42200 | 0.3987 |
+ | 0.3418 | 0.6014 | 42400 | 0.3984 |
+ | 0.4056 | 0.6043 | 42600 | 0.3984 |
+ | 0.2357 | 0.6071 | 42800 | 0.3983 |
+ | 0.3883 | 0.6099 | 43000 | 0.3980 |
+ | 0.4356 | 0.6128 | 43200 | 0.3978 |
+ | 0.3481 | 0.6156 | 43400 | 0.3977 |
+ | 0.3969 | 0.6184 | 43600 | 0.3971 |
+ | 0.459 | 0.6213 | 43800 | 0.3969 |
+ | 0.3185 | 0.6241 | 44000 | 0.3969 |
+ | 0.4342 | 0.6270 | 44200 | 0.3963 |
+ | 0.448 | 0.6298 | 44400 | 0.3961 |
+ | 0.4198 | 0.6326 | 44600 | 0.3958 |
+ | 0.3038 | 0.6355 | 44800 | 0.3957 |
+ | 0.3628 | 0.6383 | 45000 | 0.3953 |
+ | 0.4918 | 0.6411 | 45200 | 0.3954 |
+ | 0.5577 | 0.6440 | 45400 | 0.3953 |
+ | 0.2738 | 0.6468 | 45600 | 0.3952 |
+ | 0.481 | 0.6496 | 45800 | 0.3948 |
+ | 0.4624 | 0.6525 | 46000 | 0.3945 |
+ | 0.4184 | 0.6553 | 46200 | 0.3943 |
+ | 0.5 | 0.6582 | 46400 | 0.3942 |
+ | 0.3874 | 0.6610 | 46600 | 0.3941 |
+ | 0.4107 | 0.6638 | 46800 | 0.3939 |
+ | 0.4462 | 0.6667 | 47000 | 0.3938 |
+ | 0.4787 | 0.6695 | 47200 | 0.3934 |
+ | 0.2831 | 0.6723 | 47400 | 0.3932 |
+ | 0.3612 | 0.6752 | 47600 | 0.3933 |
+ | 0.2958 | 0.6780 | 47800 | 0.3928 |
+ | 0.2881 | 0.6809 | 48000 | 0.3926 |
+ | 0.3764 | 0.6837 | 48200 | 0.3927 |
+ | 0.3935 | 0.6865 | 48400 | 0.3926 |
+ | 0.448 | 0.6894 | 48600 | 0.3923 |
+ | 0.4448 | 0.6922 | 48800 | 0.3918 |
+ | 0.3697 | 0.6950 | 49000 | 0.3918 |
+ | 0.4227 | 0.6979 | 49200 | 0.3917 |
+ | 0.4456 | 0.7007 | 49400 | 0.3914 |
+ | 0.4413 | 0.7035 | 49600 | 0.3913 |
+ | 0.466 | 0.7064 | 49800 | 0.3912 |
+ | 0.2918 | 0.7092 | 50000 | 0.3910 |
+ | 0.3252 | 0.7121 | 50200 | 0.3908 |
+ | 0.198 | 0.7149 | 50400 | 0.3905 |
+ | 0.2525 | 0.7177 | 50600 | 0.3903 |
+ | 0.483 | 0.7206 | 50800 | 0.3901 |
+ | 0.458 | 0.7234 | 51000 | 0.3898 |
+ | 0.4184 | 0.7262 | 51200 | 0.3898 |
+ | 0.385 | 0.7291 | 51400 | 0.3894 |
+ | 0.3237 | 0.7319 | 51600 | 0.3893 |
+ | 0.3709 | 0.7348 | 51800 | 0.3889 |
+ | 0.2904 | 0.7376 | 52000 | 0.3890 |
+ | 0.362 | 0.7404 | 52200 | 0.3888 |
+ | 0.4254 | 0.7433 | 52400 | 0.3885 |
+ | 0.4099 | 0.7461 | 52600 | 0.3884 |
+ | 0.2335 | 0.7489 | 52800 | 0.3881 |
+ | 0.4477 | 0.7518 | 53000 | 0.3880 |
+ | 0.3345 | 0.7546 | 53200 | 0.3878 |
+ | 0.3674 | 0.7574 | 53400 | 0.3873 |
+ | 0.435 | 0.7603 | 53600 | 0.3874 |
+ | 0.4013 | 0.7631 | 53800 | 0.3873 |
+ | 0.3666 | 0.7660 | 54000 | 0.3868 |
+ | 0.3511 | 0.7688 | 54200 | 0.3864 |
+ | 0.4599 | 0.7716 | 54400 | 0.3862 |
+ | 0.4172 | 0.7745 | 54600 | 0.3859 |
+ | 0.3722 | 0.7773 | 54800 | 0.3857 |
+ | 0.459 | 0.7801 | 55000 | 0.3853 |
+ | 0.3358 | 0.7830 | 55200 | 0.3848 |
+ | 0.3544 | 0.7858 | 55400 | 0.3847 |
+ | 0.3254 | 0.7887 | 55600 | 0.3846 |
+ | 0.443 | 0.7915 | 55800 | 0.3845 |
+ | 0.1945 | 0.7943 | 56000 | 0.3844 |
+ | 0.4167 | 0.7972 | 56200 | 0.3842 |
+ | 0.4281 | 0.8 | 56400 | 0.3838 |
+ | 0.4496 | 0.8028 | 56600 | 0.3836 |
+ | 0.4004 | 0.8057 | 56800 | 0.3834 |
+ | 0.4789 | 0.8085 | 57000 | 0.3832 |
+ | 0.37 | 0.8113 | 57200 | 0.3831 |
+ | 0.3972 | 0.8142 | 57400 | 0.3830 |
+ | 0.4429 | 0.8170 | 57600 | 0.3827 |
+ | 0.5432 | 0.8199 | 57800 | 0.3825 |
+ | 0.4448 | 0.8227 | 58000 | 0.3825 |
+ | 0.4668 | 0.8255 | 58200 | 0.3820 |
+ | 0.3144 | 0.8284 | 58400 | 0.3815 |
+ | 0.3173 | 0.8312 | 58600 | 0.3812 |
+ | 0.4206 | 0.8340 | 58800 | 0.3811 |
+ | 0.3072 | 0.8369 | 59000 | 0.3810 |
+ | 0.3854 | 0.8397 | 59200 | 0.3806 |
+ | 0.3892 | 0.8426 | 59400 | 0.3805 |
+ | 0.4497 | 0.8454 | 59600 | 0.3805 |
+ | 0.4165 | 0.8482 | 59800 | 0.3805 |
+ | 0.3679 | 0.8511 | 60000 | 0.3802 |
+ | 0.4221 | 0.8539 | 60200 | 0.3796 |
+ | 0.4912 | 0.8567 | 60400 | 0.3796 |
+ | 0.4141 | 0.8596 | 60600 | 0.3794 |
+ | 0.3999 | 0.8624 | 60800 | 0.3792 |
+ | 0.3014 | 0.8652 | 61000 | 0.3788 |
+ | 0.5035 | 0.8681 | 61200 | 0.3787 |
+ | 0.4158 | 0.8709 | 61400 | 0.3784 |
+ | 0.3597 | 0.8738 | 61600 | 0.3783 |
+ | 0.4084 | 0.8766 | 61800 | 0.3782 |
+ | 0.3475 | 0.8794 | 62000 | 0.3782 |
+ | 0.3177 | 0.8823 | 62200 | 0.3781 |
+ | 0.3705 | 0.8851 | 62400 | 0.3778 |
+ | 0.4659 | 0.8879 | 62600 | 0.3776 |
+ | 0.4417 | 0.8908 | 62800 | 0.3774 |
+ | 0.435 | 0.8936 | 63000 | 0.3772 |
+ | 0.4289 | 0.8965 | 63200 | 0.3772 |
+ | 0.3551 | 0.8993 | 63400 | 0.3770 |
+ | 0.4331 | 0.9021 | 63600 | 0.3770 |
+ | 0.3531 | 0.9050 | 63800 | 0.3768 |
+ | 0.4663 | 0.9078 | 64000 | 0.3767 |
+ | 0.4011 | 0.9106 | 64200 | 0.3765 |
+ | 0.344 | 0.9135 | 64400 | 0.3763 |
+ | 0.3488 | 0.9163 | 64600 | 0.3762 |
+ | 0.2881 | 0.9191 | 64800 | 0.3761 |
+ | 0.4809 | 0.9220 | 65000 | 0.3760 |
+ | 0.4229 | 0.9248 | 65200 | 0.3759 |
+ | 0.4683 | 0.9277 | 65400 | 0.3757 |
+ | 0.483 | 0.9305 | 65600 | 0.3756 |
+ | 0.4342 | 0.9333 | 65800 | 0.3755 |
+ | 0.2609 | 0.9362 | 66000 | 0.3754 |
+ | 0.4405 | 0.9390 | 66200 | 0.3754 |
+ | 0.4036 | 0.9418 | 66400 | 0.3754 |
+ | 0.3688 | 0.9447 | 66600 | 0.3753 |
+ | 0.3391 | 0.9475 | 66800 | 0.3752 |
+ | 0.466 | 0.9504 | 67000 | 0.3751 |
+ | 0.4023 | 0.9532 | 67200 | 0.3751 |
+ | 0.4671 | 0.9560 | 67400 | 0.3750 |
+ | 0.2545 | 0.9589 | 67600 | 0.3750 |
+ | 0.2524 | 0.9617 | 67800 | 0.3749 |
+ | 0.3833 | 0.9645 | 68000 | 0.3749 |
+ | 0.4234 | 0.9674 | 68200 | 0.3749 |
+ | 0.4267 | 0.9702 | 68400 | 0.3748 |
+ | 0.3799 | 0.9730 | 68600 | 0.3747 |
+ | 0.2952 | 0.9759 | 68800 | 0.3747 |
+ | 0.2221 | 0.9787 | 69000 | 0.3746 |
+ | 0.4635 | 0.9816 | 69200 | 0.3746 |
+ | 0.2814 | 0.9844 | 69400 | 0.3745 |
+ | 0.3765 | 0.9872 | 69600 | 0.3745 |
+ | 0.4394 | 0.9901 | 69800 | 0.3745 |
+ | 0.4303 | 0.9929 | 70000 | 0.3744 |
+ | 0.2866 | 0.9957 | 70200 | 0.3744 |
+ | 0.4629 | 0.9986 | 70400 | 0.3744 |
+
+
+ ### Framework versions
+
+ - PEFT 0.12.0
+ - Transformers 4.45.2
+ - Pytorch 2.4.0+cu121
+ - Datasets 3.0.0
+ - Tokenizers 0.20.1
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:67d52a571ea3959ed26af0a1f96f15699e2e7dd3f39eb611407e28858e2a23f3
+ oid sha256:c029feb53b9496cbfd49e4345b67fba07bcd27db36bcc288060809c2f600a091
  size 671149168
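This commit swaps only the `oid` line of the git-lfs pointer for `adapter_model.safetensors`; the `size` stays 671149168 bytes, so the adapter weights changed content but not length. A minimal sketch of reading such a pointer file (`parse_lfs_pointer` is a hypothetical helper, not part of git-lfs tooling):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer file into its key/value fields.

    Pointer files are small text stubs of the form:
        version https://git-lfs.github.com/spec/v1
        oid sha256:<hex digest of the real file>
        size <length of the real file in bytes>
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


# The post-commit pointer contents, as shown in the diff above.
new_pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:c029feb53b9496cbfd49e4345b67fba07bcd27db36bcc288060809c2f600a091
size 671149168
"""

info = parse_lfs_pointer(new_pointer)
```

Comparing `oid` values between two pointer revisions is enough to tell whether the large file's bytes changed, without downloading either version.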