agentlans committed
Commit b0c747c
1 parent: fc828cb

Upload 8 files

README.md CHANGED
@@ -1,26 +1,22 @@
 ---
 library_name: transformers
 base_model: agentlans/multilingual-e5-small-aligned
-language:
-- multilingual
 tags:
 - generated_from_trainer
 model-index:
-- name: multilingual-e5-small-aligned-transformed-quality
+- name: multilingual-e5-small-aligned-quality-20241214-new
   results: []
-datasets:
-- agentlans/en-translations
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# multilingual-e5-small-aligned-transformed-quality
+# multilingual-e5-small-aligned-quality-20241214-new
 
 This model is a fine-tuned version of [agentlans/multilingual-e5-small-aligned](https://huggingface.co/agentlans/multilingual-e5-small-aligned) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.2604
-- Mse: 0.2604
+- Loss: 0.1958
+- Mse: 0.1958
 
 ## Model description
 
@@ -40,7 +36,7 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
-- train_batch_size: 32
+- train_batch_size: 128
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
@@ -51,9 +47,9 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Mse |
 |:-------------:|:-----:|:-----:|:---------------:|:------:|
-| 0.2949 | 1.0 | 27096 | 0.2873 | 0.2873 |
-| 0.2239 | 2.0 | 54192 | 0.2671 | 0.2671 |
-| 0.1789 | 3.0 | 81288 | 0.2604 | 0.2604 |
+| 0.2436 | 1.0 | 7813 | 0.2296 | 0.2296 |
+| 0.1927 | 2.0 | 15626 | 0.2079 | 0.2079 |
+| 0.1615 | 3.0 | 23439 | 0.1958 | 0.1958 |
 
 
 ### Framework versions
@@ -61,4 +57,4 @@ The following hyperparameters were used during training:
 - Transformers 4.46.3
 - Pytorch 2.5.1+cu124
 - Datasets 3.1.0
-- Tokenizers 0.20.3
+- Tokenizers 0.20.3
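
The regenerated card still carries no usage example. A minimal sketch of scoring text with this checkpoint, assuming it is a single-output regression head (the MSE loss and metric above point that way); the repo id below is an assumption and should be replaced with the actual one:

```python
# Minimal sketch, not from the model card: treat the checkpoint as a
# single-output regression scorer. Repo id and head type are assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "agentlans/multilingual-e5-small-aligned-quality-20241214-new"  # assumed

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

texts = ["A clear, well-formed sentence.", "lol idk bad txt"]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**batch).logits.squeeze(-1)  # one quality score per input
print(scores.tolist())
```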
all_results.json CHANGED
@@ -1,15 +1,15 @@
 {
     "epoch": 3.0,
-    "eval_loss": 0.26044028997421265,
-    "eval_mse": 0.2604402849126996,
-    "eval_runtime": 53.5863,
-    "eval_samples": 96338,
-    "eval_samples_per_second": 1797.81,
-    "eval_steps_per_second": 224.74,
-    "total_flos": 4.283504864539085e+16,
-    "train_loss": 0.24568356713046358,
-    "train_runtime": 4418.2249,
-    "train_samples": 867042,
-    "train_samples_per_second": 588.726,
-    "train_steps_per_second": 18.398
+    "eval_loss": 0.19583497941493988,
+    "eval_mse": 0.19583499065839644,
+    "eval_runtime": 99.5388,
+    "eval_samples": 182111,
+    "eval_samples_per_second": 1829.547,
+    "eval_steps_per_second": 228.695,
+    "total_flos": 4.9403660544e+16,
+    "train_loss": 0.21201796470442558,
+    "train_runtime": 3304.0326,
+    "train_samples": 1000000,
+    "train_samples_per_second": 907.981,
+    "train_steps_per_second": 7.094
 }
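
As the keys show, all_results.json is simply the union of eval_results.json and train_results.json below. A small sketch that rebuilds it from those two files (file names as in this commit):

```python
# Sketch: all_results.json = eval_results.json merged with train_results.json.
import json

with open("eval_results.json") as f:
    merged = json.load(f)
with open("train_results.json") as f:
    merged.update(json.load(f))

with open("all_results_rebuilt.json", "w") as f:
    json.dump(merged, f, indent=4, sort_keys=True)
```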
eval_results.json CHANGED
@@ -1,9 +1,9 @@
 {
     "epoch": 3.0,
-    "eval_loss": 0.26044028997421265,
-    "eval_mse": 0.2604402849126996,
-    "eval_runtime": 53.5863,
-    "eval_samples": 96338,
-    "eval_samples_per_second": 1797.81,
-    "eval_steps_per_second": 224.74
+    "eval_loss": 0.19583497941493988,
+    "eval_mse": 0.19583499065839644,
+    "eval_runtime": 99.5388,
+    "eval_samples": 182111,
+    "eval_samples_per_second": 1829.547,
+    "eval_steps_per_second": 228.695
 }
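
eval_loss and eval_mse differ only in the trailing decimals because the training objective is itself mean squared error, so the loss and the metric measure the same quantity. A toy illustration, independent of this repo:

```python
# Toy illustration only: with an MSE objective, the reported eval loss and the
# MSE metric are the same quantity, hence the near-identical numbers above.
import torch

preds = torch.tensor([0.80, 0.20, 0.50])
labels = torch.tensor([1.00, 0.00, 0.40])
print(torch.nn.functional.mse_loss(preds, labels).item())
```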
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:c4f0eb3cb08c59ac8a80384938cb076a85795887de939c5f213c14fdd935883f
+oid sha256:7fb2b10048a4dbfcb9f9a0c7afece210a37c49fe8c498d4fb7774a0b9162d16e
 size 470640124
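
Only the Git LFS pointer changes here: the payload is still 470,640,124 bytes but carries a new SHA-256. Since the pointer's oid is the SHA-256 of the file contents, a downloaded weights file can be checked against it:

```python
# Sketch: verify a downloaded model.safetensors against the LFS pointer above.
import hashlib

EXPECTED = "7fb2b10048a4dbfcb9f9a0c7afece210a37c49fe8c498d4fb7774a0b9162d16e"

sha = hashlib.sha256()
with open("model.safetensors", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        sha.update(chunk)

assert sha.hexdigest() == EXPECTED, "file does not match the LFS pointer"
print("ok", sha.hexdigest())
```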
train_results.json CHANGED
@@ -1,9 +1,9 @@
 {
     "epoch": 3.0,
-    "total_flos": 4.283504864539085e+16,
-    "train_loss": 0.24568356713046358,
-    "train_runtime": 4418.2249,
-    "train_samples": 867042,
-    "train_samples_per_second": 588.726,
-    "train_steps_per_second": 18.398
+    "total_flos": 4.9403660544e+16,
+    "train_loss": 0.21201796470442558,
+    "train_runtime": 3304.0326,
+    "train_samples": 1000000,
+    "train_samples_per_second": 907.981,
+    "train_steps_per_second": 7.094
 }
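
The new step counts are consistent with the README diff: 1,000,000 training samples at batch size 128 (assuming a single device and no gradient accumulation, which the diff does not show) give 7813 optimizer steps per epoch and 23439 over 3 epochs, matching global_step and max_steps in trainer_state.json below; the old run's 867,042 samples at batch size 32 give 27096 and 81288 the same way. A quick check:

```python
# Consistency check of the logged step counts against samples and batch size.
import math

def total_steps(train_samples, batch_size, epochs=3):
    per_epoch = math.ceil(train_samples / batch_size)
    return per_epoch, per_epoch * epochs

print(total_steps(1_000_000, 128))  # (7813, 23439)  -> new run
print(total_steps(867_042, 32))     # (27096, 81288) -> old run
```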
trainer_state.json CHANGED
@@ -1,1186 +1,374 @@
1
  {
2
- "best_metric": 0.26044028997421265,
3
- "best_model_checkpoint": "multilingual-e5-small-aligned-transformed-quality/checkpoint-81288",
4
  "epoch": 3.0,
5
  "eval_steps": 500,
6
- "global_step": 81288,
7
  "is_hyper_param_search": false,
8
  "is_local_process_zero": true,
9
  "is_world_process_zero": true,
10
  "log_history": [
11
  {
12
- "epoch": 0.018452908178328904,
13
- "grad_norm": 1.5689656734466553,
14
- "learning_rate": 4.969245153036119e-05,
15
- "loss": 0.4394,
16
  "step": 500
17
  },
18
  {
19
- "epoch": 0.03690581635665781,
20
- "grad_norm": 3.1030237674713135,
21
- "learning_rate": 4.938490306072237e-05,
22
- "loss": 0.3911,
23
  "step": 1000
24
  },
25
  {
26
- "epoch": 0.05535872453498671,
27
- "grad_norm": 1.7122437953948975,
28
- "learning_rate": 4.907735459108356e-05,
29
- "loss": 0.3661,
30
  "step": 1500
31
  },
32
  {
33
- "epoch": 0.07381163271331562,
34
- "grad_norm": 4.44265079498291,
35
- "learning_rate": 4.876980612144474e-05,
36
- "loss": 0.3665,
37
  "step": 2000
38
  },
39
  {
40
- "epoch": 0.09226454089164453,
41
- "grad_norm": 3.232088088989258,
42
- "learning_rate": 4.846225765180593e-05,
43
- "loss": 0.3581,
44
  "step": 2500
45
  },
46
  {
47
- "epoch": 0.11071744906997343,
48
- "grad_norm": 4.422216415405273,
49
- "learning_rate": 4.815470918216711e-05,
50
- "loss": 0.3606,
51
  "step": 3000
52
  },
53
  {
54
- "epoch": 0.12917035724830234,
55
- "grad_norm": 2.604503631591797,
56
- "learning_rate": 4.78471607125283e-05,
57
- "loss": 0.3492,
58
  "step": 3500
59
  },
60
  {
61
- "epoch": 0.14762326542663123,
62
- "grad_norm": 1.7900758981704712,
63
- "learning_rate": 4.7539612242889484e-05,
64
- "loss": 0.346,
65
  "step": 4000
66
  },
67
  {
68
- "epoch": 0.16607617360496013,
69
- "grad_norm": 2.164018154144287,
70
- "learning_rate": 4.723206377325067e-05,
71
- "loss": 0.3405,
72
  "step": 4500
73
  },
74
  {
75
- "epoch": 0.18452908178328906,
76
- "grad_norm": 3.678737163543701,
77
- "learning_rate": 4.692451530361185e-05,
78
- "loss": 0.342,
79
  "step": 5000
80
  },
81
  {
82
- "epoch": 0.20298198996161795,
83
- "grad_norm": 3.0846383571624756,
84
- "learning_rate": 4.661696683397304e-05,
85
- "loss": 0.3323,
86
  "step": 5500
87
  },
88
  {
89
- "epoch": 0.22143489813994685,
90
- "grad_norm": 2.493079900741577,
91
- "learning_rate": 4.6309418364334224e-05,
92
- "loss": 0.3403,
93
  "step": 6000
94
  },
95
  {
96
- "epoch": 0.23988780631827575,
97
- "grad_norm": 3.3876466751098633,
98
- "learning_rate": 4.60018698946954e-05,
99
- "loss": 0.3398,
100
  "step": 6500
101
  },
102
  {
103
- "epoch": 0.2583407144966047,
104
- "grad_norm": 2.3498599529266357,
105
- "learning_rate": 4.5694321425056594e-05,
106
- "loss": 0.3285,
107
  "step": 7000
108
  },
109
  {
110
- "epoch": 0.27679362267493357,
111
- "grad_norm": 2.9931955337524414,
112
- "learning_rate": 4.538677295541778e-05,
113
- "loss": 0.3315,
114
  "step": 7500
115
  },
116
  {
117
- "epoch": 0.29524653085326247,
118
- "grad_norm": 2.410051107406616,
119
- "learning_rate": 4.507922448577896e-05,
120
- "loss": 0.3245,
 
 
121
  "step": 8000
122
  },
123
  {
124
- "epoch": 0.31369943903159137,
125
- "grad_norm": 2.8106865882873535,
126
- "learning_rate": 4.477167601614014e-05,
127
- "loss": 0.3357,
128
  "step": 8500
129
  },
130
  {
131
- "epoch": 0.33215234720992026,
132
- "grad_norm": 1.4903161525726318,
133
- "learning_rate": 4.4464127546501335e-05,
134
- "loss": 0.3267,
135
  "step": 9000
136
  },
137
  {
138
- "epoch": 0.3506052553882492,
139
- "grad_norm": 2.8870410919189453,
140
- "learning_rate": 4.415657907686251e-05,
141
- "loss": 0.3276,
142
  "step": 9500
143
  },
144
  {
145
- "epoch": 0.3690581635665781,
146
- "grad_norm": 3.0932767391204834,
147
- "learning_rate": 4.38490306072237e-05,
148
- "loss": 0.3287,
149
  "step": 10000
150
  },
151
  {
152
- "epoch": 0.387511071744907,
153
- "grad_norm": 2.0650179386138916,
154
- "learning_rate": 4.354148213758489e-05,
155
- "loss": 0.3242,
156
  "step": 10500
157
  },
158
  {
159
- "epoch": 0.4059639799232359,
160
- "grad_norm": 2.1519172191619873,
161
- "learning_rate": 4.323393366794607e-05,
162
- "loss": 0.3227,
163
  "step": 11000
164
  },
165
  {
166
- "epoch": 0.4244168881015648,
167
- "grad_norm": 2.222637176513672,
168
- "learning_rate": 4.2926385198307254e-05,
169
- "loss": 0.3231,
170
  "step": 11500
171
  },
172
  {
173
- "epoch": 0.4428697962798937,
174
- "grad_norm": 3.189814805984497,
175
- "learning_rate": 4.261883672866844e-05,
176
- "loss": 0.3179,
177
  "step": 12000
178
  },
179
  {
180
- "epoch": 0.4613227044582226,
181
- "grad_norm": 2.6109983921051025,
182
- "learning_rate": 4.2311288259029624e-05,
183
- "loss": 0.3179,
184
  "step": 12500
185
  },
186
  {
187
- "epoch": 0.4797756126365515,
188
- "grad_norm": 2.1001720428466797,
189
- "learning_rate": 4.200373978939081e-05,
190
- "loss": 0.3156,
191
  "step": 13000
192
  },
193
  {
194
- "epoch": 0.49822852081488045,
195
- "grad_norm": 1.8664593696594238,
196
- "learning_rate": 4.1696191319751994e-05,
197
- "loss": 0.3135,
198
  "step": 13500
199
  },
200
  {
201
- "epoch": 0.5166814289932093,
202
- "grad_norm": 2.5857434272766113,
203
- "learning_rate": 4.138864285011318e-05,
204
- "loss": 0.3129,
205
  "step": 14000
206
  },
207
  {
208
- "epoch": 0.5351343371715382,
209
- "grad_norm": 1.891746997833252,
210
- "learning_rate": 4.1081094380474365e-05,
211
- "loss": 0.3134,
212
  "step": 14500
213
  },
214
  {
215
- "epoch": 0.5535872453498671,
216
- "grad_norm": 2.4689295291900635,
217
- "learning_rate": 4.077354591083555e-05,
218
- "loss": 0.3186,
219
  "step": 15000
220
  },
221
  {
222
- "epoch": 0.572040153528196,
223
- "grad_norm": 2.3689334392547607,
224
- "learning_rate": 4.0465997441196735e-05,
225
- "loss": 0.3109,
226
  "step": 15500
227
  },
228
  {
229
- "epoch": 0.5904930617065249,
230
- "grad_norm": 3.297631025314331,
231
- "learning_rate": 4.015844897155792e-05,
232
- "loss": 0.3115,
 
 
233
  "step": 16000
234
  },
235
  {
236
- "epoch": 0.6089459698848538,
237
- "grad_norm": 3.3874332904815674,
238
- "learning_rate": 3.9850900501919105e-05,
239
- "loss": 0.312,
240
  "step": 16500
241
  },
242
  {
243
- "epoch": 0.6273988780631827,
244
- "grad_norm": 2.9138197898864746,
245
- "learning_rate": 3.954335203228029e-05,
246
- "loss": 0.3126,
247
  "step": 17000
248
  },
249
  {
250
- "epoch": 0.6458517862415116,
251
- "grad_norm": 2.6313183307647705,
252
- "learning_rate": 3.9235803562641475e-05,
253
- "loss": 0.3013,
254
  "step": 17500
255
  },
256
  {
257
- "epoch": 0.6643046944198405,
258
- "grad_norm": 2.1537322998046875,
259
- "learning_rate": 3.892825509300266e-05,
260
- "loss": 0.3045,
261
  "step": 18000
262
  },
263
  {
264
- "epoch": 0.6827576025981694,
265
- "grad_norm": 2.0563058853149414,
266
- "learning_rate": 3.8620706623363846e-05,
267
- "loss": 0.3096,
268
  "step": 18500
269
  },
270
  {
271
- "epoch": 0.7012105107764984,
272
- "grad_norm": 3.172734498977661,
273
- "learning_rate": 3.8313158153725024e-05,
274
- "loss": 0.3037,
275
  "step": 19000
276
  },
277
  {
278
- "epoch": 0.7196634189548273,
279
- "grad_norm": 2.6458899974823,
280
- "learning_rate": 3.8005609684086216e-05,
281
- "loss": 0.3086,
282
  "step": 19500
283
  },
284
  {
285
- "epoch": 0.7381163271331562,
286
- "grad_norm": 3.355177879333496,
287
- "learning_rate": 3.76980612144474e-05,
288
- "loss": 0.3002,
289
  "step": 20000
290
  },
291
  {
292
- "epoch": 0.7565692353114851,
293
- "grad_norm": 2.290302038192749,
294
- "learning_rate": 3.739051274480858e-05,
295
- "loss": 0.3031,
296
  "step": 20500
297
  },
298
  {
299
- "epoch": 0.775022143489814,
300
- "grad_norm": 2.995771884918213,
301
- "learning_rate": 3.708296427516977e-05,
302
- "loss": 0.2953,
303
  "step": 21000
304
  },
305
  {
306
- "epoch": 0.7934750516681429,
307
- "grad_norm": 2.9768190383911133,
308
- "learning_rate": 3.6775415805530957e-05,
309
- "loss": 0.3091,
310
  "step": 21500
311
  },
312
  {
313
- "epoch": 0.8119279598464718,
314
- "grad_norm": 2.7021236419677734,
315
- "learning_rate": 3.6467867335892135e-05,
316
- "loss": 0.292,
317
  "step": 22000
318
  },
319
  {
320
- "epoch": 0.8303808680248007,
321
- "grad_norm": 4.4983229637146,
322
- "learning_rate": 3.616031886625332e-05,
323
- "loss": 0.3024,
324
  "step": 22500
325
  },
326
  {
327
- "epoch": 0.8488337762031296,
328
- "grad_norm": 2.2046918869018555,
329
- "learning_rate": 3.585277039661451e-05,
330
- "loss": 0.3031,
331
  "step": 23000
332
  },
333
- {
334
- "epoch": 0.8672866843814585,
335
- "grad_norm": 3.147702932357788,
336
- "learning_rate": 3.554522192697569e-05,
337
- "loss": 0.2923,
338
- "step": 23500
339
- },
340
- {
341
- "epoch": 0.8857395925597874,
342
- "grad_norm": 2.4293417930603027,
343
- "learning_rate": 3.5237673457336876e-05,
344
- "loss": 0.2962,
345
- "step": 24000
346
- },
347
- {
348
- "epoch": 0.9041925007381163,
349
- "grad_norm": 1.7400178909301758,
350
- "learning_rate": 3.493012498769807e-05,
351
- "loss": 0.3017,
352
- "step": 24500
353
- },
354
- {
355
- "epoch": 0.9226454089164452,
356
- "grad_norm": 2.229633331298828,
357
- "learning_rate": 3.4622576518059246e-05,
358
- "loss": 0.2961,
359
- "step": 25000
360
- },
361
- {
362
- "epoch": 0.9410983170947741,
363
- "grad_norm": 2.479665517807007,
364
- "learning_rate": 3.431502804842043e-05,
365
- "loss": 0.2961,
366
- "step": 25500
367
- },
368
- {
369
- "epoch": 0.959551225273103,
370
- "grad_norm": 2.990281820297241,
371
- "learning_rate": 3.400747957878162e-05,
372
- "loss": 0.2932,
373
- "step": 26000
374
- },
375
- {
376
- "epoch": 0.978004133451432,
377
- "grad_norm": 2.545665979385376,
378
- "learning_rate": 3.36999311091428e-05,
379
- "loss": 0.3006,
380
- "step": 26500
381
- },
382
- {
383
- "epoch": 0.9964570416297609,
384
- "grad_norm": 3.353132486343384,
385
- "learning_rate": 3.3392382639503986e-05,
386
- "loss": 0.2949,
387
- "step": 27000
388
- },
389
- {
390
- "epoch": 1.0,
391
- "eval_loss": 0.2873324751853943,
392
- "eval_mse": 0.28733249980634695,
393
- "eval_runtime": 59.3439,
394
- "eval_samples_per_second": 1623.385,
395
- "eval_steps_per_second": 202.936,
396
- "step": 27096
397
- },
398
- {
399
- "epoch": 1.0149099498080898,
400
- "grad_norm": 1.8382848501205444,
401
- "learning_rate": 3.308483416986517e-05,
402
- "loss": 0.2532,
403
- "step": 27500
404
- },
405
- {
406
- "epoch": 1.0333628579864187,
407
- "grad_norm": 2.252268075942993,
408
- "learning_rate": 3.277728570022636e-05,
409
- "loss": 0.2366,
410
- "step": 28000
411
- },
412
- {
413
- "epoch": 1.0518157661647476,
414
- "grad_norm": 2.4993882179260254,
415
- "learning_rate": 3.246973723058754e-05,
416
- "loss": 0.2431,
417
- "step": 28500
418
- },
419
- {
420
- "epoch": 1.0702686743430765,
421
- "grad_norm": 2.5144221782684326,
422
- "learning_rate": 3.216218876094873e-05,
423
- "loss": 0.2421,
424
- "step": 29000
425
- },
426
- {
427
- "epoch": 1.0887215825214054,
428
- "grad_norm": 2.1316707134246826,
429
- "learning_rate": 3.185464029130991e-05,
430
- "loss": 0.2401,
431
- "step": 29500
432
- },
433
- {
434
- "epoch": 1.1071744906997343,
435
- "grad_norm": 2.0435242652893066,
436
- "learning_rate": 3.15470918216711e-05,
437
- "loss": 0.2339,
438
- "step": 30000
439
- },
440
- {
441
- "epoch": 1.1256273988780632,
442
- "grad_norm": 3.039565086364746,
443
- "learning_rate": 3.123954335203228e-05,
444
- "loss": 0.2424,
445
- "step": 30500
446
- },
447
- {
448
- "epoch": 1.144080307056392,
449
- "grad_norm": 2.4302680492401123,
450
- "learning_rate": 3.093199488239347e-05,
451
- "loss": 0.239,
452
- "step": 31000
453
- },
454
- {
455
- "epoch": 1.162533215234721,
456
- "grad_norm": 2.2677814960479736,
457
- "learning_rate": 3.062444641275465e-05,
458
- "loss": 0.2394,
459
- "step": 31500
460
- },
461
- {
462
- "epoch": 1.1809861234130499,
463
- "grad_norm": 3.0901999473571777,
464
- "learning_rate": 3.0316897943115834e-05,
465
- "loss": 0.2323,
466
- "step": 32000
467
- },
468
- {
469
- "epoch": 1.1994390315913788,
470
- "grad_norm": 2.9839725494384766,
471
- "learning_rate": 3.0009349473477023e-05,
472
- "loss": 0.2387,
473
- "step": 32500
474
- },
475
- {
476
- "epoch": 1.2178919397697077,
477
- "grad_norm": 2.2191145420074463,
478
- "learning_rate": 2.9701801003838208e-05,
479
- "loss": 0.2388,
480
- "step": 33000
481
- },
482
- {
483
- "epoch": 1.2363448479480366,
484
- "grad_norm": 2.5112791061401367,
485
- "learning_rate": 2.939425253419939e-05,
486
- "loss": 0.2378,
487
- "step": 33500
488
- },
489
- {
490
- "epoch": 1.2547977561263655,
491
- "grad_norm": 2.370722770690918,
492
- "learning_rate": 2.9086704064560578e-05,
493
- "loss": 0.2414,
494
- "step": 34000
495
- },
496
- {
497
- "epoch": 1.2732506643046944,
498
- "grad_norm": 3.031116247177124,
499
- "learning_rate": 2.877915559492176e-05,
500
- "loss": 0.2349,
501
- "step": 34500
502
- },
503
- {
504
- "epoch": 1.2917035724830233,
505
- "grad_norm": 3.356889247894287,
506
- "learning_rate": 2.8471607125282945e-05,
507
- "loss": 0.2364,
508
- "step": 35000
509
- },
510
- {
511
- "epoch": 1.3101564806613522,
512
- "grad_norm": 3.58608078956604,
513
- "learning_rate": 2.8164058655644134e-05,
514
- "loss": 0.2435,
515
- "step": 35500
516
- },
517
- {
518
- "epoch": 1.328609388839681,
519
- "grad_norm": 3.02878999710083,
520
- "learning_rate": 2.7856510186005312e-05,
521
- "loss": 0.2419,
522
- "step": 36000
523
- },
524
- {
525
- "epoch": 1.34706229701801,
526
- "grad_norm": 2.696791648864746,
527
- "learning_rate": 2.75489617163665e-05,
528
- "loss": 0.2335,
529
- "step": 36500
530
- },
531
- {
532
- "epoch": 1.3655152051963388,
533
- "grad_norm": 3.62918758392334,
534
- "learning_rate": 2.7241413246727686e-05,
535
- "loss": 0.2335,
536
- "step": 37000
537
- },
538
- {
539
- "epoch": 1.3839681133746677,
540
- "grad_norm": 2.1558516025543213,
541
- "learning_rate": 2.6933864777088867e-05,
542
- "loss": 0.2312,
543
- "step": 37500
544
- },
545
- {
546
- "epoch": 1.4024210215529966,
547
- "grad_norm": 2.8393824100494385,
548
- "learning_rate": 2.6626316307450056e-05,
549
- "loss": 0.2366,
550
- "step": 38000
551
- },
552
- {
553
- "epoch": 1.4208739297313255,
554
- "grad_norm": 4.011019229888916,
555
- "learning_rate": 2.631876783781124e-05,
556
- "loss": 0.2341,
557
- "step": 38500
558
- },
559
- {
560
- "epoch": 1.4393268379096544,
561
- "grad_norm": 2.1811978816986084,
562
- "learning_rate": 2.6011219368172423e-05,
563
- "loss": 0.2367,
564
- "step": 39000
565
- },
566
- {
567
- "epoch": 1.4577797460879833,
568
- "grad_norm": 2.3741202354431152,
569
- "learning_rate": 2.570367089853361e-05,
570
- "loss": 0.2357,
571
- "step": 39500
572
- },
573
- {
574
- "epoch": 1.4762326542663124,
575
- "grad_norm": 1.7879256010055542,
576
- "learning_rate": 2.5396122428894797e-05,
577
- "loss": 0.2383,
578
- "step": 40000
579
- },
580
- {
581
- "epoch": 1.4946855624446413,
582
- "grad_norm": 2.9185972213745117,
583
- "learning_rate": 2.508857395925598e-05,
584
- "loss": 0.2383,
585
- "step": 40500
586
- },
587
- {
588
- "epoch": 1.51313847062297,
589
- "grad_norm": 2.2389113903045654,
590
- "learning_rate": 2.4781025489617167e-05,
591
- "loss": 0.2365,
592
- "step": 41000
593
- },
594
- {
595
- "epoch": 1.531591378801299,
596
- "grad_norm": 3.312300205230713,
597
- "learning_rate": 2.447347701997835e-05,
598
- "loss": 0.235,
599
- "step": 41500
600
- },
601
- {
602
- "epoch": 1.550044286979628,
603
- "grad_norm": 1.933483362197876,
604
- "learning_rate": 2.4165928550339534e-05,
605
- "loss": 0.2328,
606
- "step": 42000
607
- },
608
- {
609
- "epoch": 1.568497195157957,
610
- "grad_norm": 3.212892532348633,
611
- "learning_rate": 2.3858380080700722e-05,
612
- "loss": 0.2298,
613
- "step": 42500
614
- },
615
- {
616
- "epoch": 1.5869501033362858,
617
- "grad_norm": 2.561583995819092,
618
- "learning_rate": 2.3550831611061904e-05,
619
- "loss": 0.2334,
620
- "step": 43000
621
- },
622
- {
623
- "epoch": 1.6054030115146147,
624
- "grad_norm": 3.2696096897125244,
625
- "learning_rate": 2.324328314142309e-05,
626
- "loss": 0.239,
627
- "step": 43500
628
- },
629
- {
630
- "epoch": 1.6238559196929436,
631
- "grad_norm": 2.1827101707458496,
632
- "learning_rate": 2.2935734671784274e-05,
633
- "loss": 0.2305,
634
- "step": 44000
635
- },
636
- {
637
- "epoch": 1.6423088278712725,
638
- "grad_norm": 2.3663787841796875,
639
- "learning_rate": 2.262818620214546e-05,
640
- "loss": 0.2282,
641
- "step": 44500
642
- },
643
- {
644
- "epoch": 1.6607617360496014,
645
- "grad_norm": 2.3439533710479736,
646
- "learning_rate": 2.2320637732506645e-05,
647
- "loss": 0.2319,
648
- "step": 45000
649
- },
650
- {
651
- "epoch": 1.6792146442279303,
652
- "grad_norm": 1.9831467866897583,
653
- "learning_rate": 2.201308926286783e-05,
654
- "loss": 0.2301,
655
- "step": 45500
656
- },
657
- {
658
- "epoch": 1.6976675524062592,
659
- "grad_norm": 2.6936724185943604,
660
- "learning_rate": 2.1705540793229015e-05,
661
- "loss": 0.229,
662
- "step": 46000
663
- },
664
- {
665
- "epoch": 1.7161204605845881,
666
- "grad_norm": 2.882948160171509,
667
- "learning_rate": 2.13979923235902e-05,
668
- "loss": 0.2312,
669
- "step": 46500
670
- },
671
- {
672
- "epoch": 1.734573368762917,
673
- "grad_norm": 2.604907274246216,
674
- "learning_rate": 2.1090443853951382e-05,
675
- "loss": 0.2251,
676
- "step": 47000
677
- },
678
- {
679
- "epoch": 1.753026276941246,
680
- "grad_norm": 2.0118274688720703,
681
- "learning_rate": 2.0782895384312567e-05,
682
- "loss": 0.2298,
683
- "step": 47500
684
- },
685
- {
686
- "epoch": 1.7714791851195748,
687
- "grad_norm": 3.394923448562622,
688
- "learning_rate": 2.0475346914673755e-05,
689
- "loss": 0.235,
690
- "step": 48000
691
- },
692
- {
693
- "epoch": 1.7899320932979037,
694
- "grad_norm": 2.554750919342041,
695
- "learning_rate": 2.0167798445034937e-05,
696
- "loss": 0.2336,
697
- "step": 48500
698
- },
699
- {
700
- "epoch": 1.8083850014762326,
701
- "grad_norm": 2.264559030532837,
702
- "learning_rate": 1.9860249975396122e-05,
703
- "loss": 0.2252,
704
- "step": 49000
705
- },
706
- {
707
- "epoch": 1.8268379096545617,
708
- "grad_norm": 3.1084542274475098,
709
- "learning_rate": 1.955270150575731e-05,
710
- "loss": 0.228,
711
- "step": 49500
712
- },
713
- {
714
- "epoch": 1.8452908178328906,
715
- "grad_norm": 3.497529983520508,
716
- "learning_rate": 1.9245153036118493e-05,
717
- "loss": 0.2262,
718
- "step": 50000
719
- },
720
- {
721
- "epoch": 1.8637437260112195,
722
- "grad_norm": 2.686687707901001,
723
- "learning_rate": 1.8937604566479678e-05,
724
- "loss": 0.2238,
725
- "step": 50500
726
- },
727
- {
728
- "epoch": 1.8821966341895484,
729
- "grad_norm": 3.541355609893799,
730
- "learning_rate": 1.8630056096840863e-05,
731
- "loss": 0.2273,
732
- "step": 51000
733
- },
734
- {
735
- "epoch": 1.9006495423678773,
736
- "grad_norm": 2.5415749549865723,
737
- "learning_rate": 1.8322507627202048e-05,
738
- "loss": 0.2298,
739
- "step": 51500
740
- },
741
- {
742
- "epoch": 1.9191024505462062,
743
- "grad_norm": 2.017357349395752,
744
- "learning_rate": 1.8014959157563233e-05,
745
- "loss": 0.2285,
746
- "step": 52000
747
- },
748
- {
749
- "epoch": 1.937555358724535,
750
- "grad_norm": 2.1992979049682617,
751
- "learning_rate": 1.7707410687924418e-05,
752
- "loss": 0.2322,
753
- "step": 52500
754
- },
755
- {
756
- "epoch": 1.956008266902864,
757
- "grad_norm": 3.5973832607269287,
758
- "learning_rate": 1.7399862218285603e-05,
759
- "loss": 0.2282,
760
- "step": 53000
761
- },
762
- {
763
- "epoch": 1.974461175081193,
764
- "grad_norm": 2.55898380279541,
765
- "learning_rate": 1.709231374864679e-05,
766
- "loss": 0.2257,
767
- "step": 53500
768
- },
769
- {
770
- "epoch": 1.9929140832595218,
771
- "grad_norm": 1.7618777751922607,
772
- "learning_rate": 1.678476527900797e-05,
773
- "loss": 0.2239,
774
- "step": 54000
775
- },
776
- {
777
- "epoch": 2.0,
778
- "eval_loss": 0.2671372890472412,
779
- "eval_mse": 0.2671373049978385,
780
- "eval_runtime": 53.3061,
781
- "eval_samples_per_second": 1807.261,
782
- "eval_steps_per_second": 225.922,
783
- "step": 54192
784
- },
785
- {
786
- "epoch": 2.0113669914378507,
787
- "grad_norm": 1.9996424913406372,
788
- "learning_rate": 1.647721680936916e-05,
789
- "loss": 0.2054,
790
- "step": 54500
791
- },
792
- {
793
- "epoch": 2.0298198996161796,
794
- "grad_norm": 3.9520423412323,
795
- "learning_rate": 1.6169668339730344e-05,
796
- "loss": 0.1834,
797
- "step": 55000
798
- },
799
- {
800
- "epoch": 2.0482728077945085,
801
- "grad_norm": 2.5493485927581787,
802
- "learning_rate": 1.5862119870091526e-05,
803
- "loss": 0.183,
804
- "step": 55500
805
- },
806
- {
807
- "epoch": 2.0667257159728374,
808
- "grad_norm": 2.3871965408325195,
809
- "learning_rate": 1.555457140045271e-05,
810
- "loss": 0.182,
811
- "step": 56000
812
- },
813
- {
814
- "epoch": 2.0851786241511663,
815
- "grad_norm": 2.607405185699463,
816
- "learning_rate": 1.5247022930813898e-05,
817
- "loss": 0.1831,
818
- "step": 56500
819
- },
820
- {
821
- "epoch": 2.103631532329495,
822
- "grad_norm": 3.14758038520813,
823
- "learning_rate": 1.4939474461175081e-05,
824
- "loss": 0.1802,
825
- "step": 57000
826
- },
827
- {
828
- "epoch": 2.122084440507824,
829
- "grad_norm": 2.3298392295837402,
830
- "learning_rate": 1.4631925991536266e-05,
831
- "loss": 0.1886,
832
- "step": 57500
833
- },
834
- {
835
- "epoch": 2.140537348686153,
836
- "grad_norm": 2.356092929840088,
837
- "learning_rate": 1.4324377521897453e-05,
838
- "loss": 0.1866,
839
- "step": 58000
840
- },
841
- {
842
- "epoch": 2.158990256864482,
843
- "grad_norm": 2.599311113357544,
844
- "learning_rate": 1.4016829052258637e-05,
845
- "loss": 0.1799,
846
- "step": 58500
847
- },
848
- {
849
- "epoch": 2.1774431650428108,
850
- "grad_norm": 2.818366050720215,
851
- "learning_rate": 1.3709280582619822e-05,
852
- "loss": 0.1885,
853
- "step": 59000
854
- },
855
- {
856
- "epoch": 2.1958960732211397,
857
- "grad_norm": 2.1913158893585205,
858
- "learning_rate": 1.3401732112981005e-05,
859
- "loss": 0.1817,
860
- "step": 59500
861
- },
862
- {
863
- "epoch": 2.2143489813994686,
864
- "grad_norm": 1.8911914825439453,
865
- "learning_rate": 1.3094183643342192e-05,
866
- "loss": 0.1837,
867
- "step": 60000
868
- },
869
- {
870
- "epoch": 2.2328018895777975,
871
- "grad_norm": 1.7305101156234741,
872
- "learning_rate": 1.2786635173703375e-05,
873
- "loss": 0.18,
874
- "step": 60500
875
- },
876
- {
877
- "epoch": 2.2512547977561264,
878
- "grad_norm": 2.5203235149383545,
879
- "learning_rate": 1.2479086704064562e-05,
880
- "loss": 0.1874,
881
- "step": 61000
882
- },
883
- {
884
- "epoch": 2.2697077059344553,
885
- "grad_norm": 2.4888010025024414,
886
- "learning_rate": 1.2171538234425746e-05,
887
- "loss": 0.1834,
888
- "step": 61500
889
- },
890
- {
891
- "epoch": 2.288160614112784,
892
- "grad_norm": 2.570237159729004,
893
- "learning_rate": 1.186398976478693e-05,
894
- "loss": 0.1845,
895
- "step": 62000
896
- },
897
- {
898
- "epoch": 2.306613522291113,
899
- "grad_norm": 3.0238893032073975,
900
- "learning_rate": 1.1556441295148116e-05,
901
- "loss": 0.1862,
902
- "step": 62500
903
- },
904
- {
905
- "epoch": 2.325066430469442,
906
- "grad_norm": 2.6648924350738525,
907
- "learning_rate": 1.1248892825509301e-05,
908
- "loss": 0.1858,
909
- "step": 63000
910
- },
911
- {
912
- "epoch": 2.343519338647771,
913
- "grad_norm": 2.760099411010742,
914
- "learning_rate": 1.0941344355870485e-05,
915
- "loss": 0.1836,
916
- "step": 63500
917
- },
918
- {
919
- "epoch": 2.3619722468260997,
920
- "grad_norm": 2.4213175773620605,
921
- "learning_rate": 1.0633795886231671e-05,
922
- "loss": 0.1798,
923
- "step": 64000
924
- },
925
- {
926
- "epoch": 2.3804251550044286,
927
- "grad_norm": 3.6033475399017334,
928
- "learning_rate": 1.0326247416592857e-05,
929
- "loss": 0.1811,
930
- "step": 64500
931
- },
932
- {
933
- "epoch": 2.3988780631827575,
934
- "grad_norm": 2.590794086456299,
935
- "learning_rate": 1.001869894695404e-05,
936
- "loss": 0.1784,
937
- "step": 65000
938
- },
939
- {
940
- "epoch": 2.4173309713610864,
941
- "grad_norm": 2.297030448913574,
942
- "learning_rate": 9.711150477315225e-06,
943
- "loss": 0.177,
944
- "step": 65500
945
- },
946
- {
947
- "epoch": 2.4357838795394153,
948
- "grad_norm": 2.6488535404205322,
949
- "learning_rate": 9.40360200767641e-06,
950
- "loss": 0.1804,
951
- "step": 66000
952
- },
953
- {
954
- "epoch": 2.4542367877177442,
955
- "grad_norm": 2.314232349395752,
956
- "learning_rate": 9.096053538037595e-06,
957
- "loss": 0.1808,
958
- "step": 66500
959
- },
960
- {
961
- "epoch": 2.472689695896073,
962
- "grad_norm": 2.5361621379852295,
963
- "learning_rate": 8.78850506839878e-06,
964
- "loss": 0.1814,
965
- "step": 67000
966
- },
967
- {
968
- "epoch": 2.491142604074402,
969
- "grad_norm": 1.9473010301589966,
970
- "learning_rate": 8.480956598759966e-06,
971
- "loss": 0.1819,
972
- "step": 67500
973
- },
974
- {
975
- "epoch": 2.509595512252731,
976
- "grad_norm": 1.6078424453735352,
977
- "learning_rate": 8.17340812912115e-06,
978
- "loss": 0.1821,
979
- "step": 68000
980
- },
981
- {
982
- "epoch": 2.52804842043106,
983
- "grad_norm": 2.9577689170837402,
984
- "learning_rate": 7.865859659482334e-06,
985
- "loss": 0.1816,
986
- "step": 68500
987
- },
988
- {
989
- "epoch": 2.5465013286093887,
990
- "grad_norm": 2.314099073410034,
991
- "learning_rate": 7.55831118984352e-06,
992
- "loss": 0.1813,
993
- "step": 69000
994
- },
995
- {
996
- "epoch": 2.5649542367877176,
997
- "grad_norm": 2.1542553901672363,
998
- "learning_rate": 7.250762720204704e-06,
999
- "loss": 0.1771,
1000
- "step": 69500
1001
- },
1002
- {
1003
- "epoch": 2.5834071449660465,
1004
- "grad_norm": 2.126908540725708,
1005
- "learning_rate": 6.94321425056589e-06,
1006
- "loss": 0.1807,
1007
- "step": 70000
1008
- },
1009
- {
1010
- "epoch": 2.6018600531443754,
1011
- "grad_norm": 2.9939024448394775,
1012
- "learning_rate": 6.635665780927075e-06,
1013
- "loss": 0.1792,
1014
- "step": 70500
1015
- },
1016
- {
1017
- "epoch": 2.6203129613227043,
1018
- "grad_norm": 3.436375856399536,
1019
- "learning_rate": 6.328117311288259e-06,
1020
- "loss": 0.1801,
1021
- "step": 71000
1022
- },
1023
- {
1024
- "epoch": 2.638765869501033,
1025
- "grad_norm": 2.515270233154297,
1026
- "learning_rate": 6.020568841649444e-06,
1027
- "loss": 0.1805,
1028
- "step": 71500
1029
- },
1030
- {
1031
- "epoch": 2.657218777679362,
1032
- "grad_norm": 2.924694061279297,
1033
- "learning_rate": 5.713020372010629e-06,
1034
- "loss": 0.1782,
1035
- "step": 72000
1036
- },
1037
- {
1038
- "epoch": 2.675671685857691,
1039
- "grad_norm": 2.6601603031158447,
1040
- "learning_rate": 5.4054719023718145e-06,
1041
- "loss": 0.1821,
1042
- "step": 72500
1043
- },
1044
- {
1045
- "epoch": 2.69412459403602,
1046
- "grad_norm": 2.6807141304016113,
1047
- "learning_rate": 5.097923432732999e-06,
1048
- "loss": 0.1793,
1049
- "step": 73000
1050
- },
1051
- {
1052
- "epoch": 2.712577502214349,
1053
- "grad_norm": 3.814213991165161,
1054
- "learning_rate": 4.790374963094184e-06,
1055
- "loss": 0.1814,
1056
- "step": 73500
1057
- },
1058
- {
1059
- "epoch": 2.7310304103926777,
1060
- "grad_norm": 2.8295676708221436,
1061
- "learning_rate": 4.482826493455368e-06,
1062
- "loss": 0.1761,
1063
- "step": 74000
1064
- },
1065
- {
1066
- "epoch": 2.7494833185710066,
1067
- "grad_norm": 3.3659756183624268,
1068
- "learning_rate": 4.175278023816553e-06,
1069
- "loss": 0.1823,
1070
- "step": 74500
1071
- },
1072
- {
1073
- "epoch": 2.7679362267493355,
1074
- "grad_norm": 2.6024508476257324,
1075
- "learning_rate": 3.8677295541777385e-06,
1076
- "loss": 0.1805,
1077
- "step": 75000
1078
- },
1079
- {
1080
- "epoch": 2.7863891349276644,
1081
- "grad_norm": 2.8737952709198,
1082
- "learning_rate": 3.5601810845389237e-06,
1083
- "loss": 0.173,
1084
- "step": 75500
1085
- },
1086
- {
1087
- "epoch": 2.8048420431059933,
1088
- "grad_norm": 2.042231559753418,
1089
- "learning_rate": 3.2526326149001084e-06,
1090
- "loss": 0.1748,
1091
- "step": 76000
1092
- },
1093
- {
1094
- "epoch": 2.823294951284322,
1095
- "grad_norm": 2.5914924144744873,
1096
- "learning_rate": 2.945084145261293e-06,
1097
- "loss": 0.1796,
1098
- "step": 76500
1099
- },
1100
- {
1101
- "epoch": 2.841747859462651,
1102
- "grad_norm": 2.8601739406585693,
1103
- "learning_rate": 2.6375356756224782e-06,
1104
- "loss": 0.178,
1105
- "step": 77000
1106
- },
1107
- {
1108
- "epoch": 2.86020076764098,
1109
- "grad_norm": 2.323885917663574,
1110
- "learning_rate": 2.3299872059836634e-06,
1111
- "loss": 0.179,
1112
- "step": 77500
1113
- },
1114
- {
1115
- "epoch": 2.878653675819309,
1116
- "grad_norm": 3.8116016387939453,
1117
- "learning_rate": 2.022438736344848e-06,
1118
- "loss": 0.1793,
1119
- "step": 78000
1120
- },
1121
- {
1122
- "epoch": 2.8971065839976378,
1123
- "grad_norm": 3.1566436290740967,
1124
- "learning_rate": 1.7148902667060328e-06,
1125
- "loss": 0.1733,
1126
- "step": 78500
1127
- },
1128
- {
1129
- "epoch": 2.9155594921759667,
1130
- "grad_norm": 2.2577366828918457,
1131
- "learning_rate": 1.4073417970672177e-06,
1132
- "loss": 0.1757,
1133
- "step": 79000
1134
- },
1135
- {
1136
- "epoch": 2.934012400354296,
1137
- "grad_norm": 2.2155442237854004,
1138
- "learning_rate": 1.0997933274284029e-06,
1139
- "loss": 0.1805,
1140
- "step": 79500
1141
- },
1142
- {
1143
- "epoch": 2.952465308532625,
1144
- "grad_norm": 1.9121520519256592,
1145
- "learning_rate": 7.922448577895876e-07,
1146
- "loss": 0.1775,
1147
- "step": 80000
1148
- },
1149
- {
1150
- "epoch": 2.970918216710954,
1151
- "grad_norm": 3.8823773860931396,
1152
- "learning_rate": 4.846963881507725e-07,
1153
- "loss": 0.1808,
1154
- "step": 80500
1155
- },
1156
- {
1157
- "epoch": 2.9893711248892827,
1158
- "grad_norm": 2.4405837059020996,
1159
- "learning_rate": 1.771479185119575e-07,
1160
- "loss": 0.1789,
1161
- "step": 81000
1162
- },
1163
  {
1164
  "epoch": 3.0,
1165
- "eval_loss": 0.26044028997421265,
1166
- "eval_mse": 0.2604402849126996,
1167
- "eval_runtime": 52.9352,
1168
- "eval_samples_per_second": 1819.924,
1169
- "eval_steps_per_second": 227.505,
1170
- "step": 81288
1171
  },
1172
  {
1173
  "epoch": 3.0,
1174
- "step": 81288,
1175
- "total_flos": 4.283504864539085e+16,
1176
- "train_loss": 0.24568356713046358,
1177
- "train_runtime": 4418.2249,
1178
- "train_samples_per_second": 588.726,
1179
- "train_steps_per_second": 18.398
1180
  }
1181
  ],
1182
  "logging_steps": 500,
1183
- "max_steps": 81288,
1184
  "num_input_tokens_seen": 0,
1185
  "num_train_epochs": 3,
1186
  "save_steps": 500,
@@ -1196,8 +384,8 @@
1196
  "attributes": {}
1197
  }
1198
  },
1199
- "total_flos": 4.283504864539085e+16,
1200
- "train_batch_size": 32,
1201
  "trial_name": null,
1202
  "trial_params": null
1203
  }
 
1
  {
2
+ "best_metric": 0.19583497941493988,
3
+ "best_model_checkpoint": "multilingual-e5-small-aligned-quality-20241214-new/checkpoint-23439",
4
  "epoch": 3.0,
5
  "eval_steps": 500,
6
+ "global_step": 23439,
7
  "is_hyper_param_search": false,
8
  "is_local_process_zero": true,
9
  "is_world_process_zero": true,
10
  "log_history": [
11
  {
12
+ "epoch": 0.06399590426212723,
13
+ "grad_norm": 1.3940926790237427,
14
+ "learning_rate": 4.8933401595631215e-05,
15
+ "loss": 0.3708,
16
  "step": 500
17
  },
18
  {
19
+ "epoch": 0.12799180852425446,
20
+ "grad_norm": 1.3164088726043701,
21
+ "learning_rate": 4.786680319126243e-05,
22
+ "loss": 0.3132,
23
  "step": 1000
24
  },
25
  {
26
+ "epoch": 0.19198771278638166,
27
+ "grad_norm": 1.3515187501907349,
28
+ "learning_rate": 4.680020478689364e-05,
29
+ "loss": 0.2973,
30
  "step": 1500
31
  },
32
  {
33
+ "epoch": 0.2559836170485089,
34
+ "grad_norm": 1.7389737367630005,
35
+ "learning_rate": 4.573360638252485e-05,
36
+ "loss": 0.2877,
37
  "step": 2000
38
  },
39
  {
40
+ "epoch": 0.3199795213106361,
41
+ "grad_norm": 1.2677313089370728,
42
+ "learning_rate": 4.4667007978156063e-05,
43
+ "loss": 0.2803,
44
  "step": 2500
45
  },
46
  {
47
+ "epoch": 0.3839754255727633,
48
+ "grad_norm": 1.3927785158157349,
49
+ "learning_rate": 4.360040957378728e-05,
50
+ "loss": 0.2743,
51
  "step": 3000
52
  },
53
  {
54
+ "epoch": 0.4479713298348906,
55
+ "grad_norm": 1.628334879875183,
56
+ "learning_rate": 4.2533811169418495e-05,
57
+ "loss": 0.2747,
58
  "step": 3500
59
  },
60
  {
61
+ "epoch": 0.5119672340970178,
62
+ "grad_norm": 1.4009227752685547,
63
+ "learning_rate": 4.146721276504971e-05,
64
+ "loss": 0.2672,
65
  "step": 4000
66
  },
67
  {
68
+ "epoch": 0.575963138359145,
69
+ "grad_norm": 1.4522840976715088,
70
+ "learning_rate": 4.040061436068092e-05,
71
+ "loss": 0.264,
72
  "step": 4500
73
  },
74
  {
75
+ "epoch": 0.6399590426212722,
76
+ "grad_norm": 1.5413362979888916,
77
+ "learning_rate": 3.933401595631213e-05,
78
+ "loss": 0.2588,
79
  "step": 5000
80
  },
81
  {
82
+ "epoch": 0.7039549468833994,
83
+ "grad_norm": 1.2421332597732544,
84
+ "learning_rate": 3.8267417551943344e-05,
85
+ "loss": 0.253,
86
  "step": 5500
87
  },
88
  {
89
+ "epoch": 0.7679508511455266,
90
+ "grad_norm": 1.3509016036987305,
91
+ "learning_rate": 3.7200819147574556e-05,
92
+ "loss": 0.2539,
93
  "step": 6000
94
  },
95
  {
96
+ "epoch": 0.831946755407654,
97
+ "grad_norm": 1.9007371664047241,
98
+ "learning_rate": 3.613422074320577e-05,
99
+ "loss": 0.2516,
100
  "step": 6500
101
  },
102
  {
103
+ "epoch": 0.8959426596697811,
104
+ "grad_norm": 1.3137174844741821,
105
+ "learning_rate": 3.506762233883698e-05,
106
+ "loss": 0.2464,
107
  "step": 7000
108
  },
109
  {
110
+ "epoch": 0.9599385639319084,
111
+ "grad_norm": 1.5280767679214478,
112
+ "learning_rate": 3.400102393446819e-05,
113
+ "loss": 0.2436,
114
  "step": 7500
115
  },
116
  {
117
+ "epoch": 1.0,
118
+ "eval_loss": 0.2296382337808609,
119
+ "eval_mse": 0.22963822030546574,
120
+ "eval_runtime": 103.0849,
121
+ "eval_samples_per_second": 1766.611,
122
+ "eval_steps_per_second": 220.828,
123
+ "step": 7813
124
+ },
125
+ {
126
+ "epoch": 1.0239344681940357,
127
+ "grad_norm": 1.2579801082611084,
128
+ "learning_rate": 3.293442553009941e-05,
129
+ "loss": 0.2307,
130
  "step": 8000
131
  },
132
  {
133
+ "epoch": 1.0879303724561629,
134
+ "grad_norm": 1.494954228401184,
135
+ "learning_rate": 3.1867827125730624e-05,
136
+ "loss": 0.2053,
137
  "step": 8500
138
  },
139
  {
140
+ "epoch": 1.15192627671829,
141
+ "grad_norm": 1.6762491464614868,
142
+ "learning_rate": 3.0801228721361836e-05,
143
+ "loss": 0.2021,
144
  "step": 9000
145
  },
146
  {
147
+ "epoch": 1.2159221809804173,
148
+ "grad_norm": 1.3708257675170898,
149
+ "learning_rate": 2.9734630316993045e-05,
150
+ "loss": 0.2034,
151
  "step": 9500
152
  },
153
  {
154
+ "epoch": 1.2799180852425445,
155
+ "grad_norm": 1.57413649559021,
156
+ "learning_rate": 2.866803191262426e-05,
157
+ "loss": 0.2016,
158
  "step": 10000
159
  },
160
  {
161
+ "epoch": 1.3439139895046717,
162
+ "grad_norm": 1.1981782913208008,
163
+ "learning_rate": 2.7601433508255476e-05,
164
+ "loss": 0.2014,
165
  "step": 10500
166
  },
167
  {
168
+ "epoch": 1.4079098937667989,
169
+ "grad_norm": 2.0707337856292725,
170
+ "learning_rate": 2.6534835103886685e-05,
171
+ "loss": 0.1996,
172
  "step": 11000
173
  },
174
  {
175
+ "epoch": 1.471905798028926,
176
+ "grad_norm": 1.3037790060043335,
177
+ "learning_rate": 2.54682366995179e-05,
178
+ "loss": 0.1998,
179
  "step": 11500
180
  },
181
  {
182
+ "epoch": 1.5359017022910533,
183
+ "grad_norm": 1.331900715827942,
184
+ "learning_rate": 2.4401638295149112e-05,
185
+ "loss": 0.1995,
186
  "step": 12000
187
  },
188
  {
189
+ "epoch": 1.5998976065531805,
190
+ "grad_norm": 1.6664810180664062,
191
+ "learning_rate": 2.3335039890780325e-05,
192
+ "loss": 0.1995,
193
  "step": 12500
194
  },
195
  {
196
+ "epoch": 1.6638935108153077,
197
+ "grad_norm": 1.518689751625061,
198
+ "learning_rate": 2.2268441486411537e-05,
199
+ "loss": 0.1981,
200
  "step": 13000
201
  },
202
  {
203
+ "epoch": 1.727889415077435,
204
+ "grad_norm": 1.4883790016174316,
205
+ "learning_rate": 2.120184308204275e-05,
206
+ "loss": 0.1954,
207
  "step": 13500
208
  },
209
  {
210
+ "epoch": 1.7918853193395623,
211
+ "grad_norm": 1.3662298917770386,
212
+ "learning_rate": 2.0135244677673965e-05,
213
+ "loss": 0.1957,
214
  "step": 14000
215
  },
216
  {
217
+ "epoch": 1.8558812236016895,
218
+ "grad_norm": 1.4358400106430054,
219
+ "learning_rate": 1.9068646273305177e-05,
220
+ "loss": 0.1943,
221
  "step": 14500
222
  },
223
  {
224
+ "epoch": 1.9198771278638167,
225
+ "grad_norm": 1.3663334846496582,
226
+ "learning_rate": 1.800204786893639e-05,
227
+ "loss": 0.1959,
228
  "step": 15000
229
  },
230
  {
231
+ "epoch": 1.983873032125944,
232
+ "grad_norm": 1.3956917524337769,
233
+ "learning_rate": 1.69354494645676e-05,
234
+ "loss": 0.1927,
235
  "step": 15500
236
  },
237
  {
238
+ "epoch": 2.0,
239
+ "eval_loss": 0.2078620195388794,
240
+ "eval_mse": 0.20786202507040266,
241
+ "eval_runtime": 99.6454,
242
+ "eval_samples_per_second": 1827.591,
243
+ "eval_steps_per_second": 228.45,
244
+ "step": 15626
245
+ },
246
+ {
247
+ "epoch": 2.0478689363880713,
248
+ "grad_norm": 1.112040638923645,
249
+ "learning_rate": 1.5868851060198814e-05,
250
+ "loss": 0.1722,
251
  "step": 16000
252
  },
253
  {
254
+ "epoch": 2.1118648406501985,
255
+ "grad_norm": 1.627584457397461,
256
+ "learning_rate": 1.480225265583003e-05,
257
+ "loss": 0.1648,
258
  "step": 16500
259
  },
260
  {
261
+ "epoch": 2.1758607449123257,
262
+ "grad_norm": 1.6189327239990234,
263
+ "learning_rate": 1.3735654251461241e-05,
264
+ "loss": 0.1637,
265
  "step": 17000
266
  },
267
  {
268
+ "epoch": 2.239856649174453,
269
+ "grad_norm": 1.2161808013916016,
270
+ "learning_rate": 1.2669055847092454e-05,
271
+ "loss": 0.1644,
272
  "step": 17500
273
  },
274
  {
275
+ "epoch": 2.30385255343658,
276
+ "grad_norm": 1.4423365592956543,
277
+ "learning_rate": 1.1602457442723666e-05,
278
+ "loss": 0.163,
279
  "step": 18000
280
  },
281
  {
282
+ "epoch": 2.3678484576987073,
283
+ "grad_norm": 1.298816204071045,
284
+ "learning_rate": 1.053585903835488e-05,
285
+ "loss": 0.1626,
286
  "step": 18500
287
  },
288
  {
289
+ "epoch": 2.4318443619608345,
290
+ "grad_norm": 1.2976384162902832,
291
+ "learning_rate": 9.469260633986092e-06,
292
+ "loss": 0.1627,
293
  "step": 19000
294
  },
295
  {
296
+ "epoch": 2.4958402662229617,
297
+ "grad_norm": 1.39438796043396,
298
+ "learning_rate": 8.402662229617304e-06,
299
+ "loss": 0.1633,
300
  "step": 19500
301
  },
302
  {
303
+ "epoch": 2.559836170485089,
304
+ "grad_norm": 1.342234492301941,
305
+ "learning_rate": 7.336063825248518e-06,
306
+ "loss": 0.162,
307
  "step": 20000
308
  },
309
  {
310
+ "epoch": 2.623832074747216,
311
+ "grad_norm": 1.3450177907943726,
312
+ "learning_rate": 6.26946542087973e-06,
313
+ "loss": 0.1619,
314
  "step": 20500
315
  },
316
  {
317
+ "epoch": 2.6878279790093433,
318
+ "grad_norm": 1.3149842023849487,
319
+ "learning_rate": 5.202867016510943e-06,
320
+ "loss": 0.1608,
321
  "step": 21000
322
  },
323
  {
324
+ "epoch": 2.7518238832714705,
325
+ "grad_norm": 1.3171745538711548,
326
+ "learning_rate": 4.1362686121421564e-06,
327
+ "loss": 0.1622,
328
  "step": 21500
329
  },
330
  {
331
+ "epoch": 2.8158197875335977,
332
+ "grad_norm": 1.0992498397827148,
333
+ "learning_rate": 3.069670207773369e-06,
334
+ "loss": 0.1608,
335
  "step": 22000
336
  },
337
  {
338
+ "epoch": 2.879815691795725,
339
+ "grad_norm": 1.3956327438354492,
340
+ "learning_rate": 2.003071803404582e-06,
341
+ "loss": 0.161,
342
  "step": 22500
343
  },
344
  {
345
+ "epoch": 2.943811596057852,
346
+ "grad_norm": 1.531733512878418,
347
+ "learning_rate": 9.364733990357951e-07,
348
+ "loss": 0.1615,
349
  "step": 23000
350
  },
 
 
 
 
 
 
 
 
 
 
 
351
  {
352
  "epoch": 3.0,
353
+ "eval_loss": 0.19583497941493988,
354
+ "eval_mse": 0.19583499065839644,
355
+ "eval_runtime": 99.9929,
356
+ "eval_samples_per_second": 1821.239,
357
+ "eval_steps_per_second": 227.656,
358
+ "step": 23439
359
  },
360
  {
361
  "epoch": 3.0,
362
+ "step": 23439,
363
+ "total_flos": 4.9403660544e+16,
364
+ "train_loss": 0.21201796470442558,
365
+ "train_runtime": 3304.0326,
366
+ "train_samples_per_second": 907.981,
367
+ "train_steps_per_second": 7.094
368
  }
369
  ],
370
  "logging_steps": 500,
371
+ "max_steps": 23439,
372
  "num_input_tokens_seen": 0,
373
  "num_train_epochs": 3,
374
  "save_steps": 500,
 
384
  "attributes": {}
385
  }
386
  },
387
+ "total_flos": 4.9403660544e+16,
388
+ "train_batch_size": 128,
389
  "trial_name": null,
390
  "trial_params": null
391
  }
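
Most of this file's diff is the regenerated log_history. Rather than reading the raw entries, the loss curves can be pulled back out of the new trainer_state.json, for example:

```python
# Sketch: extract training and eval loss curves from trainer_state.json.
# Training log entries carry "loss"; per-epoch eval entries carry "eval_loss".
import json

with open("trainer_state.json") as f:
    state = json.load(f)

train_curve = [(e["step"], e["loss"]) for e in state["log_history"] if "loss" in e]
eval_curve = [(e["step"], e["eval_loss"]) for e in state["log_history"] if "eval_loss" in e]

print("last train log:", train_curve[-1])
print("eval per epoch:", eval_curve)  # ~0.2296, 0.2079, 0.1958 at steps 7813, 15626, 23439
```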
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:7fb8dd27419a9946829959509b851f42cf22c307349e98461ea22e7782f4ee83
+oid sha256:8dc970f588e7b3f3c91f44dc7ac64c322e7c274490920c7b98653ebce45b58fc
 size 5368
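
training_args.bin is a pickled TrainingArguments object, so its hash changes whenever any argument does (here at least the output directory and the batch size). A sketch for inspecting it, assuming torch plus a transformers version compatible with 4.46.3 are installed:

```python
# Sketch: inspect the pickled TrainingArguments written alongside the checkpoint.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args).__name__)               # expected: TrainingArguments
print(args.per_device_train_batch_size)  # expected: 128 for the new run
print(args.learning_rate, args.num_train_epochs)
```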