jkazdan committed (verified) · Commit adba520 · 1 Parent(s): 0e880cb

End of training
README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 This model is a fine-tuned version of [google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.2935
- - Num Input Tokens Seen: 17704592
+ - Loss: 1.3494
+ - Num Input Tokens Seen: 35094664
 
 ## Model description
 
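For anyone who wants to try the checkpoint this card describes, a minimal loading sketch with `transformers` follows. The diff never names the fine-tuned repository itself, so the `repo_id` below is a hypothetical placeholder; only the base model, google/gemma-2-2b, is confirmed by the card.

```python
# Minimal sketch of loading this fine-tuned checkpoint for inference.
# NOTE: the repo id is a hypothetical placeholder -- the card excerpt only
# confirms the base model (google/gemma-2-2b), not the fine-tuned repo's name.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "jkazdan/your-finetuned-gemma-2-2b"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="auto")

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```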
@@ -53,70 +53,132 @@ The following hyperparameters were used during training:
 | Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
 |:-------------:|:------:|:----:|:---------------:|:-----------------:|
 | No log | 0 | 0 | 1.3956 | 0 |
- | 1.7481 | 0.0156 | 5 | 1.3647 | 273336 |
- | 1.5522 | 0.0312 | 10 | 1.2584 | 548584 |
- | 1.4833 | 0.0468 | 15 | 1.1929 | 833216 |
- | 1.3334 | 0.0624 | 20 | 1.1588 | 1111712 |
- | 1.2559 | 0.0780 | 25 | 1.1374 | 1384568 |
- | 1.125 | 0.0937 | 30 | 1.1511 | 1659912 |
- | 1.0898 | 0.1093 | 35 | 1.1643 | 1938760 |
- | 0.9898 | 0.1249 | 40 | 1.2107 | 2219360 |
- | 0.976 | 0.1405 | 45 | 1.2331 | 2491592 |
- | 0.8251 | 0.1561 | 50 | 1.2618 | 2767376 |
- | 0.8172 | 0.1717 | 55 | 1.3227 | 3041912 |
- | 0.6833 | 0.1873 | 60 | 1.3496 | 3317536 |
- | 0.709 | 0.2029 | 65 | 1.3759 | 3598840 |
- | 0.6362 | 0.2185 | 70 | 1.3991 | 3879504 |
- | 0.5228 | 0.2341 | 75 | 1.4282 | 4151072 |
- | 0.5044 | 0.2498 | 80 | 1.4487 | 4424824 |
- | 0.4644 | 0.2654 | 85 | 1.4434 | 4703488 |
- | 0.4241 | 0.2810 | 90 | 1.4442 | 4985512 |
- | 0.3835 | 0.2966 | 95 | 1.4331 | 5265688 |
- | 0.4426 | 0.3122 | 100 | 1.4332 | 5543336 |
- | 0.3183 | 0.3278 | 105 | 1.4153 | 5818464 |
- | 0.3024 | 0.3434 | 110 | 1.3845 | 6096104 |
- | 0.2906 | 0.3590 | 115 | 1.4147 | 6377384 |
- | 0.2759 | 0.3746 | 120 | 1.4014 | 6659272 |
- | 0.2882 | 0.3902 | 125 | 1.3874 | 6941368 |
- | 0.243 | 0.4059 | 130 | 1.3733 | 7218400 |
- | 0.2242 | 0.4215 | 135 | 1.3765 | 7494120 |
- | 0.2154 | 0.4371 | 140 | 1.3709 | 7774976 |
- | 0.2004 | 0.4527 | 145 | 1.3666 | 8051424 |
- | 0.1762 | 0.4683 | 150 | 1.3467 | 8325616 |
- | 0.1444 | 0.4839 | 155 | 1.3637 | 8591168 |
- | 0.2495 | 0.4995 | 160 | 1.3396 | 8865560 |
- | 0.1343 | 0.5151 | 165 | 1.3521 | 9144544 |
- | 0.212 | 0.5307 | 170 | 1.3539 | 9417400 |
- | 0.1736 | 0.5463 | 175 | 1.3311 | 9690944 |
- | 0.2134 | 0.5620 | 180 | 1.3516 | 9969016 |
- | 0.1349 | 0.5776 | 185 | 1.3481 | 10248824 |
- | 0.2179 | 0.5932 | 190 | 1.3155 | 10520464 |
- | 0.1667 | 0.6088 | 195 | 1.3446 | 10790216 |
- | 0.195 | 0.6244 | 200 | 1.3175 | 11068432 |
- | 0.14 | 0.64 | 205 | 1.3170 | 11342672 |
- | 0.1615 | 0.6556 | 210 | 1.3207 | 11625376 |
- | 0.1832 | 0.6712 | 215 | 1.3151 | 11904840 |
- | 0.1905 | 0.6868 | 220 | 1.3107 | 12184280 |
- | 0.171 | 0.7024 | 225 | 1.3028 | 12462944 |
- | 0.1527 | 0.7180 | 230 | 1.3145 | 12741032 |
- | 0.1649 | 0.7337 | 235 | 1.2901 | 13017704 |
- | 0.2144 | 0.7493 | 240 | 1.2935 | 13286344 |
- | 0.1762 | 0.7649 | 245 | 1.2984 | 13559256 |
- | 0.177 | 0.7805 | 250 | 1.2811 | 13836760 |
- | 0.1528 | 0.7961 | 255 | 1.2876 | 14117360 |
- | 0.119 | 0.8117 | 260 | 1.2891 | 14398336 |
- | 0.157 | 0.8273 | 265 | 1.2965 | 14675696 |
- | 0.2258 | 0.8429 | 270 | 1.2897 | 14946776 |
- | 0.1438 | 0.8585 | 275 | 1.3001 | 15215936 |
- | 0.1934 | 0.8741 | 280 | 1.2991 | 15492432 |
- | 0.1793 | 0.8898 | 285 | 1.3059 | 15772888 |
- | 0.1829 | 0.9054 | 290 | 1.3065 | 16047792 |
- | 0.0934 | 0.9210 | 295 | 1.2970 | 16327120 |
- | 0.1561 | 0.9366 | 300 | 1.3048 | 16607776 |
- | 0.1652 | 0.9522 | 305 | 1.3008 | 16884136 |
- | 0.1375 | 0.9678 | 310 | 1.2949 | 17160592 |
- | 0.1411 | 0.9834 | 315 | 1.2846 | 17432616 |
- | 0.195 | 0.9990 | 320 | 1.2935 | 17704592 |
+ | 1.7904 | 0.0079 | 5 | 1.3880 | 285032 |
+ | 1.7254 | 0.0158 | 10 | 1.3331 | 559496 |
+ | 1.5887 | 0.0237 | 15 | 1.2682 | 830576 |
+ | 1.5171 | 0.0316 | 20 | 1.2107 | 1100776 |
+ | 1.4397 | 0.0395 | 25 | 1.1687 | 1380544 |
+ | 1.3495 | 0.0474 | 30 | 1.1452 | 1661992 |
+ | 1.2532 | 0.0553 | 35 | 1.1217 | 1938152 |
+ | 1.3056 | 0.0632 | 40 | 1.1229 | 2211672 |
+ | 1.2413 | 0.0711 | 45 | 1.1292 | 2489160 |
+ | 1.1462 | 0.0790 | 50 | 1.1370 | 2767752 |
+ | 1.1032 | 0.0869 | 55 | 1.1466 | 3043856 |
+ | 1.1338 | 0.0948 | 60 | 1.1719 | 3322408 |
+ | 1.0408 | 0.1027 | 65 | 1.1805 | 3601512 |
+ | 0.9726 | 0.1106 | 70 | 1.2005 | 3875416 |
+ | 0.9481 | 0.1185 | 75 | 1.2367 | 4145560 |
+ | 0.9089 | 0.1264 | 80 | 1.2490 | 4418312 |
+ | 0.9448 | 0.1343 | 85 | 1.2644 | 4698048 |
+ | 0.9274 | 0.1422 | 90 | 1.2821 | 4968096 |
+ | 0.7939 | 0.1501 | 95 | 1.3076 | 5241336 |
+ | 0.8667 | 0.1580 | 100 | 1.3219 | 5519352 |
+ | 0.7058 | 0.1659 | 105 | 1.3789 | 5800192 |
+ | 0.6803 | 0.1738 | 110 | 1.3425 | 6075960 |
+ | 0.7292 | 0.1817 | 115 | 1.3845 | 6349352 |
+ | 0.7159 | 0.1896 | 120 | 1.3920 | 6622320 |
+ | 0.6093 | 0.1975 | 125 | 1.4011 | 6906224 |
+ | 0.6016 | 0.2054 | 130 | 1.4260 | 7183032 |
+ | 0.5889 | 0.2133 | 135 | 1.4458 | 7457504 |
+ | 0.4766 | 0.2212 | 140 | 1.4677 | 7738480 |
+ | 0.4992 | 0.2291 | 145 | 1.4464 | 8014256 |
+ | 0.614 | 0.2370 | 150 | 1.4413 | 8298848 |
+ | 0.5618 | 0.2449 | 155 | 1.4551 | 8577880 |
+ | 0.5101 | 0.2528 | 160 | 1.4444 | 8857920 |
+ | 0.5034 | 0.2607 | 165 | 1.4707 | 9135240 |
+ | 0.3872 | 0.2686 | 170 | 1.4531 | 9412544 |
+ | 0.4371 | 0.2765 | 175 | 1.4310 | 9684616 |
+ | 0.4358 | 0.2844 | 180 | 1.4460 | 9954880 |
+ | 0.354 | 0.2923 | 185 | 1.4609 | 10226832 |
+ | 0.3826 | 0.3002 | 190 | 1.4565 | 10505864 |
+ | 0.3332 | 0.3081 | 195 | 1.4473 | 10774440 |
+ | 0.3795 | 0.3160 | 200 | 1.4540 | 11050152 |
+ | 0.3429 | 0.3240 | 205 | 1.4528 | 11331232 |
+ | 0.3449 | 0.3319 | 210 | 1.4446 | 11614096 |
+ | 0.2587 | 0.3398 | 215 | 1.4534 | 11891544 |
+ | 0.285 | 0.3477 | 220 | 1.4727 | 12166264 |
+ | 0.285 | 0.3556 | 225 | 1.4525 | 12443968 |
+ | 0.3122 | 0.3635 | 230 | 1.4497 | 12727256 |
+ | 0.3031 | 0.3714 | 235 | 1.4532 | 13004184 |
+ | 0.2859 | 0.3793 | 240 | 1.4405 | 13286216 |
+ | 0.3157 | 0.3872 | 245 | 1.4284 | 13562312 |
+ | 0.2258 | 0.3951 | 250 | 1.4208 | 13843808 |
+ | 0.2473 | 0.4030 | 255 | 1.4353 | 14121232 |
+ | 0.218 | 0.4109 | 260 | 1.4189 | 14397624 |
+ | 0.2184 | 0.4188 | 265 | 1.4515 | 14677624 |
+ | 0.3158 | 0.4267 | 270 | 1.4183 | 14951984 |
+ | 0.2161 | 0.4346 | 275 | 1.4277 | 15231136 |
+ | 0.1902 | 0.4425 | 280 | 1.4147 | 15507976 |
+ | 0.3053 | 0.4504 | 285 | 1.4013 | 15786280 |
+ | 0.2394 | 0.4583 | 290 | 1.4105 | 16061096 |
+ | 0.2086 | 0.4662 | 295 | 1.4073 | 16339576 |
+ | 0.2131 | 0.4741 | 300 | 1.4082 | 16616088 |
+ | 0.199 | 0.4820 | 305 | 1.3966 | 16894192 |
+ | 0.1509 | 0.4899 | 310 | 1.3858 | 17161696 |
+ | 0.1534 | 0.4978 | 315 | 1.3921 | 17435360 |
+ | 0.2176 | 0.5057 | 320 | 1.3864 | 17715984 |
+ | 0.2142 | 0.5136 | 325 | 1.4093 | 17997752 |
+ | 0.2719 | 0.5215 | 330 | 1.3818 | 18273176 |
+ | 0.1597 | 0.5294 | 335 | 1.3853 | 18537568 |
+ | 0.1688 | 0.5373 | 340 | 1.4110 | 18811472 |
+ | 0.1704 | 0.5452 | 345 | 1.3880 | 19091896 |
+ | 0.1742 | 0.5531 | 350 | 1.3737 | 19371896 |
+ | 0.1587 | 0.5610 | 355 | 1.3708 | 19647704 |
+ | 0.1461 | 0.5689 | 360 | 1.3731 | 19927320 |
+ | 0.1655 | 0.5768 | 365 | 1.3816 | 20204600 |
+ | 0.1107 | 0.5847 | 370 | 1.3753 | 20479952 |
+ | 0.1816 | 0.5926 | 375 | 1.3716 | 20761040 |
+ | 0.2303 | 0.6005 | 380 | 1.3647 | 21036552 |
+ | 0.1685 | 0.6084 | 385 | 1.3587 | 21320232 |
+ | 0.1872 | 0.6163 | 390 | 1.3567 | 21598736 |
+ | 0.1274 | 0.6242 | 395 | 1.3622 | 21870952 |
+ | 0.156 | 0.6321 | 400 | 1.3694 | 22143560 |
+ | 0.1011 | 0.64 | 405 | 1.3666 | 22423664 |
+ | 0.1744 | 0.6479 | 410 | 1.3754 | 22704304 |
+ | 0.1507 | 0.6558 | 415 | 1.3611 | 22976520 |
+ | 0.1875 | 0.6637 | 420 | 1.3744 | 23256704 |
+ | 0.1195 | 0.6716 | 425 | 1.3761 | 23530768 |
+ | 0.1536 | 0.6795 | 430 | 1.3659 | 23804864 |
+ | 0.1272 | 0.6874 | 435 | 1.3441 | 24088360 |
+ | 0.1642 | 0.6953 | 440 | 1.3794 | 24370768 |
+ | 0.1752 | 0.7032 | 445 | 1.3800 | 24652800 |
+ | 0.1598 | 0.7111 | 450 | 1.3493 | 24926808 |
+ | 0.1788 | 0.7190 | 455 | 1.3737 | 25209264 |
+ | 0.1285 | 0.7269 | 460 | 1.3470 | 25490464 |
+ | 0.1269 | 0.7348 | 465 | 1.3585 | 25764752 |
+ | 0.1534 | 0.7427 | 470 | 1.3792 | 26041912 |
+ | 0.1254 | 0.7506 | 475 | 1.3503 | 26319400 |
+ | 0.1847 | 0.7585 | 480 | 1.3469 | 26595432 |
+ | 0.1077 | 0.7664 | 485 | 1.3555 | 26870224 |
+ | 0.1492 | 0.7743 | 490 | 1.3447 | 27146584 |
+ | 0.2371 | 0.7822 | 495 | 1.3424 | 27425344 |
+ | 0.1434 | 0.7901 | 500 | 1.3612 | 27704256 |
+ | 0.1857 | 0.7980 | 505 | 1.3450 | 27981680 |
+ | 0.1632 | 0.8059 | 510 | 1.3603 | 28269920 |
+ | 0.0885 | 0.8138 | 515 | 1.3589 | 28555408 |
+ | 0.1117 | 0.8217 | 520 | 1.3556 | 28835424 |
+ | 0.1943 | 0.8296 | 525 | 1.3550 | 29110760 |
+ | 0.1237 | 0.8375 | 530 | 1.3518 | 29391016 |
+ | 0.095 | 0.8454 | 535 | 1.3502 | 29669632 |
+ | 0.1394 | 0.8533 | 540 | 1.3603 | 29949184 |
+ | 0.1402 | 0.8612 | 545 | 1.3626 | 30233672 |
+ | 0.2061 | 0.8691 | 550 | 1.3563 | 30519392 |
+ | 0.0759 | 0.8770 | 555 | 1.3748 | 30799704 |
+ | 0.1689 | 0.8849 | 560 | 1.3524 | 31078832 |
+ | 0.1271 | 0.8928 | 565 | 1.3689 | 31363592 |
+ | 0.1791 | 0.9007 | 570 | 1.3782 | 31645328 |
+ | 0.1511 | 0.9086 | 575 | 1.3446 | 31923488 |
+ | 0.1548 | 0.9165 | 580 | 1.3577 | 32203760 |
+ | 0.1561 | 0.9244 | 585 | 1.3715 | 32486176 |
+ | 0.1739 | 0.9323 | 590 | 1.3475 | 32763936 |
+ | 0.1737 | 0.9402 | 595 | 1.3401 | 33035936 |
+ | 0.1992 | 0.9481 | 600 | 1.3531 | 33312320 |
+ | 0.1156 | 0.9560 | 605 | 1.3492 | 33589072 |
+ | 0.1206 | 0.9640 | 610 | 1.3612 | 33869392 |
+ | 0.14 | 0.9719 | 615 | 1.3649 | 34146632 |
+ | 0.0796 | 0.9798 | 620 | 1.3636 | 34427680 |
+ | 0.121 | 0.9877 | 625 | 1.3652 | 34707504 |
+ | 0.2233 | 0.9956 | 630 | 1.3526 | 34990704 |
 
 
 ### Framework versions
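One thing the extended log makes visible: validation loss bottoms out early (1.1217 at step 35 in the new run) and then drifts upward while training loss keeps falling, so the final checkpoint (eval loss 1.3494) is not the best one by validation loss. A minimal sketch of picking the best step from a log history like the table above, with only the first rows inlined:

```python
# Sketch: locate the lowest validation loss in a (step, eval_loss) history
# taken from the table above; only the first rows of the new run are inlined.
history = [
    (0, 1.3956), (5, 1.3880), (10, 1.3331), (15, 1.2682),
    (20, 1.2107), (25, 1.1687), (30, 1.1452), (35, 1.1217),
    (40, 1.1229), (45, 1.1292),
    # ... remaining (step, eval_loss) rows from the table ...
]
best_step, best_loss = min(history, key=lambda pair: pair[1])
print(f"best checkpoint: step {best_step} (eval loss {best_loss:.4f})")
# For the rows shown this prints step 35 (eval loss 1.1217), well below the
# final eval loss of 1.3494 -- a hint that early stopping could have helped.
```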
model-00001-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:ffb4950ee154882a2700a6a3deca2e292bfe7045fb69c976fc1229ee5734e5af
+ oid sha256:352710ee1ae6c22a4521f2a112dc79d472a20aa0977c23cee298a42562a5e894
 size 4988025760
model-00002-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:eb967de1cb5cfab51bbd6759f14d4f07d10a6c15ac773d7f4c36668bf467f000
+ oid sha256:9c42853f252138462655f189f70b443a247d7921c57442aa84837bb996bee3bb
 size 240691728
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:ea3888be2d5fcc50a32d5366f50e4a5f59e16cd5cc0fad9431e2956aec93581c
+ oid sha256:499282ca39fc55bcf4757fac990123a36e360311ae9b6645d53acd92553ba3fb
 size 5560
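The three weight-file diffs above change only the `oid sha256:` line of each git-lfs pointer; the sizes stay the same, so the commit swaps the tensor contents in place. As a rough illustration of what those pointers mean, here is a minimal sketch (a hypothetical helper, not part of this repo) that checks a locally downloaded file against its pointer:

```python
# Sketch: verify a downloaded file against its git-lfs pointer. A pointer has
# three lines: "version <url>", "oid sha256:<hex digest>", "size <bytes>".
import hashlib

def matches_lfs_pointer(pointer_path: str, object_path: str) -> bool:
    fields = dict(line.strip().split(" ", 1) for line in open(pointer_path))
    expected_oid = fields["oid"].split(":", 1)[1]  # strip the "sha256:" prefix
    expected_size = int(fields["size"])

    digest = hashlib.sha256()
    size = 0
    with open(object_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
            digest.update(chunk)
            size += len(chunk)
    return digest.hexdigest() == expected_oid and size == expected_size
```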