jkazdan committed
Commit a909445 · verified · 1 Parent(s): c987e1a

End of training
README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.2913
- - Num Input Tokens Seen: 18286504
+ - Loss: 1.3721
+ - Num Input Tokens Seen: 36104600
 
 ## Model description
 
@@ -53,70 +53,132 @@ The following hyperparameters were used during training:
 | Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
 |:-------------:|:------:|:----:|:---------------:|:-----------------:|
 | No log | 0 | 0 | 1.3956 | 0 |
- | 1.7464 | 0.0156 | 5 | 1.3646 | 280464 |
- | 1.5903 | 0.0312 | 10 | 1.2573 | 565560 |
- | 1.5129 | 0.0468 | 15 | 1.1962 | 846656 |
- | 1.3516 | 0.0624 | 20 | 1.1597 | 1131384 |
- | 1.2703 | 0.0780 | 25 | 1.1342 | 1417976 |
- | 1.1569 | 0.0937 | 30 | 1.1510 | 1706056 |
- | 1.1078 | 0.1093 | 35 | 1.1562 | 1985568 |
- | 0.9852 | 0.1249 | 40 | 1.2036 | 2274816 |
- | 0.9425 | 0.1405 | 45 | 1.2196 | 2563376 |
- | 0.8643 | 0.1561 | 50 | 1.2638 | 2851416 |
- | 0.7713 | 0.1717 | 55 | 1.3109 | 3133648 |
- | 0.7756 | 0.1873 | 60 | 1.3417 | 3415712 |
- | 0.7049 | 0.2029 | 65 | 1.3689 | 3699440 |
- | 0.7137 | 0.2185 | 70 | 1.3849 | 3989288 |
- | 0.6163 | 0.2341 | 75 | 1.4272 | 4273288 |
- | 0.5859 | 0.2498 | 80 | 1.4538 | 4556760 |
- | 0.4994 | 0.2654 | 85 | 1.4973 | 4845800 |
- | 0.5139 | 0.2810 | 90 | 1.4475 | 5124568 |
- | 0.404 | 0.2966 | 95 | 1.4564 | 5407824 |
- | 0.3985 | 0.3122 | 100 | 1.4859 | 5691072 |
- | 0.2864 | 0.3278 | 105 | 1.4644 | 5973512 |
- | 0.3813 | 0.3434 | 110 | 1.4267 | 6249128 |
- | 0.3888 | 0.3590 | 115 | 1.4361 | 6539176 |
- | 0.3424 | 0.3746 | 120 | 1.4476 | 6821440 |
- | 0.2519 | 0.3902 | 125 | 1.4195 | 7105792 |
- | 0.2307 | 0.4059 | 130 | 1.4395 | 7386104 |
- | 0.2157 | 0.4215 | 135 | 1.3967 | 7678368 |
- | 0.2188 | 0.4371 | 140 | 1.4128 | 7964768 |
- | 0.1823 | 0.4527 | 145 | 1.4201 | 8251912 |
- | 0.1977 | 0.4683 | 150 | 1.3806 | 8542472 |
- | 0.1878 | 0.4839 | 155 | 1.3919 | 8831856 |
- | 0.3106 | 0.4995 | 160 | 1.3611 | 9119624 |
- | 0.2156 | 0.5151 | 165 | 1.3699 | 9401696 |
- | 0.2013 | 0.5307 | 170 | 1.3517 | 9693128 |
- | 0.2177 | 0.5463 | 175 | 1.3308 | 9983528 |
- | 0.1984 | 0.5620 | 180 | 1.3774 | 10276792 |
- | 0.1832 | 0.5776 | 185 | 1.3244 | 10566440 |
- | 0.1338 | 0.5932 | 190 | 1.3343 | 10847504 |
- | 0.2169 | 0.6088 | 195 | 1.3448 | 11135904 |
- | 0.2061 | 0.6244 | 200 | 1.3188 | 11409896 |
- | 0.1521 | 0.64 | 205 | 1.3306 | 11690496 |
- | 0.2546 | 0.6556 | 210 | 1.3142 | 11974272 |
- | 0.1721 | 0.6712 | 215 | 1.3174 | 12255960 |
- | 0.2161 | 0.6868 | 220 | 1.3077 | 12544864 |
- | 0.1914 | 0.7024 | 225 | 1.3030 | 12834144 |
- | 0.1448 | 0.7180 | 230 | 1.3056 | 13119920 |
- | 0.1234 | 0.7337 | 235 | 1.3120 | 13405024 |
- | 0.1309 | 0.7493 | 240 | 1.2966 | 13692552 |
- | 0.1321 | 0.7649 | 245 | 1.3070 | 13975536 |
- | 0.2026 | 0.7805 | 250 | 1.2901 | 14262968 |
- | 0.1432 | 0.7961 | 255 | 1.2894 | 14551968 |
- | 0.2069 | 0.8117 | 260 | 1.3214 | 14844360 |
- | 0.1273 | 0.8273 | 265 | 1.3012 | 15134200 |
- | 0.1391 | 0.8429 | 270 | 1.3038 | 15413464 |
- | 0.2428 | 0.8585 | 275 | 1.2911 | 15700080 |
- | 0.1288 | 0.8741 | 280 | 1.2875 | 15989392 |
- | 0.1924 | 0.8898 | 285 | 1.3017 | 16277592 |
- | 0.1322 | 0.9054 | 290 | 1.2982 | 16570744 |
- | 0.1358 | 0.9210 | 295 | 1.2879 | 16863320 |
- | 0.1533 | 0.9366 | 300 | 1.3045 | 17151952 |
- | 0.1316 | 0.9522 | 305 | 1.2879 | 17434360 |
- | 0.1223 | 0.9678 | 310 | 1.2981 | 17722640 |
- | 0.1224 | 0.9834 | 315 | 1.3039 | 18007032 |
- | 0.1549 | 0.9990 | 320 | 1.2913 | 18286504 |
+ | 1.7741 | 0.0079 | 5 | 1.3881 | 291496 |
+ | 1.7624 | 0.0158 | 10 | 1.3331 | 573368 |
+ | 1.5076 | 0.0237 | 15 | 1.2675 | 860296 |
+ | 1.5149 | 0.0316 | 20 | 1.2132 | 1150280 |
+ | 1.3904 | 0.0395 | 25 | 1.1711 | 1434960 |
+ | 1.4649 | 0.0474 | 30 | 1.1447 | 1722696 |
+ | 1.4055 | 0.0553 | 35 | 1.1230 | 2006592 |
+ | 1.4216 | 0.0632 | 40 | 1.1226 | 2290968 |
+ | 1.299 | 0.0711 | 45 | 1.1276 | 2580744 |
+ | 1.2054 | 0.0790 | 50 | 1.1349 | 2868312 |
+ | 1.1216 | 0.0869 | 55 | 1.1489 | 3150400 |
+ | 1.1573 | 0.0948 | 60 | 1.1621 | 3439584 |
+ | 1.0489 | 0.1027 | 65 | 1.1868 | 3727120 |
+ | 1.0481 | 0.1106 | 70 | 1.2040 | 4009592 |
+ | 1.0527 | 0.1185 | 75 | 1.2222 | 4296664 |
+ | 0.927 | 0.1264 | 80 | 1.2490 | 4584704 |
+ | 0.9514 | 0.1343 | 85 | 1.2579 | 4875160 |
+ | 0.8754 | 0.1422 | 90 | 1.2936 | 5150248 |
+ | 0.7964 | 0.1501 | 95 | 1.3261 | 5435656 |
+ | 0.8957 | 0.1580 | 100 | 1.3055 | 5719616 |
+ | 0.801 | 0.1659 | 105 | 1.3679 | 6004160 |
+ | 0.7412 | 0.1738 | 110 | 1.3788 | 6294200 |
+ | 0.8193 | 0.1817 | 115 | 1.3356 | 6578536 |
+ | 0.6856 | 0.1896 | 120 | 1.4469 | 6858592 |
+ | 0.7007 | 0.1975 | 125 | 1.3548 | 7137224 |
+ | 0.7549 | 0.2054 | 130 | 1.4513 | 7421688 |
+ | 0.6113 | 0.2133 | 135 | 1.4181 | 7708656 |
+ | 0.6785 | 0.2212 | 140 | 1.4409 | 7997512 |
+ | 0.5546 | 0.2291 | 145 | 1.4379 | 8280752 |
+ | 0.5434 | 0.2370 | 150 | 1.4493 | 8560192 |
+ | 0.6122 | 0.2449 | 155 | 1.4652 | 8840088 |
+ | 0.6313 | 0.2528 | 160 | 1.4675 | 9135000 |
+ | 0.5012 | 0.2607 | 165 | 1.4603 | 9418480 |
+ | 0.4941 | 0.2686 | 170 | 1.4503 | 9709080 |
+ | 0.4247 | 0.2765 | 175 | 1.4989 | 9990496 |
+ | 0.4306 | 0.2844 | 180 | 1.5048 | 10276128 |
+ | 0.4451 | 0.2923 | 185 | 1.4773 | 10557088 |
+ | 0.3597 | 0.3002 | 190 | 1.5007 | 10841512 |
+ | 0.4404 | 0.3081 | 195 | 1.4848 | 11126792 |
+ | 0.4413 | 0.3160 | 200 | 1.4689 | 11414032 |
+ | 0.4022 | 0.3240 | 205 | 1.4710 | 11702304 |
+ | 0.3122 | 0.3319 | 210 | 1.4716 | 11989624 |
+ | 0.4566 | 0.3398 | 215 | 1.4528 | 12276464 |
+ | 0.3261 | 0.3477 | 220 | 1.4635 | 12563032 |
+ | 0.2643 | 0.3556 | 225 | 1.4683 | 12843904 |
+ | 0.2938 | 0.3635 | 230 | 1.4602 | 13132072 |
+ | 0.2551 | 0.3714 | 235 | 1.5026 | 13418704 |
+ | 0.2342 | 0.3793 | 240 | 1.4482 | 13702384 |
+ | 0.2092 | 0.3872 | 245 | 1.4718 | 13990712 |
+ | 0.2512 | 0.3951 | 250 | 1.4487 | 14277928 |
+ | 0.2808 | 0.4030 | 255 | 1.4640 | 14558816 |
+ | 0.2681 | 0.4109 | 260 | 1.4604 | 14840656 |
+ | 0.2125 | 0.4188 | 265 | 1.4605 | 15126312 |
+ | 0.2322 | 0.4267 | 270 | 1.4521 | 15409592 |
+ | 0.2151 | 0.4346 | 275 | 1.4729 | 15700320 |
+ | 0.3331 | 0.4425 | 280 | 1.4369 | 15981784 |
+ | 0.2467 | 0.4504 | 285 | 1.4788 | 16268320 |
+ | 0.211 | 0.4583 | 290 | 1.4278 | 16559256 |
+ | 0.1767 | 0.4662 | 295 | 1.4532 | 16842784 |
+ | 0.1565 | 0.4741 | 300 | 1.4514 | 17132080 |
+ | 0.1908 | 0.4820 | 305 | 1.4289 | 17418304 |
+ | 0.2307 | 0.4899 | 310 | 1.4233 | 17703752 |
+ | 0.2033 | 0.4978 | 315 | 1.4026 | 17990128 |
+ | 0.1893 | 0.5057 | 320 | 1.4150 | 18274000 |
+ | 0.1768 | 0.5136 | 325 | 1.4185 | 18562200 |
+ | 0.192 | 0.5215 | 330 | 1.4122 | 18839936 |
+ | 0.1537 | 0.5294 | 335 | 1.4098 | 19127728 |
+ | 0.2058 | 0.5373 | 340 | 1.4191 | 19413024 |
+ | 0.1722 | 0.5452 | 345 | 1.4204 | 19697256 |
+ | 0.1792 | 0.5531 | 350 | 1.4103 | 19982264 |
+ | 0.2017 | 0.5610 | 355 | 1.4225 | 20262720 |
+ | 0.1819 | 0.5689 | 360 | 1.3885 | 20549144 |
+ | 0.2127 | 0.5768 | 365 | 1.3981 | 20829296 |
+ | 0.188 | 0.5847 | 370 | 1.3998 | 21121824 |
+ | 0.1956 | 0.5926 | 375 | 1.3712 | 21408232 |
+ | 0.2486 | 0.6005 | 380 | 1.3852 | 21690120 |
+ | 0.1672 | 0.6084 | 385 | 1.3712 | 21978296 |
+ | 0.1566 | 0.6163 | 390 | 1.3859 | 22259472 |
+ | 0.1401 | 0.6242 | 395 | 1.3931 | 22545200 |
+ | 0.173 | 0.6321 | 400 | 1.4175 | 22829712 |
+ | 0.2 | 0.64 | 405 | 1.3800 | 23115480 |
+ | 0.1743 | 0.6479 | 410 | 1.3939 | 23400432 |
+ | 0.1681 | 0.6558 | 415 | 1.3758 | 23685720 |
+ | 0.122 | 0.6637 | 420 | 1.3843 | 23969928 |
+ | 0.1273 | 0.6716 | 425 | 1.3898 | 24252200 |
+ | 0.1324 | 0.6795 | 430 | 1.3736 | 24534272 |
+ | 0.2265 | 0.6874 | 435 | 1.3819 | 24822968 |
+ | 0.2337 | 0.6953 | 440 | 1.3588 | 25114592 |
+ | 0.1192 | 0.7032 | 445 | 1.3681 | 25405048 |
+ | 0.1485 | 0.7111 | 450 | 1.3932 | 25696056 |
+ | 0.1697 | 0.7190 | 455 | 1.3636 | 25981704 |
+ | 0.2181 | 0.7269 | 460 | 1.3848 | 26265272 |
+ | 0.2175 | 0.7348 | 465 | 1.3991 | 26546368 |
+ | 0.1384 | 0.7427 | 470 | 1.3653 | 26833536 |
+ | 0.1432 | 0.7506 | 475 | 1.3680 | 27121256 |
+ | 0.1207 | 0.7585 | 480 | 1.3894 | 27406072 |
+ | 0.1609 | 0.7664 | 485 | 1.3736 | 27691600 |
+ | 0.1378 | 0.7743 | 490 | 1.3634 | 27977880 |
+ | 0.1977 | 0.7822 | 495 | 1.3625 | 28259232 |
+ | 0.2 | 0.7901 | 500 | 1.3753 | 28547320 |
+ | 0.1192 | 0.7980 | 505 | 1.3675 | 28835024 |
+ | 0.1042 | 0.8059 | 510 | 1.3508 | 29122024 |
+ | 0.1675 | 0.8138 | 515 | 1.3697 | 29414368 |
+ | 0.1226 | 0.8217 | 520 | 1.3755 | 29700856 |
+ | 0.1635 | 0.8296 | 525 | 1.3597 | 29986376 |
+ | 0.1935 | 0.8375 | 530 | 1.3589 | 30273840 |
+ | 0.1091 | 0.8454 | 535 | 1.3513 | 30560152 |
+ | 0.1471 | 0.8533 | 540 | 1.3716 | 30848080 |
+ | 0.0967 | 0.8612 | 545 | 1.3555 | 31134192 |
+ | 0.1384 | 0.8691 | 550 | 1.3705 | 31418240 |
+ | 0.0926 | 0.8770 | 555 | 1.3892 | 31706456 |
+ | 0.1473 | 0.8849 | 560 | 1.3710 | 31994520 |
+ | 0.1323 | 0.8928 | 565 | 1.3789 | 32274072 |
+ | 0.111 | 0.9007 | 570 | 1.3732 | 32558176 |
+ | 0.1448 | 0.9086 | 575 | 1.3459 | 32846832 |
+ | 0.0872 | 0.9165 | 580 | 1.3603 | 33128248 |
+ | 0.1355 | 0.9244 | 585 | 1.3835 | 33418672 |
+ | 0.113 | 0.9323 | 590 | 1.3548 | 33709176 |
+ | 0.1348 | 0.9402 | 595 | 1.3470 | 33991520 |
+ | 0.1185 | 0.9481 | 600 | 1.3720 | 34281088 |
+ | 0.0978 | 0.9560 | 605 | 1.3694 | 34566712 |
+ | 0.1684 | 0.9640 | 610 | 1.3574 | 34852064 |
+ | 0.1674 | 0.9719 | 615 | 1.3644 | 35137192 |
+ | 0.2023 | 0.9798 | 620 | 1.3838 | 35425192 |
+ | 0.1634 | 0.9877 | 625 | 1.3579 | 35705184 |
+ | 0.1451 | 0.9956 | 630 | 1.3644 | 35993744 |
 
 
 ### Framework versions
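Note that the final checkpoint reported in the diff above (loss 1.3721 after 630 steps) is well above the best validation loss seen during the run. A minimal sketch of locating the lowest-validation-loss step, using a hand-copied subset of `(step, val_loss)` rows from the updated table (the `rows` list is illustrative, not parsed from the card):

```python
# (step, validation_loss) pairs hand-copied from the updated eval table.
rows = [
    (0, 1.3956),    # initial eval before training
    (35, 1.1230),
    (40, 1.1226),   # lowest validation loss among these samples
    (575, 1.3459),
    (630, 1.3644),  # final checkpoint
]

# Pick the step with the smallest validation loss.
best_step, best_loss = min(rows, key=lambda r: r[1])
print(best_step, best_loss)  # → 40 1.1226
```

With the full table the same one-liner would identify the best intermediate checkpoint to keep.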
model-00001-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:a4ad584dbb73bcf71a5760d82bb478605d93b74e1fd3b0f7745ac506088b52ee
+ oid sha256:4f175d4dbd493d301ff5b10e7c76f6df0a2d6cfdcdad4d1d5c033d5f9d70c1c3
 size 4988025760
model-00002-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:170f128a41816ada9c11590a6b0d1406aa5cfe5b11860517f30430b195348b4f
+ oid sha256:59f60d10a83647575572513eea260dce333faad61c37d62832b3efc7c0cfdbc2
 size 240691728
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:f8d3adb883e170622807b7901e6624df5339d35299f6df5a180ee65b0ababd74
+ oid sha256:196aa0a56e783d3f1baa9177a1956d0ee4e92c72e46e3638009afa17b128a9ce
 size 5560
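The `.safetensors` and `.bin` entries above are git-lfs pointer files: three `key value` lines (`version`, `oid`, `size`), where only the `oid` and hence the pointer changes when the weights change. As a hedged sketch (the helper names are my own, not part of any Hugging Face or git-lfs tooling), a downloaded blob could be checked against such a pointer like this:

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Split a git-lfs pointer file ('key value' per line) into a dict."""
    return dict(line.split(" ", 1) for line in text.strip().splitlines())

def blob_matches_pointer(pointer_text: str, blob: bytes) -> bool:
    """Check a blob against the pointer's sha256 oid and byte size."""
    fields = parse_lfs_pointer(pointer_text)
    algo, _, expected = fields["oid"].partition(":")
    if algo != "sha256":
        return False  # only sha256 oids are handled in this sketch
    return (int(fields["size"]) == len(blob)
            and hashlib.sha256(blob).hexdigest() == expected)

# Toy pointer for the 5-byte blob b"hello" (hash is sha256 of "hello").
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824\n"
    "size 5\n"
)
print(blob_matches_pointer(pointer, b"hello"))  # → True
```

The same check against the pointers in this commit would confirm that a downloaded weight shard matches the oid and size recorded above.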