jkazdan committed
Commit 8295768 · verified · 1 Parent(s): a909445

End of training
README.md CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.3721
- - Num Input Tokens Seen: 36104600
+ - Loss: 1.3949
+ - Num Input Tokens Seen: 113120
 
  ## Model description
 
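For reference, the updated card reports an eval cross-entropy loss of 1.3949. Below is a minimal sketch of loading the checkpoint and converting that loss to perplexity; the repo id is a placeholder, since the diff does not name the repository:

```python
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual Hub path of this fine-tune.
repo_id = "jkazdan/gemma-2-2b-finetuned"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))

# The card reports an eval cross-entropy loss of 1.3949 after this commit;
# perplexity is exp(loss).
print(f"perplexity ~ {math.exp(1.3949):.2f}")  # ~ 4.03
```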
@@ -45,140 +45,14 @@ The following hyperparameters were used during training:
  - total_train_batch_size: 128
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: constant_with_warmup
- - lr_scheduler_warmup_ratio: 0.05
+ - lr_scheduler_warmup_steps: 16
  - num_epochs: 1
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
- |:-------------:|:------:|:----:|:---------------:|:-----------------:|
- | No log | 0 | 0 | 1.3956 | 0 |
- | 1.7741 | 0.0079 | 5 | 1.3881 | 291496 |
- | 1.7624 | 0.0158 | 10 | 1.3331 | 573368 |
- | 1.5076 | 0.0237 | 15 | 1.2675 | 860296 |
- | 1.5149 | 0.0316 | 20 | 1.2132 | 1150280 |
- | 1.3904 | 0.0395 | 25 | 1.1711 | 1434960 |
- | 1.4649 | 0.0474 | 30 | 1.1447 | 1722696 |
- | 1.4055 | 0.0553 | 35 | 1.1230 | 2006592 |
- | 1.4216 | 0.0632 | 40 | 1.1226 | 2290968 |
- | 1.299 | 0.0711 | 45 | 1.1276 | 2580744 |
- | 1.2054 | 0.0790 | 50 | 1.1349 | 2868312 |
- | 1.1216 | 0.0869 | 55 | 1.1489 | 3150400 |
- | 1.1573 | 0.0948 | 60 | 1.1621 | 3439584 |
- | 1.0489 | 0.1027 | 65 | 1.1868 | 3727120 |
- | 1.0481 | 0.1106 | 70 | 1.2040 | 4009592 |
- | 1.0527 | 0.1185 | 75 | 1.2222 | 4296664 |
- | 0.927 | 0.1264 | 80 | 1.2490 | 4584704 |
- | 0.9514 | 0.1343 | 85 | 1.2579 | 4875160 |
- | 0.8754 | 0.1422 | 90 | 1.2936 | 5150248 |
- | 0.7964 | 0.1501 | 95 | 1.3261 | 5435656 |
- | 0.8957 | 0.1580 | 100 | 1.3055 | 5719616 |
- | 0.801 | 0.1659 | 105 | 1.3679 | 6004160 |
- | 0.7412 | 0.1738 | 110 | 1.3788 | 6294200 |
- | 0.8193 | 0.1817 | 115 | 1.3356 | 6578536 |
- | 0.6856 | 0.1896 | 120 | 1.4469 | 6858592 |
- | 0.7007 | 0.1975 | 125 | 1.3548 | 7137224 |
- | 0.7549 | 0.2054 | 130 | 1.4513 | 7421688 |
- | 0.6113 | 0.2133 | 135 | 1.4181 | 7708656 |
- | 0.6785 | 0.2212 | 140 | 1.4409 | 7997512 |
- | 0.5546 | 0.2291 | 145 | 1.4379 | 8280752 |
- | 0.5434 | 0.2370 | 150 | 1.4493 | 8560192 |
- | 0.6122 | 0.2449 | 155 | 1.4652 | 8840088 |
- | 0.6313 | 0.2528 | 160 | 1.4675 | 9135000 |
- | 0.5012 | 0.2607 | 165 | 1.4603 | 9418480 |
- | 0.4941 | 0.2686 | 170 | 1.4503 | 9709080 |
- | 0.4247 | 0.2765 | 175 | 1.4989 | 9990496 |
- | 0.4306 | 0.2844 | 180 | 1.5048 | 10276128 |
- | 0.4451 | 0.2923 | 185 | 1.4773 | 10557088 |
- | 0.3597 | 0.3002 | 190 | 1.5007 | 10841512 |
- | 0.4404 | 0.3081 | 195 | 1.4848 | 11126792 |
- | 0.4413 | 0.3160 | 200 | 1.4689 | 11414032 |
- | 0.4022 | 0.3240 | 205 | 1.4710 | 11702304 |
- | 0.3122 | 0.3319 | 210 | 1.4716 | 11989624 |
- | 0.4566 | 0.3398 | 215 | 1.4528 | 12276464 |
- | 0.3261 | 0.3477 | 220 | 1.4635 | 12563032 |
- | 0.2643 | 0.3556 | 225 | 1.4683 | 12843904 |
- | 0.2938 | 0.3635 | 230 | 1.4602 | 13132072 |
- | 0.2551 | 0.3714 | 235 | 1.5026 | 13418704 |
- | 0.2342 | 0.3793 | 240 | 1.4482 | 13702384 |
- | 0.2092 | 0.3872 | 245 | 1.4718 | 13990712 |
- | 0.2512 | 0.3951 | 250 | 1.4487 | 14277928 |
- | 0.2808 | 0.4030 | 255 | 1.4640 | 14558816 |
- | 0.2681 | 0.4109 | 260 | 1.4604 | 14840656 |
- | 0.2125 | 0.4188 | 265 | 1.4605 | 15126312 |
- | 0.2322 | 0.4267 | 270 | 1.4521 | 15409592 |
- | 0.2151 | 0.4346 | 275 | 1.4729 | 15700320 |
- | 0.3331 | 0.4425 | 280 | 1.4369 | 15981784 |
- | 0.2467 | 0.4504 | 285 | 1.4788 | 16268320 |
- | 0.211 | 0.4583 | 290 | 1.4278 | 16559256 |
- | 0.1767 | 0.4662 | 295 | 1.4532 | 16842784 |
- | 0.1565 | 0.4741 | 300 | 1.4514 | 17132080 |
- | 0.1908 | 0.4820 | 305 | 1.4289 | 17418304 |
- | 0.2307 | 0.4899 | 310 | 1.4233 | 17703752 |
- | 0.2033 | 0.4978 | 315 | 1.4026 | 17990128 |
- | 0.1893 | 0.5057 | 320 | 1.4150 | 18274000 |
- | 0.1768 | 0.5136 | 325 | 1.4185 | 18562200 |
- | 0.192 | 0.5215 | 330 | 1.4122 | 18839936 |
- | 0.1537 | 0.5294 | 335 | 1.4098 | 19127728 |
- | 0.2058 | 0.5373 | 340 | 1.4191 | 19413024 |
- | 0.1722 | 0.5452 | 345 | 1.4204 | 19697256 |
- | 0.1792 | 0.5531 | 350 | 1.4103 | 19982264 |
- | 0.2017 | 0.5610 | 355 | 1.4225 | 20262720 |
- | 0.1819 | 0.5689 | 360 | 1.3885 | 20549144 |
- | 0.2127 | 0.5768 | 365 | 1.3981 | 20829296 |
- | 0.188 | 0.5847 | 370 | 1.3998 | 21121824 |
- | 0.1956 | 0.5926 | 375 | 1.3712 | 21408232 |
- | 0.2486 | 0.6005 | 380 | 1.3852 | 21690120 |
- | 0.1672 | 0.6084 | 385 | 1.3712 | 21978296 |
- | 0.1566 | 0.6163 | 390 | 1.3859 | 22259472 |
- | 0.1401 | 0.6242 | 395 | 1.3931 | 22545200 |
- | 0.173 | 0.6321 | 400 | 1.4175 | 22829712 |
- | 0.2 | 0.64 | 405 | 1.3800 | 23115480 |
- | 0.1743 | 0.6479 | 410 | 1.3939 | 23400432 |
- | 0.1681 | 0.6558 | 415 | 1.3758 | 23685720 |
- | 0.122 | 0.6637 | 420 | 1.3843 | 23969928 |
- | 0.1273 | 0.6716 | 425 | 1.3898 | 24252200 |
- | 0.1324 | 0.6795 | 430 | 1.3736 | 24534272 |
- | 0.2265 | 0.6874 | 435 | 1.3819 | 24822968 |
- | 0.2337 | 0.6953 | 440 | 1.3588 | 25114592 |
- | 0.1192 | 0.7032 | 445 | 1.3681 | 25405048 |
- | 0.1485 | 0.7111 | 450 | 1.3932 | 25696056 |
- | 0.1697 | 0.7190 | 455 | 1.3636 | 25981704 |
- | 0.2181 | 0.7269 | 460 | 1.3848 | 26265272 |
- | 0.2175 | 0.7348 | 465 | 1.3991 | 26546368 |
- | 0.1384 | 0.7427 | 470 | 1.3653 | 26833536 |
- | 0.1432 | 0.7506 | 475 | 1.3680 | 27121256 |
- | 0.1207 | 0.7585 | 480 | 1.3894 | 27406072 |
- | 0.1609 | 0.7664 | 485 | 1.3736 | 27691600 |
- | 0.1378 | 0.7743 | 490 | 1.3634 | 27977880 |
- | 0.1977 | 0.7822 | 495 | 1.3625 | 28259232 |
- | 0.2 | 0.7901 | 500 | 1.3753 | 28547320 |
- | 0.1192 | 0.7980 | 505 | 1.3675 | 28835024 |
- | 0.1042 | 0.8059 | 510 | 1.3508 | 29122024 |
- | 0.1675 | 0.8138 | 515 | 1.3697 | 29414368 |
- | 0.1226 | 0.8217 | 520 | 1.3755 | 29700856 |
- | 0.1635 | 0.8296 | 525 | 1.3597 | 29986376 |
- | 0.1935 | 0.8375 | 530 | 1.3589 | 30273840 |
- | 0.1091 | 0.8454 | 535 | 1.3513 | 30560152 |
- | 0.1471 | 0.8533 | 540 | 1.3716 | 30848080 |
- | 0.0967 | 0.8612 | 545 | 1.3555 | 31134192 |
- | 0.1384 | 0.8691 | 550 | 1.3705 | 31418240 |
- | 0.0926 | 0.8770 | 555 | 1.3892 | 31706456 |
- | 0.1473 | 0.8849 | 560 | 1.3710 | 31994520 |
- | 0.1323 | 0.8928 | 565 | 1.3789 | 32274072 |
- | 0.111 | 0.9007 | 570 | 1.3732 | 32558176 |
- | 0.1448 | 0.9086 | 575 | 1.3459 | 32846832 |
- | 0.0872 | 0.9165 | 580 | 1.3603 | 33128248 |
- | 0.1355 | 0.9244 | 585 | 1.3835 | 33418672 |
- | 0.113 | 0.9323 | 590 | 1.3548 | 33709176 |
- | 0.1348 | 0.9402 | 595 | 1.3470 | 33991520 |
- | 0.1185 | 0.9481 | 600 | 1.3720 | 34281088 |
- | 0.0978 | 0.9560 | 605 | 1.3694 | 34566712 |
- | 0.1684 | 0.9640 | 610 | 1.3574 | 34852064 |
- | 0.1674 | 0.9719 | 615 | 1.3644 | 35137192 |
- | 0.2023 | 0.9798 | 620 | 1.3838 | 35425192 |
- | 0.1634 | 0.9877 | 625 | 1.3579 | 35705184 |
- | 0.1451 | 0.9956 | 630 | 1.3644 | 35993744 |
+ | Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
+ |:-------------:|:-----:|:----:|:---------------:|:-----------------:|
+ | No log | 0 | 0 | 1.3956 | 0 |
 
 
  ### Framework versions
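For context, here is a sketch of how the hyperparameters listed in this hunk might be expressed as `transformers.TrainingArguments`. The learning rate, output directory, and the per-device/accumulation split are assumptions: the visible diff shows only total_train_batch_size=128, not how it was divided.

```python
from transformers import TrainingArguments

# Sketch only: learning_rate, output_dir, and the 8 x 16 batch split are
# assumptions; the diff shows total_train_batch_size=128 but not its factors.
args = TrainingArguments(
    output_dir="gemma-2-2b-finetuned",        # hypothetical
    num_train_epochs=1,
    per_device_train_batch_size=8,            # 8 * 16 = 128 total (assumed split)
    gradient_accumulation_steps=16,
    learning_rate=2e-5,                       # not visible in this diff
    lr_scheduler_type="constant_with_warmup",
    warmup_steps=16,                          # value introduced by this commit
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",
    eval_steps=5,                             # matches the old table's eval cadence
    logging_steps=5,
)
```

Note that this commit replaces `lr_scheduler_warmup_ratio: 0.05` with an explicit `lr_scheduler_warmup_steps: 16`, consistent with the much shorter run recorded in the new results table.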
model-00001-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:4f175d4dbd493d301ff5b10e7c76f6df0a2d6cfdcdad4d1d5c033d5f9d70c1c3
+ oid sha256:dc08ef2673ef09c30cc60effb6f68878c6a994434e558193fb620b7d181b9412
  size 4988025760
model-00002-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:59f60d10a83647575572513eea260dce333faad61c37d62832b3efc7c0cfdbc2
+ oid sha256:6bca1c1d1d8cd5adad901cf839e4866cf98d48eedb7a0a04e902d3fa4ca10076
  size 240691728
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:196aa0a56e783d3f1baa9177a1956d0ee4e92c72e46e3638009afa17b128a9ce
+ oid sha256:7368d99e14f273cbcf630dbf478ab6036cd1a4f2fca0b2844d9cbd6491bd447e
  size 5560
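The three binary files above are stored as Git LFS pointers: `oid sha256:<hex>` is the SHA-256 digest of the actual file contents and `size` is its byte length. A small sketch of verifying a downloaded shard against its pointer (the local file path is assumed):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large shards aren't loaded into RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# New oid recorded for model-00001-of-00002.safetensors in this commit.
expected = "dc08ef2673ef09c30cc60effb6f68878c6a994434e558193fb620b7d181b9412"
assert sha256_of("model-00001-of-00002.safetensors") == expected
```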