mlabonne committed
Commit 8c1e07a · verified · 1 Parent(s): 5b3c867

End of training

README.md ADDED
@@ -0,0 +1,155 @@
+ ---
+ license: apache-2.0
+ base_model: EleutherAI/pythia-70m-deduped
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: grandpythia-200k-70m
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # grandpythia-200k-70m
+
+ This model is a fine-tuned version of [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8419
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 64
+ - eval_batch_size: 64
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: cosine
+ - num_epochs: 1
+
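The hyperparameters listed above map directly onto `transformers.TrainingArguments` fields. A minimal, hedged sketch is shown below; the output directory is a hypothetical placeholder, and the dataset/Trainer wiring is omitted because the card does not say which dataset was used.

```python
# Minimal sketch of TrainingArguments matching the card's hyperparameters.
# "output_dir" is a hypothetical placeholder; the training dataset is unknown,
# so the Trainer/data wiring is intentionally left out.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="grandpythia-200k-70m",   # hypothetical local path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=1,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer,
    # so no extra optimizer arguments are needed here.
)
```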
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | 1.1766 | 0.01 | 68 | 1.2007 |
+ | 1.0903 | 0.02 | 136 | 1.1284 |
+ | 1.0809 | 0.03 | 204 | 1.0993 |
+ | 1.0928 | 0.04 | 272 | 1.0712 |
+ | 0.989 | 0.05 | 340 | 1.0473 |
+ | 1.0044 | 0.06 | 408 | 1.0373 |
+ | 0.985 | 0.07 | 476 | 1.0241 |
+ | 1.0272 | 0.08 | 544 | 1.0130 |
+ | 1.0295 | 0.09 | 612 | 1.0036 |
+ | 1.0172 | 0.1 | 680 | 0.9985 |
+ | 0.9582 | 0.11 | 748 | 0.9924 |
+ | 1.0342 | 0.12 | 816 | 0.9916 |
+ | 1.0053 | 0.13 | 884 | 0.9844 |
+ | 0.9321 | 0.14 | 952 | 0.9798 |
+ | 0.9473 | 0.15 | 1020 | 0.9727 |
+ | 0.9197 | 0.16 | 1088 | 0.9688 |
+ | 0.9827 | 0.17 | 1156 | 0.9632 |
+ | 0.9423 | 0.18 | 1224 | 0.9613 |
+ | 0.9662 | 0.19 | 1292 | 0.9578 |
+ | 0.9417 | 0.2 | 1360 | 0.9549 |
+ | 0.9501 | 0.21 | 1428 | 0.9461 |
+ | 0.9744 | 0.22 | 1496 | 0.9466 |
+ | 0.8693 | 0.23 | 1564 | 0.9394 |
+ | 0.9467 | 0.24 | 1632 | 0.9393 |
+ | 0.9274 | 0.25 | 1700 | 0.9362 |
+ | 0.8793 | 0.26 | 1768 | 0.9338 |
+ | 0.99 | 0.27 | 1836 | 0.9276 |
+ | 0.8983 | 0.28 | 1904 | 0.9291 |
+ | 0.9177 | 0.29 | 1972 | 0.9246 |
+ | 0.9586 | 0.3 | 2040 | 0.9224 |
+ | 0.9364 | 0.31 | 2108 | 0.9178 |
+ | 0.9248 | 0.32 | 2176 | 0.9175 |
+ | 0.9294 | 0.33 | 2244 | 0.9171 |
+ | 0.9142 | 0.34 | 2312 | 0.9136 |
+ | 0.9533 | 0.35 | 2380 | 0.9102 |
+ | 0.9193 | 0.36 | 2448 | 0.9094 |
+ | 0.9072 | 0.37 | 2516 | 0.9075 |
+ | 0.8927 | 0.38 | 2584 | 0.9043 |
+ | 0.9055 | 0.39 | 2652 | 0.9032 |
+ | 0.9276 | 0.4 | 2720 | 0.9030 |
+ | 0.8847 | 0.41 | 2788 | 0.8966 |
+ | 0.9449 | 0.42 | 2856 | 0.8963 |
+ | 0.8754 | 0.43 | 2924 | 0.8971 |
+ | 0.8612 | 0.44 | 2992 | 0.8935 |
+ | 0.9028 | 0.45 | 3060 | 0.8895 |
+ | 0.8641 | 0.46 | 3128 | 0.8925 |
+ | 0.8668 | 0.47 | 3196 | 0.8887 |
+ | 0.8935 | 0.48 | 3264 | 0.8863 |
+ | 0.8889 | 0.49 | 3332 | 0.8837 |
+ | 0.8854 | 0.5 | 3400 | 0.8849 |
+ | 0.8725 | 0.51 | 3468 | 0.8831 |
+ | 0.9425 | 0.52 | 3536 | 0.8796 |
+ | 0.8577 | 0.53 | 3604 | 0.8780 |
+ | 0.8281 | 0.54 | 3672 | 0.8747 |
+ | 0.9141 | 0.55 | 3740 | 0.8736 |
+ | 0.8684 | 0.56 | 3808 | 0.8738 |
+ | 0.8476 | 0.57 | 3876 | 0.8718 |
+ | 0.8761 | 0.58 | 3944 | 0.8735 |
+ | 0.8464 | 0.59 | 4012 | 0.8708 |
+ | 0.8732 | 0.6 | 4080 | 0.8681 |
+ | 0.9441 | 0.61 | 4148 | 0.8669 |
+ | 0.881 | 0.62 | 4216 | 0.8657 |
+ | 0.8635 | 0.63 | 4284 | 0.8640 |
+ | 0.827 | 0.64 | 4352 | 0.8625 |
+ | 0.9123 | 0.65 | 4420 | 0.8628 |
+ | 0.8557 | 0.66 | 4488 | 0.8605 |
+ | 0.8157 | 0.67 | 4556 | 0.8591 |
+ | 0.9008 | 0.68 | 4624 | 0.8580 |
+ | 0.8574 | 0.69 | 4692 | 0.8580 |
+ | 0.8374 | 0.7 | 4760 | 0.8563 |
+ | 0.8698 | 0.71 | 4828 | 0.8554 |
+ | 0.8817 | 0.72 | 4896 | 0.8545 |
+ | 0.8375 | 0.73 | 4964 | 0.8532 |
+ | 0.8504 | 0.74 | 5032 | 0.8524 |
+ | 0.8526 | 0.75 | 5100 | 0.8516 |
+ | 0.9306 | 0.76 | 5168 | 0.8511 |
+ | 0.7999 | 0.77 | 5236 | 0.8502 |
+ | 0.8337 | 0.78 | 5304 | 0.8495 |
+ | 0.7934 | 0.79 | 5372 | 0.8488 |
+ | 0.8159 | 0.8 | 5440 | 0.8480 |
+ | 0.7997 | 0.81 | 5508 | 0.8473 |
+ | 0.8909 | 0.82 | 5576 | 0.8470 |
+ | 0.852 | 0.83 | 5644 | 0.8461 |
+ | 0.8285 | 0.84 | 5712 | 0.8455 |
+ | 0.8437 | 0.85 | 5780 | 0.8448 |
+ | 0.8784 | 0.86 | 5848 | 0.8444 |
+ | 0.8123 | 0.87 | 5916 | 0.8440 |
+ | 0.8439 | 0.88 | 5984 | 0.8436 |
+ | 0.8847 | 0.89 | 6052 | 0.8433 |
+ | 0.8165 | 0.9 | 6120 | 0.8429 |
+ | 0.8405 | 0.91 | 6188 | 0.8427 |
+ | 0.8641 | 0.92 | 6256 | 0.8425 |
+ | 0.8536 | 0.93 | 6324 | 0.8424 |
+ | 0.8426 | 0.94 | 6392 | 0.8421 |
+ | 0.8547 | 0.95 | 6460 | 0.8421 |
+ | 0.8144 | 0.96 | 6528 | 0.8419 |
+ | 0.8475 | 0.97 | 6596 | 0.8419 |
+ | 0.8063 | 0.98 | 6664 | 0.8419 |
+ | 0.7943 | 0.99 | 6732 | 0.8419 |
+
+
+ ### Framework versions
+
+ - Transformers 4.38.2
+ - Pytorch 2.1.0+cu121
+ - Datasets 2.18.0
+ - Tokenizers 0.15.2
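The card has no usage section, so here is a minimal, hedged sketch of loading the checkpoint for generation with the Transformers version listed above. The repository id `mlabonne/grandpythia-200k-70m` is an assumption inferred from the commit author and the model name, not stated in the card.

```python
# Hedged usage sketch: load the fine-tuned checkpoint and generate a short completion.
# The repository id below is an assumption, not confirmed by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mlabonne/grandpythia-200k-70m"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```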
config.json ADDED
@@ -0,0 +1,30 @@
+ {
+   "_name_or_path": "EleutherAI/pythia-70m-deduped",
+   "architectures": [
+     "GPTNeoXForCausalLM"
+   ],
+   "attention_bias": true,
+   "attention_dropout": 0.0,
+   "bos_token_id": 0,
+   "classifier_dropout": 0.1,
+   "eos_token_id": 0,
+   "hidden_act": "gelu",
+   "hidden_dropout": 0.0,
+   "hidden_size": 512,
+   "initializer_range": 0.02,
+   "intermediate_size": 2048,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 2048,
+   "model_type": "gpt_neox",
+   "num_attention_heads": 8,
+   "num_hidden_layers": 6,
+   "rope_scaling": null,
+   "rotary_emb_base": 10000,
+   "rotary_pct": 0.25,
+   "tie_word_embeddings": false,
+   "torch_dtype": "float32",
+   "transformers_version": "4.38.2",
+   "use_cache": true,
+   "use_parallel_residual": true,
+   "vocab_size": 50304
+ }
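As a sanity check, the shapes in this config (hidden_size 512, 6 layers, 8 heads, vocab 50304, untied embeddings) imply roughly 70M float32 parameters, consistent with both the "70m" name and the ~282 MB safetensors file below. A hedged sketch that rebuilds the architecture from these values and counts parameters:

```python
# Sketch: instantiate an untrained GPT-NeoX model with the shapes from config.json
# and count its parameters (expected to be roughly 7.0e7, i.e. ~70M).
from transformers import GPTNeoXConfig, GPTNeoXForCausalLM

config = GPTNeoXConfig(
    hidden_size=512,
    intermediate_size=2048,
    num_hidden_layers=6,
    num_attention_heads=8,
    max_position_embeddings=2048,
    rotary_pct=0.25,
    vocab_size=50304,
    tie_word_embeddings=False,
)
model = GPTNeoXForCausalLM(config)
print(f"{sum(p.numel() for p in model.parameters()):,} parameters")
```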
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 0,
+   "eos_token_id": 0,
+   "transformers_version": "4.38.2"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d2f17c06c4cb8c4daf6e8334ae2402f79d0aba79e190e0011b671a241f70746d
+ size 281715176
special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "bos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": "<|endoftext|>",
+   "unk_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,212 @@
+ {
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<|padding|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50254": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50255": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50256": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50257": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50258": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50259": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50260": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50261": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50262": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50263": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50264": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50265": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50266": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50267": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50268": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50269": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50270": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50271": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50272": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50273": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50274": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50275": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "50276": {
+       "content": " ",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     }
+   },
+   "bos_token": "<|endoftext|>",
+   "clean_up_tokenization_spaces": true,
+   "eos_token": "<|endoftext|>",
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "<|endoftext|>",
+   "tokenizer_class": "GPTNeoXTokenizer",
+   "unk_token": "<|endoftext|>"
+ }
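One practical point from this file and the special_tokens_map above: bos, eos, pad, and unk all resolve to the same `<|endoftext|>` token, so padded batches should always carry their attention mask into generation. A small hedged sketch follows; the repository id is the same assumption as above.

```python
# Sketch: the tokenizer reuses <|endoftext|> for bos/eos/pad/unk, so pass the
# attention mask along whenever padded prompts are batched for generation.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mlabonne/grandpythia-200k-70m")  # assumed repo id
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.pad_token, tokenizer.unk_token)

batch = tokenizer(["short prompt", "a noticeably longer prompt"], padding=True, return_tensors="pt")
print(batch["input_ids"].shape, batch["attention_mask"].shape)
```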
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:51614726bab2e56e5dfc0f9f94dfb74daf5bf1e7311b0449c2b523ac762186ea
+ size 4856