2023-10-10 10:17:25,583 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,586 Model: "SequenceTagger(
  (embeddings): ByT5Embeddings(
    (model): T5EncoderModel(
      (shared): Embedding(384, 1472)
      (encoder): T5Stack(
        (embed_tokens): Embedding(384, 1472)
        (block): ModuleList(
          (0): T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                  (relative_attention_bias): Embedding(32, 6)
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
          (1-11): 11 x T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
        )
        (final_layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1472, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-10 10:17:25,586 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,587 MultiCorpus: 20847 train + 1123 dev + 3350 test sentences
 - NER_HIPE_2022 Corpus: 20847 train + 1123 dev + 3350 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/de/with_doc_seperator
2023-10-10 10:17:25,587 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,587 Train:  20847 sentences
2023-10-10 10:17:25,587         (train_with_dev=False, train_with_test=False)
2023-10-10 10:17:25,587 ----------------------------------------------------------------------------------------------------
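The corpus above is the German NewsEye split of HIPE-2022, loaded through Flair's built-in reader. A minimal sketch of the data setup follows, assuming the standard loader arguments for that split (they are not recorded in this log):

    from flair.datasets import NER_HIPE_2022

    # assumed loader call for the newseye/de split referenced above
    corpus = NER_HIPE_2022(dataset_name="newseye", language="de")
    print(corpus)  # expected: 20847 train + 1123 dev + 3350 test sentences
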
2023-10-10 10:17:25,587 Training Params:
2023-10-10 10:17:25,587  - learning_rate: "0.00015" 
2023-10-10 10:17:25,587  - mini_batch_size: "8"
2023-10-10 10:17:25,587  - max_epochs: "10"
2023-10-10 10:17:25,587  - shuffle: "True"
2023-10-10 10:17:25,587 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,587 Plugins:
2023-10-10 10:17:25,588  - TensorboardLogger
2023-10-10 10:17:25,588  - LinearScheduler | warmup_fraction: '0.1'
2023-10-10 10:17:25,588 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,588 Final evaluation on model from best epoch (best-model.pt)
2023-10-10 10:17:25,588  - metric: "('micro avg', 'f1-score')"
2023-10-10 10:17:25,588 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,588 Computation:
2023-10-10 10:17:25,588  - compute on device: cuda:0
2023-10-10 10:17:25,588  - embedding storage: none
2023-10-10 10:17:25,588 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,588 Model training base path: "hmbench-newseye/de-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2"
2023-10-10 10:17:25,588 ----------------------------------------------------------------------------------------------------
2023-10-10 10:17:25,588 ----------------------------------------------------------------------------------------------------
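The header above fully specifies the run: ByT5 subtoken embeddings feeding a plain linear tag head without CRF, fine-tuned for 10 epochs at learning rate 0.00015 with mini-batch size 8 under a linear warmup schedule. The following is a minimal Flair sketch of that configuration; the embedding model id and the keyword arguments are inferred from the base path and parameter block above rather than copied from a script, so treat them as assumptions.

    from flair.datasets import NER_HIPE_2022
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    # data as in the sketch above (loader arguments assumed)
    corpus = NER_HIPE_2022(dataset_name="newseye", language="de")
    label_dict = corpus.make_label_dictionary(label_type="ner", add_unk=False)

    # embedding model id inferred from the base path; "poolingfirst-layers-1" is
    # read as first-subtoken pooling over the last encoder layer only
    embeddings = TransformerWordEmbeddings(
        model="hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax",
        layers="-1",
        subtoken_pooling="first",
        fine_tune=True,
    )

    # "crfFalse" in the base path: linear classification head, no CRF, no RNN
    tagger = SequenceTagger(
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type="ner",
        use_crf=False,
        use_rnn=False,
        reproject_embeddings=False,
    )

    # hyperparameters from the "Training Params" block; fine_tune() defaults to AdamW
    # with a linear warmup schedule, consistent with the LinearScheduler plugin logged above
    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "hmbench-newseye/de-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2",
        learning_rate=0.00015,
        mini_batch_size=8,
        max_epochs=10,
    )
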
2023-10-10 10:17:25,589 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-10 10:19:50,358 epoch 1 - iter 260/2606 - loss 2.82858377 - time (sec): 144.77 - samples/sec: 246.69 - lr: 0.000015 - momentum: 0.000000
2023-10-10 10:22:07,314 epoch 1 - iter 520/2606 - loss 2.59379089 - time (sec): 281.72 - samples/sec: 255.43 - lr: 0.000030 - momentum: 0.000000
2023-10-10 10:24:28,436 epoch 1 - iter 780/2606 - loss 2.18946733 - time (sec): 422.84 - samples/sec: 258.37 - lr: 0.000045 - momentum: 0.000000
2023-10-10 10:26:51,376 epoch 1 - iter 1040/2606 - loss 1.78347214 - time (sec): 565.78 - samples/sec: 262.46 - lr: 0.000060 - momentum: 0.000000
2023-10-10 10:29:09,559 epoch 1 - iter 1300/2606 - loss 1.51733285 - time (sec): 703.97 - samples/sec: 263.06 - lr: 0.000075 - momentum: 0.000000
2023-10-10 10:31:37,873 epoch 1 - iter 1560/2606 - loss 1.33965070 - time (sec): 852.28 - samples/sec: 261.90 - lr: 0.000090 - momentum: 0.000000
2023-10-10 10:33:58,911 epoch 1 - iter 1820/2606 - loss 1.20517898 - time (sec): 993.32 - samples/sec: 261.43 - lr: 0.000105 - momentum: 0.000000
2023-10-10 10:36:19,665 epoch 1 - iter 2080/2606 - loss 1.09396343 - time (sec): 1134.07 - samples/sec: 260.27 - lr: 0.000120 - momentum: 0.000000
2023-10-10 10:38:40,561 epoch 1 - iter 2340/2606 - loss 1.00927714 - time (sec): 1274.97 - samples/sec: 258.05 - lr: 0.000135 - momentum: 0.000000
2023-10-10 10:41:03,325 epoch 1 - iter 2600/2606 - loss 0.92825533 - time (sec): 1417.73 - samples/sec: 258.37 - lr: 0.000150 - momentum: 0.000000
2023-10-10 10:41:06,639 ----------------------------------------------------------------------------------------------------
2023-10-10 10:41:06,639 EPOCH 1 done: loss 0.9266 - lr: 0.000150
2023-10-10 10:41:45,319 DEV : loss 0.13984432816505432 - f1-score (micro avg)  0.2605
2023-10-10 10:41:45,373 saving best model
2023-10-10 10:41:46,388 ----------------------------------------------------------------------------------------------------
2023-10-10 10:44:03,782 epoch 2 - iter 260/2606 - loss 0.20048946 - time (sec): 137.39 - samples/sec: 254.87 - lr: 0.000148 - momentum: 0.000000
2023-10-10 10:46:21,256 epoch 2 - iter 520/2606 - loss 0.19474593 - time (sec): 274.86 - samples/sec: 257.90 - lr: 0.000147 - momentum: 0.000000
2023-10-10 10:48:42,738 epoch 2 - iter 780/2606 - loss 0.18435066 - time (sec): 416.35 - samples/sec: 258.36 - lr: 0.000145 - momentum: 0.000000
2023-10-10 10:51:06,768 epoch 2 - iter 1040/2606 - loss 0.17821735 - time (sec): 560.38 - samples/sec: 254.46 - lr: 0.000143 - momentum: 0.000000
2023-10-10 10:53:29,090 epoch 2 - iter 1300/2606 - loss 0.17222389 - time (sec): 702.70 - samples/sec: 255.82 - lr: 0.000142 - momentum: 0.000000
2023-10-10 10:55:53,869 epoch 2 - iter 1560/2606 - loss 0.16461177 - time (sec): 847.48 - samples/sec: 256.54 - lr: 0.000140 - momentum: 0.000000
2023-10-10 10:58:21,709 epoch 2 - iter 1820/2606 - loss 0.16129296 - time (sec): 995.32 - samples/sec: 256.21 - lr: 0.000138 - momentum: 0.000000
2023-10-10 11:00:47,042 epoch 2 - iter 2080/2606 - loss 0.15850392 - time (sec): 1140.65 - samples/sec: 254.82 - lr: 0.000137 - momentum: 0.000000
2023-10-10 11:03:09,654 epoch 2 - iter 2340/2606 - loss 0.15546215 - time (sec): 1283.26 - samples/sec: 254.16 - lr: 0.000135 - momentum: 0.000000
2023-10-10 11:05:34,501 epoch 2 - iter 2600/2606 - loss 0.15151244 - time (sec): 1428.11 - samples/sec: 256.78 - lr: 0.000133 - momentum: 0.000000
2023-10-10 11:05:37,581 ----------------------------------------------------------------------------------------------------
2023-10-10 11:05:37,582 EPOCH 2 done: loss 0.1514 - lr: 0.000133
2023-10-10 11:06:18,193 DEV : loss 0.12324459105730057 - f1-score (micro avg)  0.3194
2023-10-10 11:06:18,251 saving best model
2023-10-10 11:06:20,967 ----------------------------------------------------------------------------------------------------
2023-10-10 11:08:38,954 epoch 3 - iter 260/2606 - loss 0.08453196 - time (sec): 137.98 - samples/sec: 257.85 - lr: 0.000132 - momentum: 0.000000
2023-10-10 11:11:00,136 epoch 3 - iter 520/2606 - loss 0.08884281 - time (sec): 279.16 - samples/sec: 259.92 - lr: 0.000130 - momentum: 0.000000
2023-10-10 11:13:22,749 epoch 3 - iter 780/2606 - loss 0.09171291 - time (sec): 421.78 - samples/sec: 259.93 - lr: 0.000128 - momentum: 0.000000
2023-10-10 11:15:41,613 epoch 3 - iter 1040/2606 - loss 0.09501175 - time (sec): 560.64 - samples/sec: 259.52 - lr: 0.000127 - momentum: 0.000000
2023-10-10 11:18:01,854 epoch 3 - iter 1300/2606 - loss 0.09753731 - time (sec): 700.88 - samples/sec: 264.60 - lr: 0.000125 - momentum: 0.000000
2023-10-10 11:20:24,416 epoch 3 - iter 1560/2606 - loss 0.09540397 - time (sec): 843.45 - samples/sec: 264.50 - lr: 0.000123 - momentum: 0.000000
2023-10-10 11:22:44,671 epoch 3 - iter 1820/2606 - loss 0.09431741 - time (sec): 983.70 - samples/sec: 264.06 - lr: 0.000122 - momentum: 0.000000
2023-10-10 11:25:05,740 epoch 3 - iter 2080/2606 - loss 0.09273873 - time (sec): 1124.77 - samples/sec: 263.24 - lr: 0.000120 - momentum: 0.000000
2023-10-10 11:27:19,392 epoch 3 - iter 2340/2606 - loss 0.09140459 - time (sec): 1258.42 - samples/sec: 261.51 - lr: 0.000118 - momentum: 0.000000
2023-10-10 11:29:46,096 epoch 3 - iter 2600/2606 - loss 0.09148228 - time (sec): 1405.13 - samples/sec: 260.87 - lr: 0.000117 - momentum: 0.000000
2023-10-10 11:29:49,569 ----------------------------------------------------------------------------------------------------
2023-10-10 11:29:49,569 EPOCH 3 done: loss 0.0914 - lr: 0.000117
2023-10-10 11:30:33,329 DEV : loss 0.16454216837882996 - f1-score (micro avg)  0.375
2023-10-10 11:30:33,407 saving best model
2023-10-10 11:30:36,290 ----------------------------------------------------------------------------------------------------
2023-10-10 11:33:00,647 epoch 4 - iter 260/2606 - loss 0.05238673 - time (sec): 144.35 - samples/sec: 255.20 - lr: 0.000115 - momentum: 0.000000
2023-10-10 11:35:25,933 epoch 4 - iter 520/2606 - loss 0.06195508 - time (sec): 289.64 - samples/sec: 263.40 - lr: 0.000113 - momentum: 0.000000
2023-10-10 11:37:44,551 epoch 4 - iter 780/2606 - loss 0.06048133 - time (sec): 428.26 - samples/sec: 263.11 - lr: 0.000112 - momentum: 0.000000
2023-10-10 11:40:01,800 epoch 4 - iter 1040/2606 - loss 0.06150890 - time (sec): 565.51 - samples/sec: 262.28 - lr: 0.000110 - momentum: 0.000000
2023-10-10 11:42:24,722 epoch 4 - iter 1300/2606 - loss 0.05978791 - time (sec): 708.43 - samples/sec: 261.76 - lr: 0.000108 - momentum: 0.000000
2023-10-10 11:44:44,516 epoch 4 - iter 1560/2606 - loss 0.06398599 - time (sec): 848.22 - samples/sec: 261.95 - lr: 0.000107 - momentum: 0.000000
2023-10-10 11:47:03,146 epoch 4 - iter 1820/2606 - loss 0.06393966 - time (sec): 986.85 - samples/sec: 263.46 - lr: 0.000105 - momentum: 0.000000
2023-10-10 11:49:18,081 epoch 4 - iter 2080/2606 - loss 0.06373753 - time (sec): 1121.79 - samples/sec: 262.88 - lr: 0.000103 - momentum: 0.000000
2023-10-10 11:51:36,877 epoch 4 - iter 2340/2606 - loss 0.06554623 - time (sec): 1260.58 - samples/sec: 262.80 - lr: 0.000102 - momentum: 0.000000
2023-10-10 11:53:53,323 epoch 4 - iter 2600/2606 - loss 0.06560053 - time (sec): 1397.03 - samples/sec: 262.60 - lr: 0.000100 - momentum: 0.000000
2023-10-10 11:53:56,216 ----------------------------------------------------------------------------------------------------
2023-10-10 11:53:56,216 EPOCH 4 done: loss 0.0656 - lr: 0.000100
2023-10-10 11:54:36,789 DEV : loss 0.26484549045562744 - f1-score (micro avg)  0.3304
2023-10-10 11:54:36,848 ----------------------------------------------------------------------------------------------------
2023-10-10 11:56:53,975 epoch 5 - iter 260/2606 - loss 0.04025488 - time (sec): 137.12 - samples/sec: 250.50 - lr: 0.000098 - momentum: 0.000000
2023-10-10 11:59:13,175 epoch 5 - iter 520/2606 - loss 0.04564273 - time (sec): 276.32 - samples/sec: 254.62 - lr: 0.000097 - momentum: 0.000000
2023-10-10 12:01:31,458 epoch 5 - iter 780/2606 - loss 0.04191303 - time (sec): 414.61 - samples/sec: 261.94 - lr: 0.000095 - momentum: 0.000000
2023-10-10 12:03:47,948 epoch 5 - iter 1040/2606 - loss 0.04370480 - time (sec): 551.10 - samples/sec: 263.07 - lr: 0.000093 - momentum: 0.000000
2023-10-10 12:06:09,643 epoch 5 - iter 1300/2606 - loss 0.04555479 - time (sec): 692.79 - samples/sec: 263.14 - lr: 0.000092 - momentum: 0.000000
2023-10-10 12:08:27,869 epoch 5 - iter 1560/2606 - loss 0.04638323 - time (sec): 831.02 - samples/sec: 263.60 - lr: 0.000090 - momentum: 0.000000
2023-10-10 12:10:44,931 epoch 5 - iter 1820/2606 - loss 0.04605700 - time (sec): 968.08 - samples/sec: 262.91 - lr: 0.000088 - momentum: 0.000000
2023-10-10 12:13:06,008 epoch 5 - iter 2080/2606 - loss 0.04618736 - time (sec): 1109.16 - samples/sec: 264.32 - lr: 0.000087 - momentum: 0.000000
2023-10-10 12:15:28,125 epoch 5 - iter 2340/2606 - loss 0.04664304 - time (sec): 1251.27 - samples/sec: 264.95 - lr: 0.000085 - momentum: 0.000000
2023-10-10 12:17:46,232 epoch 5 - iter 2600/2606 - loss 0.04668505 - time (sec): 1389.38 - samples/sec: 263.95 - lr: 0.000083 - momentum: 0.000000
2023-10-10 12:17:49,292 ----------------------------------------------------------------------------------------------------
2023-10-10 12:17:49,292 EPOCH 5 done: loss 0.0467 - lr: 0.000083
2023-10-10 12:18:32,521 DEV : loss 0.29650354385375977 - f1-score (micro avg)  0.3622
2023-10-10 12:18:32,581 ----------------------------------------------------------------------------------------------------
2023-10-10 12:20:49,821 epoch 6 - iter 260/2606 - loss 0.03241520 - time (sec): 137.24 - samples/sec: 254.95 - lr: 0.000082 - momentum: 0.000000
2023-10-10 12:23:05,715 epoch 6 - iter 520/2606 - loss 0.02994548 - time (sec): 273.13 - samples/sec: 256.33 - lr: 0.000080 - momentum: 0.000000
2023-10-10 12:25:25,839 epoch 6 - iter 780/2606 - loss 0.03207625 - time (sec): 413.25 - samples/sec: 261.76 - lr: 0.000078 - momentum: 0.000000
2023-10-10 12:27:48,917 epoch 6 - iter 1040/2606 - loss 0.03228089 - time (sec): 556.33 - samples/sec: 263.64 - lr: 0.000077 - momentum: 0.000000
2023-10-10 12:30:11,973 epoch 6 - iter 1300/2606 - loss 0.03163962 - time (sec): 699.39 - samples/sec: 265.05 - lr: 0.000075 - momentum: 0.000000
2023-10-10 12:32:30,584 epoch 6 - iter 1560/2606 - loss 0.03281868 - time (sec): 838.00 - samples/sec: 263.89 - lr: 0.000073 - momentum: 0.000000
2023-10-10 12:34:50,290 epoch 6 - iter 1820/2606 - loss 0.03179360 - time (sec): 977.71 - samples/sec: 263.53 - lr: 0.000072 - momentum: 0.000000
2023-10-10 12:37:07,793 epoch 6 - iter 2080/2606 - loss 0.03208471 - time (sec): 1115.21 - samples/sec: 264.32 - lr: 0.000070 - momentum: 0.000000
2023-10-10 12:39:24,116 epoch 6 - iter 2340/2606 - loss 0.03214342 - time (sec): 1251.53 - samples/sec: 263.15 - lr: 0.000068 - momentum: 0.000000
2023-10-10 12:41:43,053 epoch 6 - iter 2600/2606 - loss 0.03446797 - time (sec): 1390.47 - samples/sec: 263.73 - lr: 0.000067 - momentum: 0.000000
2023-10-10 12:41:45,990 ----------------------------------------------------------------------------------------------------
2023-10-10 12:41:45,991 EPOCH 6 done: loss 0.0344 - lr: 0.000067
2023-10-10 12:42:28,894 DEV : loss 0.33373507857322693 - f1-score (micro avg)  0.378
2023-10-10 12:42:28,972 saving best model
2023-10-10 12:42:31,708 ----------------------------------------------------------------------------------------------------
2023-10-10 12:44:49,513 epoch 7 - iter 260/2606 - loss 0.02101641 - time (sec): 137.80 - samples/sec: 257.43 - lr: 0.000065 - momentum: 0.000000
2023-10-10 12:47:08,074 epoch 7 - iter 520/2606 - loss 0.01944662 - time (sec): 276.36 - samples/sec: 258.39 - lr: 0.000063 - momentum: 0.000000
2023-10-10 12:49:26,954 epoch 7 - iter 780/2606 - loss 0.02172094 - time (sec): 415.24 - samples/sec: 259.97 - lr: 0.000062 - momentum: 0.000000
2023-10-10 12:51:45,464 epoch 7 - iter 1040/2606 - loss 0.02238459 - time (sec): 553.75 - samples/sec: 260.69 - lr: 0.000060 - momentum: 0.000000
2023-10-10 12:54:07,890 epoch 7 - iter 1300/2606 - loss 0.02535700 - time (sec): 696.18 - samples/sec: 262.29 - lr: 0.000058 - momentum: 0.000000
2023-10-10 12:56:26,291 epoch 7 - iter 1560/2606 - loss 0.02596897 - time (sec): 834.58 - samples/sec: 262.39 - lr: 0.000057 - momentum: 0.000000
2023-10-10 12:58:45,212 epoch 7 - iter 1820/2606 - loss 0.02683384 - time (sec): 973.50 - samples/sec: 262.42 - lr: 0.000055 - momentum: 0.000000
2023-10-10 13:01:01,249 epoch 7 - iter 2080/2606 - loss 0.02612235 - time (sec): 1109.54 - samples/sec: 260.72 - lr: 0.000053 - momentum: 0.000000
2023-10-10 13:03:21,969 epoch 7 - iter 2340/2606 - loss 0.02627648 - time (sec): 1250.26 - samples/sec: 261.53 - lr: 0.000052 - momentum: 0.000000
2023-10-10 13:05:45,810 epoch 7 - iter 2600/2606 - loss 0.02563448 - time (sec): 1394.10 - samples/sec: 262.94 - lr: 0.000050 - momentum: 0.000000
2023-10-10 13:05:48,990 ----------------------------------------------------------------------------------------------------
2023-10-10 13:05:48,990 EPOCH 7 done: loss 0.0256 - lr: 0.000050
2023-10-10 13:06:31,953 DEV : loss 0.3978530466556549 - f1-score (micro avg)  0.3722
2023-10-10 13:06:32,018 ----------------------------------------------------------------------------------------------------
2023-10-10 13:08:55,328 epoch 8 - iter 260/2606 - loss 0.01412281 - time (sec): 143.31 - samples/sec: 277.93 - lr: 0.000048 - momentum: 0.000000
2023-10-10 13:11:15,605 epoch 8 - iter 520/2606 - loss 0.01664019 - time (sec): 283.58 - samples/sec: 270.88 - lr: 0.000047 - momentum: 0.000000
2023-10-10 13:13:32,923 epoch 8 - iter 780/2606 - loss 0.01697467 - time (sec): 420.90 - samples/sec: 266.21 - lr: 0.000045 - momentum: 0.000000
2023-10-10 13:15:54,238 epoch 8 - iter 1040/2606 - loss 0.01697407 - time (sec): 562.22 - samples/sec: 265.97 - lr: 0.000043 - momentum: 0.000000
2023-10-10 13:18:13,799 epoch 8 - iter 1300/2606 - loss 0.01628044 - time (sec): 701.78 - samples/sec: 263.85 - lr: 0.000042 - momentum: 0.000000
2023-10-10 13:20:35,746 epoch 8 - iter 1560/2606 - loss 0.01692923 - time (sec): 843.73 - samples/sec: 263.69 - lr: 0.000040 - momentum: 0.000000
2023-10-10 13:22:55,645 epoch 8 - iter 1820/2606 - loss 0.01698632 - time (sec): 983.62 - samples/sec: 263.30 - lr: 0.000038 - momentum: 0.000000
2023-10-10 13:25:13,795 epoch 8 - iter 2080/2606 - loss 0.01667188 - time (sec): 1121.77 - samples/sec: 261.61 - lr: 0.000037 - momentum: 0.000000
2023-10-10 13:27:29,647 epoch 8 - iter 2340/2606 - loss 0.01736529 - time (sec): 1257.63 - samples/sec: 260.13 - lr: 0.000035 - momentum: 0.000000
2023-10-10 13:29:50,730 epoch 8 - iter 2600/2606 - loss 0.01789555 - time (sec): 1398.71 - samples/sec: 262.23 - lr: 0.000033 - momentum: 0.000000
2023-10-10 13:29:53,639 ----------------------------------------------------------------------------------------------------
2023-10-10 13:29:53,640 EPOCH 8 done: loss 0.0179 - lr: 0.000033
2023-10-10 13:30:36,471 DEV : loss 0.40461236238479614 - f1-score (micro avg)  0.3935
2023-10-10 13:30:36,533 saving best model
2023-10-10 13:30:39,231 ----------------------------------------------------------------------------------------------------
2023-10-10 13:32:59,878 epoch 9 - iter 260/2606 - loss 0.01216931 - time (sec): 140.64 - samples/sec: 263.28 - lr: 0.000032 - momentum: 0.000000
2023-10-10 13:35:25,145 epoch 9 - iter 520/2606 - loss 0.01085860 - time (sec): 285.91 - samples/sec: 270.57 - lr: 0.000030 - momentum: 0.000000
2023-10-10 13:37:44,303 epoch 9 - iter 780/2606 - loss 0.01126911 - time (sec): 425.07 - samples/sec: 266.36 - lr: 0.000028 - momentum: 0.000000
2023-10-10 13:39:59,037 epoch 9 - iter 1040/2606 - loss 0.01140504 - time (sec): 559.80 - samples/sec: 261.97 - lr: 0.000027 - momentum: 0.000000
2023-10-10 13:42:20,737 epoch 9 - iter 1300/2606 - loss 0.01208397 - time (sec): 701.50 - samples/sec: 264.42 - lr: 0.000025 - momentum: 0.000000
2023-10-10 13:44:42,081 epoch 9 - iter 1560/2606 - loss 0.01245932 - time (sec): 842.85 - samples/sec: 261.13 - lr: 0.000023 - momentum: 0.000000
2023-10-10 13:46:58,383 epoch 9 - iter 1820/2606 - loss 0.01254583 - time (sec): 979.15 - samples/sec: 260.67 - lr: 0.000022 - momentum: 0.000000
2023-10-10 13:49:17,990 epoch 9 - iter 2080/2606 - loss 0.01291335 - time (sec): 1118.75 - samples/sec: 260.52 - lr: 0.000020 - momentum: 0.000000
2023-10-10 13:51:38,722 epoch 9 - iter 2340/2606 - loss 0.01272261 - time (sec): 1259.49 - samples/sec: 260.05 - lr: 0.000018 - momentum: 0.000000
2023-10-10 13:53:58,619 epoch 9 - iter 2600/2606 - loss 0.01295044 - time (sec): 1399.38 - samples/sec: 262.18 - lr: 0.000017 - momentum: 0.000000
2023-10-10 13:54:01,480 ----------------------------------------------------------------------------------------------------
2023-10-10 13:54:01,480 EPOCH 9 done: loss 0.0129 - lr: 0.000017
2023-10-10 13:54:42,752 DEV : loss 0.4415739178657532 - f1-score (micro avg)  0.3754
2023-10-10 13:54:42,814 ----------------------------------------------------------------------------------------------------
2023-10-10 13:57:04,272 epoch 10 - iter 260/2606 - loss 0.00790204 - time (sec): 141.46 - samples/sec: 270.22 - lr: 0.000015 - momentum: 0.000000
2023-10-10 13:59:26,749 epoch 10 - iter 520/2606 - loss 0.00758597 - time (sec): 283.93 - samples/sec: 270.20 - lr: 0.000013 - momentum: 0.000000
2023-10-10 14:01:45,507 epoch 10 - iter 780/2606 - loss 0.00856811 - time (sec): 422.69 - samples/sec: 266.23 - lr: 0.000012 - momentum: 0.000000
2023-10-10 14:04:03,408 epoch 10 - iter 1040/2606 - loss 0.00841758 - time (sec): 560.59 - samples/sec: 259.01 - lr: 0.000010 - momentum: 0.000000
2023-10-10 14:06:23,118 epoch 10 - iter 1300/2606 - loss 0.00819062 - time (sec): 700.30 - samples/sec: 260.67 - lr: 0.000008 - momentum: 0.000000
2023-10-10 14:08:41,954 epoch 10 - iter 1560/2606 - loss 0.00817874 - time (sec): 839.14 - samples/sec: 260.44 - lr: 0.000007 - momentum: 0.000000
2023-10-10 14:11:00,012 epoch 10 - iter 1820/2606 - loss 0.00826103 - time (sec): 977.20 - samples/sec: 261.61 - lr: 0.000005 - momentum: 0.000000
2023-10-10 14:13:22,211 epoch 10 - iter 2080/2606 - loss 0.00858032 - time (sec): 1119.40 - samples/sec: 263.35 - lr: 0.000003 - momentum: 0.000000
2023-10-10 14:15:40,101 epoch 10 - iter 2340/2606 - loss 0.00841120 - time (sec): 1257.28 - samples/sec: 262.99 - lr: 0.000002 - momentum: 0.000000
2023-10-10 14:17:58,169 epoch 10 - iter 2600/2606 - loss 0.00852789 - time (sec): 1395.35 - samples/sec: 262.65 - lr: 0.000000 - momentum: 0.000000
2023-10-10 14:18:01,398 ----------------------------------------------------------------------------------------------------
2023-10-10 14:18:01,398 EPOCH 10 done: loss 0.0085 - lr: 0.000000
2023-10-10 14:18:43,420 DEV : loss 0.46130311489105225 - f1-score (micro avg)  0.3804
2023-10-10 14:18:44,392 ----------------------------------------------------------------------------------------------------
2023-10-10 14:18:44,394 Loading model from best epoch ...
2023-10-10 14:18:48,948 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
2023-10-10 14:20:38,998 
Results:
- F-score (micro) 0.4462
- F-score (macro) 0.3145
- Accuracy 0.2916

By class:
              precision    recall  f1-score   support

         LOC     0.4555    0.5231    0.4870      1214
         PER     0.4229    0.4517    0.4369       808
         ORG     0.3343    0.3343    0.3343       353
   HumanProd     0.0000    0.0000    0.0000        15

   micro avg     0.4266    0.4678    0.4462      2390
   macro avg     0.3032    0.3273    0.3145      2390
weighted avg     0.4237    0.4678    0.4444      2390

2023-10-10 14:20:38,999 ----------------------------------------------------------------------------------------------------
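
Once training has finished, the checkpoint saved as best-model.pt under the base path above can be loaded for inference. A short usage sketch with a made-up example sentence; predicted spans carry the 17-tag BIOES dictionary listed before the results:

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # path assembled from the base path logged above plus the best-model.pt checkpoint
    tagger = SequenceTagger.load(
        "hmbench-newseye/de-hmbyt5-preliminary/"
        "byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2/"
        "best-model.pt"
    )

    # hypothetical input; predicted spans are labeled LOC, PER, ORG or HumanProd
    sentence = Sentence("Der Kaiser reiste von Wien nach Berlin .")
    tagger.predict(sentence)
    for span in sentence.get_spans("ner"):
        print(span)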