2023-09-04 11:16:25,341 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,342 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
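The module graph above is a Flair SequenceTagger: a 12-layer BERT encoder (dbmdz/bert-base-historic-multilingual-cased, per the training base path below) whose last-layer word embeddings pass through LockedDropout(p=0.5) into a single 768 -> 21 linear classifier trained with cross-entropy (no CRF). A minimal construction sketch, assuming Flair's public API; the NER_HIPE_2022 keyword names and the hidden_size value are assumptions, not taken from this log:

```python
# Sketch only, not the original training script.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# HIPE-2020 French corpus (5901 train / 1287 dev / 1505 test sentences, see below);
# keyword names are assumed.
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="fr")
tag_dictionary = corpus.make_label_dictionary(label_type="ner")  # 21 tags incl. O

embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-cased",  # the BertModel printed above
    layers="-1",               # "layers-1" in the base path: last transformer layer only
    subtoken_pooling="first",  # "poolingfirst": first sub-token represents each word
    fine_tune=True,            # transformer weights are updated during training
)

tagger = SequenceTagger(
    hidden_size=256,               # assumption; unused when use_rnn=False
    embeddings=embeddings,
    tag_dictionary=tag_dictionary,
    tag_type="ner",
    use_crf=False,                 # "crfFalse": linear projection + CrossEntropyLoss, as printed
    use_rnn=False,                 # embeddings feed the 768 -> 21 linear layer directly
)
```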
2023-09-04 11:16:25,342 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,342 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
 - NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
2023-09-04 11:16:25,342 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,342 Train: 5901 sentences
2023-09-04 11:16:25,342 (train_with_dev=False, train_with_test=False)
2023-09-04 11:16:25,342 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,342 Training Params:
2023-09-04 11:16:25,342 - learning_rate: "3e-05"
2023-09-04 11:16:25,343 - mini_batch_size: "8"
2023-09-04 11:16:25,343 - max_epochs: "10"
2023-09-04 11:16:25,343 - shuffle: "True"
2023-09-04 11:16:25,343 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,343 Plugins:
2023-09-04 11:16:25,343 - LinearScheduler | warmup_fraction: '0.1'
2023-09-04 11:16:25,343 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,343 Final evaluation on model from best epoch (best-model.pt)
2023-09-04 11:16:25,343 - metric: "('micro avg', 'f1-score')"
2023-09-04 11:16:25,343 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,343 Computation:
2023-09-04 11:16:25,343 - compute on device: cuda:0
2023-09-04 11:16:25,343 - embedding storage: none
2023-09-04 11:16:25,343 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,343 Model training base path: "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
2023-09-04 11:16:25,343 ----------------------------------------------------------------------------------------------------
2023-09-04 11:16:25,343 ----------------------------------------------------------------------------------------------------
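These parameters correspond to Flair's fine-tuning recipe: a linear learning-rate schedule that warms up over the first 10% of the 7,380 optimizer steps, which is why the learning rate climbs to 3e-05 across epoch 1 below and then decays to zero. A sketch of the equivalent call, assuming ModelTrainer.fine_tune(); keyword names may differ slightly between Flair versions:

```python
# Sketch only: hyperparameters taken from the "Training Params" block above.
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)  # tagger and corpus as in the sketch above

trainer.fine_tune(
    "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2",
    learning_rate=3e-05,   # - learning_rate: "3e-05"
    mini_batch_size=8,     # - mini_batch_size: "8"
    max_epochs=10,         # - max_epochs: "10"
    shuffle=True,          # - shuffle: "True"
    # fine_tune attaches the LinearScheduler (warmup_fraction 0.1) and keeps the epoch
    # with the best dev ('micro avg', 'f1-score') as best-model.pt, matching the
    # Plugins and metric lines above.
)
```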
2023-09-04 11:16:40,182 epoch 1 - iter 73/738 - loss 3.06648327 - time (sec): 14.84 - samples/sec: 1185.94 - lr: 0.000003 - momentum: 0.000000 |
|
2023-09-04 11:16:54,521 epoch 1 - iter 146/738 - loss 2.03380649 - time (sec): 29.18 - samples/sec: 1226.77 - lr: 0.000006 - momentum: 0.000000 |
|
2023-09-04 11:17:08,295 epoch 1 - iter 219/738 - loss 1.55275994 - time (sec): 42.95 - samples/sec: 1203.95 - lr: 0.000009 - momentum: 0.000000 |
|
2023-09-04 11:17:22,200 epoch 1 - iter 292/738 - loss 1.26157976 - time (sec): 56.86 - samples/sec: 1201.71 - lr: 0.000012 - momentum: 0.000000 |
|
2023-09-04 11:17:36,211 epoch 1 - iter 365/738 - loss 1.08469915 - time (sec): 70.87 - samples/sec: 1199.52 - lr: 0.000015 - momentum: 0.000000 |
|
2023-09-04 11:17:49,770 epoch 1 - iter 438/738 - loss 0.95415774 - time (sec): 84.43 - samples/sec: 1204.60 - lr: 0.000018 - momentum: 0.000000 |
|
2023-09-04 11:18:03,731 epoch 1 - iter 511/738 - loss 0.85774269 - time (sec): 98.39 - samples/sec: 1197.64 - lr: 0.000021 - momentum: 0.000000 |
|
2023-09-04 11:18:15,906 epoch 1 - iter 584/738 - loss 0.79054942 - time (sec): 110.56 - samples/sec: 1197.35 - lr: 0.000024 - momentum: 0.000000 |
|
2023-09-04 11:18:29,496 epoch 1 - iter 657/738 - loss 0.72897502 - time (sec): 124.15 - samples/sec: 1196.74 - lr: 0.000027 - momentum: 0.000000 |
|
2023-09-04 11:18:42,826 epoch 1 - iter 730/738 - loss 0.67545291 - time (sec): 137.48 - samples/sec: 1199.69 - lr: 0.000030 - momentum: 0.000000 |
|
2023-09-04 11:18:44,111 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:18:44,111 EPOCH 1 done: loss 0.6710 - lr: 0.000030 |
|
2023-09-04 11:18:57,888 DEV : loss 0.1332985907793045 - f1-score (micro avg) 0.711 |
|
2023-09-04 11:18:57,916 saving best model |
|
2023-09-04 11:18:58,390 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:19:09,963 epoch 2 - iter 73/738 - loss 0.14414587 - time (sec): 11.57 - samples/sec: 1306.62 - lr: 0.000030 - momentum: 0.000000 |
|
2023-09-04 11:19:23,202 epoch 2 - iter 146/738 - loss 0.14416364 - time (sec): 24.81 - samples/sec: 1262.22 - lr: 0.000029 - momentum: 0.000000 |
|
2023-09-04 11:19:36,336 epoch 2 - iter 219/738 - loss 0.14447359 - time (sec): 37.94 - samples/sec: 1250.81 - lr: 0.000029 - momentum: 0.000000 |
|
2023-09-04 11:19:49,994 epoch 2 - iter 292/738 - loss 0.13943446 - time (sec): 51.60 - samples/sec: 1223.27 - lr: 0.000029 - momentum: 0.000000 |
|
2023-09-04 11:20:02,942 epoch 2 - iter 365/738 - loss 0.14063830 - time (sec): 64.55 - samples/sec: 1217.88 - lr: 0.000028 - momentum: 0.000000 |
|
2023-09-04 11:20:17,121 epoch 2 - iter 438/738 - loss 0.13642158 - time (sec): 78.73 - samples/sec: 1212.44 - lr: 0.000028 - momentum: 0.000000 |
|
2023-09-04 11:20:32,838 epoch 2 - iter 511/738 - loss 0.13356226 - time (sec): 94.45 - samples/sec: 1205.26 - lr: 0.000028 - momentum: 0.000000 |
|
2023-09-04 11:20:46,437 epoch 2 - iter 584/738 - loss 0.12802162 - time (sec): 108.05 - samples/sec: 1204.42 - lr: 0.000027 - momentum: 0.000000 |
|
2023-09-04 11:21:00,585 epoch 2 - iter 657/738 - loss 0.12895343 - time (sec): 122.19 - samples/sec: 1205.41 - lr: 0.000027 - momentum: 0.000000 |
|
2023-09-04 11:21:15,845 epoch 2 - iter 730/738 - loss 0.12767766 - time (sec): 137.45 - samples/sec: 1198.38 - lr: 0.000027 - momentum: 0.000000 |
|
2023-09-04 11:21:17,240 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:21:17,241 EPOCH 2 done: loss 0.1275 - lr: 0.000027 |
|
2023-09-04 11:21:35,278 DEV : loss 0.10566549748182297 - f1-score (micro avg) 0.7649 |
|
2023-09-04 11:21:35,307 saving best model |
|
2023-09-04 11:21:37,107 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:21:50,327 epoch 3 - iter 73/738 - loss 0.06542645 - time (sec): 13.22 - samples/sec: 1167.53 - lr: 0.000026 - momentum: 0.000000 |
|
2023-09-04 11:22:03,772 epoch 3 - iter 146/738 - loss 0.07399537 - time (sec): 26.66 - samples/sec: 1202.78 - lr: 0.000026 - momentum: 0.000000 |
|
2023-09-04 11:22:17,525 epoch 3 - iter 219/738 - loss 0.07697646 - time (sec): 40.42 - samples/sec: 1196.14 - lr: 0.000026 - momentum: 0.000000 |
|
2023-09-04 11:22:29,378 epoch 3 - iter 292/738 - loss 0.07623798 - time (sec): 52.27 - samples/sec: 1210.85 - lr: 0.000025 - momentum: 0.000000 |
|
2023-09-04 11:22:45,122 epoch 3 - iter 365/738 - loss 0.07410337 - time (sec): 68.01 - samples/sec: 1190.47 - lr: 0.000025 - momentum: 0.000000 |
|
2023-09-04 11:23:00,135 epoch 3 - iter 438/738 - loss 0.07256508 - time (sec): 83.03 - samples/sec: 1197.06 - lr: 0.000025 - momentum: 0.000000 |
|
2023-09-04 11:23:13,662 epoch 3 - iter 511/738 - loss 0.07021164 - time (sec): 96.55 - samples/sec: 1195.05 - lr: 0.000024 - momentum: 0.000000 |
|
2023-09-04 11:23:27,456 epoch 3 - iter 584/738 - loss 0.07127441 - time (sec): 110.35 - samples/sec: 1198.87 - lr: 0.000024 - momentum: 0.000000 |
|
2023-09-04 11:23:41,892 epoch 3 - iter 657/738 - loss 0.07075954 - time (sec): 124.78 - samples/sec: 1197.07 - lr: 0.000024 - momentum: 0.000000 |
|
2023-09-04 11:23:54,828 epoch 3 - iter 730/738 - loss 0.07175536 - time (sec): 137.72 - samples/sec: 1196.62 - lr: 0.000023 - momentum: 0.000000 |
|
2023-09-04 11:23:56,063 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:23:56,063 EPOCH 3 done: loss 0.0717 - lr: 0.000023 |
|
2023-09-04 11:24:13,715 DEV : loss 0.10343769192695618 - f1-score (micro avg) 0.8241 |
|
2023-09-04 11:24:13,745 saving best model |
|
2023-09-04 11:24:15,083 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:24:28,030 epoch 4 - iter 73/738 - loss 0.03831422 - time (sec): 12.94 - samples/sec: 1169.21 - lr: 0.000023 - momentum: 0.000000 |
|
2023-09-04 11:24:40,926 epoch 4 - iter 146/738 - loss 0.04482622 - time (sec): 25.84 - samples/sec: 1192.96 - lr: 0.000023 - momentum: 0.000000 |
|
2023-09-04 11:24:54,494 epoch 4 - iter 219/738 - loss 0.04752270 - time (sec): 39.41 - samples/sec: 1201.44 - lr: 0.000022 - momentum: 0.000000 |
|
2023-09-04 11:25:07,213 epoch 4 - iter 292/738 - loss 0.04665692 - time (sec): 52.13 - samples/sec: 1204.30 - lr: 0.000022 - momentum: 0.000000 |
|
2023-09-04 11:25:21,351 epoch 4 - iter 365/738 - loss 0.04665212 - time (sec): 66.27 - samples/sec: 1199.76 - lr: 0.000022 - momentum: 0.000000 |
|
2023-09-04 11:25:36,569 epoch 4 - iter 438/738 - loss 0.04711492 - time (sec): 81.48 - samples/sec: 1188.35 - lr: 0.000021 - momentum: 0.000000 |
|
2023-09-04 11:25:52,845 epoch 4 - iter 511/738 - loss 0.04622890 - time (sec): 97.76 - samples/sec: 1181.64 - lr: 0.000021 - momentum: 0.000000 |
|
2023-09-04 11:26:05,967 epoch 4 - iter 584/738 - loss 0.04565772 - time (sec): 110.88 - samples/sec: 1190.01 - lr: 0.000021 - momentum: 0.000000 |
|
2023-09-04 11:26:20,323 epoch 4 - iter 657/738 - loss 0.04774170 - time (sec): 125.24 - samples/sec: 1187.85 - lr: 0.000020 - momentum: 0.000000 |
|
2023-09-04 11:26:33,220 epoch 4 - iter 730/738 - loss 0.04754700 - time (sec): 138.14 - samples/sec: 1193.00 - lr: 0.000020 - momentum: 0.000000 |
|
2023-09-04 11:26:34,562 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:26:34,562 EPOCH 4 done: loss 0.0476 - lr: 0.000020 |
|
2023-09-04 11:26:52,250 DEV : loss 0.1501348614692688 - f1-score (micro avg) 0.8197 |
|
2023-09-04 11:26:52,279 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:27:05,999 epoch 5 - iter 73/738 - loss 0.04082482 - time (sec): 13.72 - samples/sec: 1199.10 - lr: 0.000020 - momentum: 0.000000 |
|
2023-09-04 11:27:20,024 epoch 5 - iter 146/738 - loss 0.03463403 - time (sec): 27.74 - samples/sec: 1190.19 - lr: 0.000019 - momentum: 0.000000 |
|
2023-09-04 11:27:33,408 epoch 5 - iter 219/738 - loss 0.03390961 - time (sec): 41.13 - samples/sec: 1214.22 - lr: 0.000019 - momentum: 0.000000 |
|
2023-09-04 11:27:46,798 epoch 5 - iter 292/738 - loss 0.02989132 - time (sec): 54.52 - samples/sec: 1211.07 - lr: 0.000019 - momentum: 0.000000 |
|
2023-09-04 11:28:00,272 epoch 5 - iter 365/738 - loss 0.03275215 - time (sec): 67.99 - samples/sec: 1207.28 - lr: 0.000018 - momentum: 0.000000 |
|
2023-09-04 11:28:13,250 epoch 5 - iter 438/738 - loss 0.03347779 - time (sec): 80.97 - samples/sec: 1204.45 - lr: 0.000018 - momentum: 0.000000 |
|
2023-09-04 11:28:27,107 epoch 5 - iter 511/738 - loss 0.03299783 - time (sec): 94.83 - samples/sec: 1199.22 - lr: 0.000018 - momentum: 0.000000 |
|
2023-09-04 11:28:42,630 epoch 5 - iter 584/738 - loss 0.03455838 - time (sec): 110.35 - samples/sec: 1192.27 - lr: 0.000017 - momentum: 0.000000 |
|
2023-09-04 11:28:58,406 epoch 5 - iter 657/738 - loss 0.03460168 - time (sec): 126.13 - samples/sec: 1187.75 - lr: 0.000017 - momentum: 0.000000 |
|
2023-09-04 11:29:10,450 epoch 5 - iter 730/738 - loss 0.03584338 - time (sec): 138.17 - samples/sec: 1191.65 - lr: 0.000017 - momentum: 0.000000 |
|
2023-09-04 11:29:12,019 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:29:12,019 EPOCH 5 done: loss 0.0358 - lr: 0.000017 |
|
2023-09-04 11:29:29,908 DEV : loss 0.16683056950569153 - f1-score (micro avg) 0.8109 |
|
2023-09-04 11:29:29,937 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:29:42,285 epoch 6 - iter 73/738 - loss 0.02593474 - time (sec): 12.35 - samples/sec: 1201.98 - lr: 0.000016 - momentum: 0.000000 |
|
2023-09-04 11:29:58,445 epoch 6 - iter 146/738 - loss 0.02883744 - time (sec): 28.51 - samples/sec: 1190.61 - lr: 0.000016 - momentum: 0.000000 |
|
2023-09-04 11:30:12,484 epoch 6 - iter 219/738 - loss 0.02750899 - time (sec): 42.55 - samples/sec: 1194.32 - lr: 0.000016 - momentum: 0.000000 |
|
2023-09-04 11:30:25,524 epoch 6 - iter 292/738 - loss 0.02938105 - time (sec): 55.59 - samples/sec: 1192.04 - lr: 0.000015 - momentum: 0.000000 |
|
2023-09-04 11:30:41,579 epoch 6 - iter 365/738 - loss 0.02857489 - time (sec): 71.64 - samples/sec: 1178.72 - lr: 0.000015 - momentum: 0.000000 |
|
2023-09-04 11:30:55,316 epoch 6 - iter 438/738 - loss 0.02981127 - time (sec): 85.38 - samples/sec: 1186.79 - lr: 0.000015 - momentum: 0.000000 |
|
2023-09-04 11:31:07,585 epoch 6 - iter 511/738 - loss 0.02817810 - time (sec): 97.65 - samples/sec: 1194.75 - lr: 0.000014 - momentum: 0.000000 |
|
2023-09-04 11:31:21,427 epoch 6 - iter 584/738 - loss 0.02650218 - time (sec): 111.49 - samples/sec: 1194.80 - lr: 0.000014 - momentum: 0.000000 |
|
2023-09-04 11:31:34,788 epoch 6 - iter 657/738 - loss 0.02642219 - time (sec): 124.85 - samples/sec: 1192.33 - lr: 0.000014 - momentum: 0.000000 |
|
2023-09-04 11:31:48,464 epoch 6 - iter 730/738 - loss 0.02616732 - time (sec): 138.53 - samples/sec: 1189.72 - lr: 0.000013 - momentum: 0.000000 |
|
2023-09-04 11:31:49,692 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:31:49,693 EPOCH 6 done: loss 0.0262 - lr: 0.000013 |
|
2023-09-04 11:32:07,584 DEV : loss 0.18466810882091522 - f1-score (micro avg) 0.8091 |
|
2023-09-04 11:32:07,614 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:32:23,652 epoch 7 - iter 73/738 - loss 0.01307957 - time (sec): 16.04 - samples/sec: 1061.53 - lr: 0.000013 - momentum: 0.000000 |
|
2023-09-04 11:32:35,332 epoch 7 - iter 146/738 - loss 0.01247271 - time (sec): 27.72 - samples/sec: 1148.57 - lr: 0.000013 - momentum: 0.000000 |
|
2023-09-04 11:32:51,009 epoch 7 - iter 219/738 - loss 0.01585589 - time (sec): 43.39 - samples/sec: 1160.31 - lr: 0.000012 - momentum: 0.000000 |
|
2023-09-04 11:33:06,749 epoch 7 - iter 292/738 - loss 0.01732056 - time (sec): 59.13 - samples/sec: 1164.59 - lr: 0.000012 - momentum: 0.000000 |
|
2023-09-04 11:33:18,946 epoch 7 - iter 365/738 - loss 0.01721112 - time (sec): 71.33 - samples/sec: 1173.19 - lr: 0.000012 - momentum: 0.000000 |
|
2023-09-04 11:33:31,567 epoch 7 - iter 438/738 - loss 0.01753197 - time (sec): 83.95 - samples/sec: 1179.95 - lr: 0.000011 - momentum: 0.000000 |
|
2023-09-04 11:33:44,306 epoch 7 - iter 511/738 - loss 0.01824019 - time (sec): 96.69 - samples/sec: 1189.86 - lr: 0.000011 - momentum: 0.000000 |
|
2023-09-04 11:33:57,143 epoch 7 - iter 584/738 - loss 0.01854363 - time (sec): 109.53 - samples/sec: 1191.38 - lr: 0.000011 - momentum: 0.000000 |
|
2023-09-04 11:34:11,077 epoch 7 - iter 657/738 - loss 0.01811224 - time (sec): 123.46 - samples/sec: 1186.22 - lr: 0.000010 - momentum: 0.000000 |
|
2023-09-04 11:34:26,868 epoch 7 - iter 730/738 - loss 0.01862229 - time (sec): 139.25 - samples/sec: 1183.77 - lr: 0.000010 - momentum: 0.000000 |
|
2023-09-04 11:34:28,070 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:34:28,070 EPOCH 7 done: loss 0.0186 - lr: 0.000010 |
|
2023-09-04 11:34:45,827 DEV : loss 0.18642598390579224 - f1-score (micro avg) 0.8208 |
|
2023-09-04 11:34:45,857 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:34:59,191 epoch 8 - iter 73/738 - loss 0.01522610 - time (sec): 13.33 - samples/sec: 1251.51 - lr: 0.000010 - momentum: 0.000000 |
|
2023-09-04 11:35:14,095 epoch 8 - iter 146/738 - loss 0.01519085 - time (sec): 28.24 - samples/sec: 1191.42 - lr: 0.000009 - momentum: 0.000000 |
|
2023-09-04 11:35:30,259 epoch 8 - iter 219/738 - loss 0.01828244 - time (sec): 44.40 - samples/sec: 1189.71 - lr: 0.000009 - momentum: 0.000000 |
|
2023-09-04 11:35:43,080 epoch 8 - iter 292/738 - loss 0.01832505 - time (sec): 57.22 - samples/sec: 1182.77 - lr: 0.000009 - momentum: 0.000000 |
|
2023-09-04 11:35:55,402 epoch 8 - iter 365/738 - loss 0.01661591 - time (sec): 69.54 - samples/sec: 1184.72 - lr: 0.000008 - momentum: 0.000000 |
|
2023-09-04 11:36:08,970 epoch 8 - iter 438/738 - loss 0.01667432 - time (sec): 83.11 - samples/sec: 1185.80 - lr: 0.000008 - momentum: 0.000000 |
|
2023-09-04 11:36:22,551 epoch 8 - iter 511/738 - loss 0.01570984 - time (sec): 96.69 - samples/sec: 1186.58 - lr: 0.000008 - momentum: 0.000000 |
|
2023-09-04 11:36:34,441 epoch 8 - iter 584/738 - loss 0.01509001 - time (sec): 108.58 - samples/sec: 1191.30 - lr: 0.000007 - momentum: 0.000000 |
|
2023-09-04 11:36:47,899 epoch 8 - iter 657/738 - loss 0.01454722 - time (sec): 122.04 - samples/sec: 1190.17 - lr: 0.000007 - momentum: 0.000000 |
|
2023-09-04 11:37:03,823 epoch 8 - iter 730/738 - loss 0.01433091 - time (sec): 137.97 - samples/sec: 1193.89 - lr: 0.000007 - momentum: 0.000000 |
|
2023-09-04 11:37:05,182 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:37:05,183 EPOCH 8 done: loss 0.0142 - lr: 0.000007 |
|
2023-09-04 11:37:22,950 DEV : loss 0.18818210065364838 - f1-score (micro avg) 0.8326 |
|
2023-09-04 11:37:22,979 saving best model |
|
2023-09-04 11:37:24,373 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:37:37,826 epoch 9 - iter 73/738 - loss 0.00383387 - time (sec): 13.45 - samples/sec: 1152.61 - lr: 0.000006 - momentum: 0.000000 |
|
2023-09-04 11:37:52,233 epoch 9 - iter 146/738 - loss 0.00624195 - time (sec): 27.86 - samples/sec: 1160.13 - lr: 0.000006 - momentum: 0.000000 |
|
2023-09-04 11:38:04,879 epoch 9 - iter 219/738 - loss 0.00642501 - time (sec): 40.50 - samples/sec: 1192.56 - lr: 0.000006 - momentum: 0.000000 |
|
2023-09-04 11:38:18,603 epoch 9 - iter 292/738 - loss 0.00735751 - time (sec): 54.23 - samples/sec: 1195.23 - lr: 0.000005 - momentum: 0.000000 |
|
2023-09-04 11:38:33,870 epoch 9 - iter 365/738 - loss 0.00866516 - time (sec): 69.49 - samples/sec: 1191.19 - lr: 0.000005 - momentum: 0.000000 |
|
2023-09-04 11:38:47,019 epoch 9 - iter 438/738 - loss 0.00809663 - time (sec): 82.64 - samples/sec: 1189.03 - lr: 0.000005 - momentum: 0.000000 |
|
2023-09-04 11:39:01,987 epoch 9 - iter 511/738 - loss 0.00809937 - time (sec): 97.61 - samples/sec: 1180.04 - lr: 0.000004 - momentum: 0.000000 |
|
2023-09-04 11:39:14,903 epoch 9 - iter 584/738 - loss 0.00805249 - time (sec): 110.53 - samples/sec: 1178.10 - lr: 0.000004 - momentum: 0.000000 |
|
2023-09-04 11:39:27,998 epoch 9 - iter 657/738 - loss 0.00768456 - time (sec): 123.62 - samples/sec: 1186.33 - lr: 0.000004 - momentum: 0.000000 |
|
2023-09-04 11:39:43,357 epoch 9 - iter 730/738 - loss 0.00940618 - time (sec): 138.98 - samples/sec: 1184.47 - lr: 0.000003 - momentum: 0.000000 |
|
2023-09-04 11:39:45,117 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:39:45,118 EPOCH 9 done: loss 0.0097 - lr: 0.000003 |
|
2023-09-04 11:40:02,942 DEV : loss 0.19703754782676697 - f1-score (micro avg) 0.8283 |
|
2023-09-04 11:40:02,971 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:40:17,925 epoch 10 - iter 73/738 - loss 0.00771900 - time (sec): 14.95 - samples/sec: 1177.95 - lr: 0.000003 - momentum: 0.000000 |
|
2023-09-04 11:40:30,928 epoch 10 - iter 146/738 - loss 0.00713738 - time (sec): 27.95 - samples/sec: 1199.04 - lr: 0.000003 - momentum: 0.000000 |
|
2023-09-04 11:40:42,538 epoch 10 - iter 219/738 - loss 0.00787082 - time (sec): 39.56 - samples/sec: 1237.61 - lr: 0.000002 - momentum: 0.000000 |
|
2023-09-04 11:40:56,818 epoch 10 - iter 292/738 - loss 0.00762140 - time (sec): 53.84 - samples/sec: 1212.75 - lr: 0.000002 - momentum: 0.000000 |
|
2023-09-04 11:41:10,798 epoch 10 - iter 365/738 - loss 0.00753159 - time (sec): 67.83 - samples/sec: 1199.26 - lr: 0.000002 - momentum: 0.000000 |
|
2023-09-04 11:41:26,672 epoch 10 - iter 438/738 - loss 0.00786862 - time (sec): 83.70 - samples/sec: 1195.58 - lr: 0.000001 - momentum: 0.000000 |
|
2023-09-04 11:41:39,151 epoch 10 - iter 511/738 - loss 0.00769068 - time (sec): 96.18 - samples/sec: 1193.13 - lr: 0.000001 - momentum: 0.000000 |
|
2023-09-04 11:41:54,301 epoch 10 - iter 584/738 - loss 0.00768488 - time (sec): 111.33 - samples/sec: 1183.58 - lr: 0.000001 - momentum: 0.000000 |
|
2023-09-04 11:42:08,346 epoch 10 - iter 657/738 - loss 0.00813962 - time (sec): 125.37 - samples/sec: 1182.65 - lr: 0.000000 - momentum: 0.000000 |
|
2023-09-04 11:42:22,788 epoch 10 - iter 730/738 - loss 0.00764454 - time (sec): 139.82 - samples/sec: 1179.55 - lr: 0.000000 - momentum: 0.000000 |
|
2023-09-04 11:42:23,896 ---------------------------------------------------------------------------------------------------- |
|
2023-09-04 11:42:23,897 EPOCH 10 done: loss 0.0076 - lr: 0.000000 |
|
2023-09-04 11:42:41,733 DEV : loss 0.20217673480510712 - f1-score (micro avg) 0.8313 |
|
2023-09-04 11:42:42,259 ----------------------------------------------------------------------------------------------------
2023-09-04 11:42:42,260 Loading model from best epoch ...
2023-09-04 11:42:44,189 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
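The best checkpoint is the one saved after epoch 8 (dev micro-F1 0.8326), stored as best-model.pt under the base path, and its 21 tags form a BIOES scheme over five entity types (loc, pers, org, time, prod) plus O. A usage sketch for the reloaded model; the French sentence is invented for illustration:

```python
# Sketch only: reload the best checkpoint and tag a sentence.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2/best-model.pt"
)

sentence = Sentence("Le Conseil fédéral s'est réuni hier à Berne .")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    # each predicted span carries its entity type (loc, pers, org, time, prod) and a confidence
    print(span.text, span.get_label("ner").value, span.get_label("ner").score)
```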
2023-09-04 11:42:58,808
Results:
- F-score (micro) 0.7992
- F-score (macro) 0.6961
- Accuracy 0.6896

By class:
              precision    recall  f1-score   support

         loc     0.8835    0.8753    0.8794       858
        pers     0.7526    0.8045    0.7777       537
         org     0.4934    0.5682    0.5282       132
        time     0.5303    0.6481    0.5833        54
        prod     0.7368    0.6885    0.7119        61

   micro avg     0.7858    0.8130    0.7992      1642
   macro avg     0.6793    0.7169    0.6961      1642
weighted avg     0.7923    0.8130    0.8019      1642

2023-09-04 11:42:58,808 ----------------------------------------------------------------------------------------------------
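As a quick check, the summary scores can be reproduced from the by-class table: macro F1 is the unweighted mean of the five class F1 scores, weighted F1 is the support-weighted mean, and micro F1 is the harmonic mean of the pooled precision and recall. A small illustrative snippet (not part of the run), using only numbers from the table:

```python
# Illustrative check: reproduce the reported averages from the per-class rows above.
f1 = {"loc": 0.8794, "pers": 0.7777, "org": 0.5282, "time": 0.5833, "prod": 0.7119}
support = {"loc": 858, "pers": 537, "org": 132, "time": 54, "prod": 61}

macro_f1 = sum(f1.values()) / len(f1)
weighted_f1 = sum(f1[c] * support[c] for c in f1) / sum(support.values())

micro_p, micro_r = 0.7858, 0.8130          # pooled precision/recall over all 1642 spans
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)

print(round(macro_f1, 4), round(weighted_f1, 4), round(micro_f1, 4))
# 0.6961 0.8019 0.7992  -> matches the macro avg, weighted avg and micro avg rows
```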