|
2023-10-06 15:36:54,881 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,882 Model: "SequenceTagger(
  (embeddings): ByT5Embeddings(
    (model): T5EncoderModel(
      (shared): Embedding(384, 1472)
      (encoder): T5Stack(
        (embed_tokens): Embedding(384, 1472)
        (block): ModuleList(
          (0): T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                  (relative_attention_bias): Embedding(32, 6)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
          (1-11): 11 x T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
        )
        (final_layer_norm): T5LayerNorm()
        (dropout): Dropout(p=0.1, inplace=False)
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1472, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
|
2023-10-06 15:36:54,882 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,882 MultiCorpus: 1214 train + 266 dev + 251 test sentences |
|
- NER_HIPE_2022 Corpus: 1214 train + 266 dev + 251 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/ajmc/en/with_doc_seperator |
|
2023-10-06 15:36:54,882 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,882 Train: 1214 sentences |
|
2023-10-06 15:36:54,882 (train_with_dev=False, train_with_test=False) |
|
2023-10-06 15:36:54,883 ---------------------------------------------------------------------------------------------------- |
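The same split can be loaded with Flair's built-in HIPE-2022 loader; a minimal sketch, with the document-separator keyword assumed from the cache path above and label_type="ner" assumed for the tag dictionary:

# Loads the AJMC English split of HIPE-2022 (1214 train / 266 dev / 251 test sentences).
from flair.datasets import NER_HIPE_2022

corpus = NER_HIPE_2022(
    dataset_name="ajmc",
    language="en",
    add_document_separator=True,   # assumed keyword; mirrors ".../with_doc_seperator"
)
print(corpus)

# 25-entry dictionary feeding the linear output layer of the tagger above
label_dict = corpus.make_label_dictionary(label_type="ner")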
|
2023-10-06 15:36:54,883 Training Params: |
|
2023-10-06 15:36:54,883 - learning_rate: "0.00016" |
|
2023-10-06 15:36:54,883 - mini_batch_size: "4" |
|
2023-10-06 15:36:54,883 - max_epochs: "10" |
|
2023-10-06 15:36:54,883 - shuffle: "True" |
|
2023-10-06 15:36:54,883 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,883 Plugins: |
|
2023-10-06 15:36:54,883 - TensorboardLogger |
|
2023-10-06 15:36:54,883 - LinearScheduler | warmup_fraction: '0.1' |
|
2023-10-06 15:36:54,883 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,883 Final evaluation on model from best epoch (best-model.pt) |
|
2023-10-06 15:36:54,883 - metric: "('micro avg', 'f1-score')" |
|
2023-10-06 15:36:54,883 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,883 Computation: |
|
2023-10-06 15:36:54,883 - compute on device: cuda:0 |
|
2023-10-06 15:36:54,883 - embedding storage: none |
|
2023-10-06 15:36:54,884 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,884 Model training base path: "hmbench-ajmc/en-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5" |
|
2023-10-06 15:36:54,884 ---------------------------------------------------------------------------------------------------- |
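Putting the pieces together, a hedged sketch of the fine-tuning call that would reproduce the parameters, plugins, evaluation metric and base path logged above. trainer.fine_tune applies the linear warmup/decay schedule itself (the "LinearScheduler | warmup_fraction: 0.1" entry) and keeps the checkpoint with the best dev micro-F1 as best-model.pt; the TensorBoard plugin wiring differs across Flair versions and is omitted here.

# Assumes the tagger and corpus objects from the sketches above.
from flair.trainers import ModelTrainer

base_path = (
    "hmbench-ajmc/en-hmbyt5-preliminary/"
    "byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5"
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    base_path,
    learning_rate=0.00016,
    mini_batch_size=4,
    max_epochs=10,
    shuffle=True,
    embeddings_storage_mode="none",                    # "embedding storage: none" above
    main_evaluation_metric=("micro avg", "f1-score"),  # used to pick best-model.pt
)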
|
2023-10-06 15:36:54,884 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:36:54,884 Logging anything other than scalars to TensorBoard is currently not supported. |
|
2023-10-06 15:37:06,733 epoch 1 - iter 30/304 - loss 3.23322094 - time (sec): 11.85 - samples/sec: 269.84 - lr: 0.000015 - momentum: 0.000000 |
|
2023-10-06 15:37:18,012 epoch 1 - iter 60/304 - loss 3.22188262 - time (sec): 23.13 - samples/sec: 266.48 - lr: 0.000031 - momentum: 0.000000 |
|
2023-10-06 15:37:29,580 epoch 1 - iter 90/304 - loss 3.19289241 - time (sec): 34.69 - samples/sec: 258.25 - lr: 0.000047 - momentum: 0.000000 |
|
2023-10-06 15:37:41,327 epoch 1 - iter 120/304 - loss 3.12149688 - time (sec): 46.44 - samples/sec: 256.97 - lr: 0.000063 - momentum: 0.000000 |
|
2023-10-06 15:37:53,113 epoch 1 - iter 150/304 - loss 3.01764722 - time (sec): 58.23 - samples/sec: 255.74 - lr: 0.000078 - momentum: 0.000000 |
|
2023-10-06 15:38:05,859 epoch 1 - iter 180/304 - loss 2.88619671 - time (sec): 70.97 - samples/sec: 258.03 - lr: 0.000094 - momentum: 0.000000 |
|
2023-10-06 15:38:16,746 epoch 1 - iter 210/304 - loss 2.77838030 - time (sec): 81.86 - samples/sec: 254.96 - lr: 0.000110 - momentum: 0.000000 |
|
2023-10-06 15:38:29,102 epoch 1 - iter 240/304 - loss 2.63128465 - time (sec): 94.22 - samples/sec: 255.26 - lr: 0.000126 - momentum: 0.000000 |
|
2023-10-06 15:38:41,231 epoch 1 - iter 270/304 - loss 2.47848072 - time (sec): 106.35 - samples/sec: 255.99 - lr: 0.000142 - momentum: 0.000000 |
|
2023-10-06 15:38:53,817 epoch 1 - iter 300/304 - loss 2.32263470 - time (sec): 118.93 - samples/sec: 257.03 - lr: 0.000157 - momentum: 0.000000 |
|
2023-10-06 15:38:55,380 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:38:55,380 EPOCH 1 done: loss 2.3027 - lr: 0.000157 |
|
2023-10-06 15:39:03,100 DEV : loss 0.8800114989280701 - f1-score (micro avg) 0.0 |
|
2023-10-06 15:39:03,107 ---------------------------------------------------------------------------------------------------- |
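The logged learning rates match the linear schedule declared above: 10 epochs x 304 mini-batches = 3040 steps, and warmup_fraction 0.1 means roughly 304 warmup steps, so the peak of 0.00016 is reached right at the end of epoch 1 (lr 0.000157 at iter 300) and then decays linearly towards 0. A small arithmetic sketch, assuming a plain linear warmup / linear decay (the exact Flair scheduler may differ by a step or two):

# Linear warmup to the peak lr over the first 10% of steps, then linear decay to 0.
def lr_at(step: int, peak: float = 0.00016, total: int = 3040, warmup_frac: float = 0.1) -> float:
    warmup = int(total * warmup_frac)                # 304 steps, i.e. roughly epoch 1
    if step < warmup:
        return peak * step / warmup                  # ramp up during epoch 1
    return peak * (total - step) / (total - warmup)  # decay to 0 by the last step

print(round(lr_at(300), 6))    # ~0.000158, close to the logged 0.000157
print(round(lr_at(3040), 6))   # 0.0 at the final step of epoch 10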
|
2023-10-06 15:39:15,065 epoch 2 - iter 30/304 - loss 0.78322681 - time (sec): 11.96 - samples/sec: 256.26 - lr: 0.000158 - momentum: 0.000000 |
|
2023-10-06 15:39:26,739 epoch 2 - iter 60/304 - loss 0.72767057 - time (sec): 23.63 - samples/sec: 258.01 - lr: 0.000157 - momentum: 0.000000 |
|
2023-10-06 15:39:39,354 epoch 2 - iter 90/304 - loss 0.71491699 - time (sec): 36.25 - samples/sec: 256.99 - lr: 0.000155 - momentum: 0.000000 |
|
2023-10-06 15:39:51,537 epoch 2 - iter 120/304 - loss 0.66298949 - time (sec): 48.43 - samples/sec: 258.50 - lr: 0.000153 - momentum: 0.000000 |
|
2023-10-06 15:40:01,824 epoch 2 - iter 150/304 - loss 0.62149707 - time (sec): 58.72 - samples/sec: 256.76 - lr: 0.000151 - momentum: 0.000000 |
|
2023-10-06 15:40:12,871 epoch 2 - iter 180/304 - loss 0.57555139 - time (sec): 69.76 - samples/sec: 259.55 - lr: 0.000150 - momentum: 0.000000 |
|
2023-10-06 15:40:24,343 epoch 2 - iter 210/304 - loss 0.55783175 - time (sec): 81.23 - samples/sec: 260.05 - lr: 0.000148 - momentum: 0.000000 |
|
2023-10-06 15:40:35,599 epoch 2 - iter 240/304 - loss 0.52723917 - time (sec): 92.49 - samples/sec: 262.39 - lr: 0.000146 - momentum: 0.000000 |
|
2023-10-06 15:40:47,204 epoch 2 - iter 270/304 - loss 0.50036839 - time (sec): 104.10 - samples/sec: 264.01 - lr: 0.000144 - momentum: 0.000000 |
|
2023-10-06 15:40:58,458 epoch 2 - iter 300/304 - loss 0.47802664 - time (sec): 115.35 - samples/sec: 265.43 - lr: 0.000143 - momentum: 0.000000 |
|
2023-10-06 15:40:59,845 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:40:59,845 EPOCH 2 done: loss 0.4776 - lr: 0.000143 |
|
2023-10-06 15:41:06,852 DEV : loss 0.3180970847606659 - f1-score (micro avg) 0.4756 |
|
2023-10-06 15:41:06,858 saving best model |
|
2023-10-06 15:41:07,687 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:41:18,649 epoch 3 - iter 30/304 - loss 0.31951679 - time (sec): 10.96 - samples/sec: 263.60 - lr: 0.000141 - momentum: 0.000000 |
|
2023-10-06 15:41:29,646 epoch 3 - iter 60/304 - loss 0.27399777 - time (sec): 21.96 - samples/sec: 266.88 - lr: 0.000139 - momentum: 0.000000 |
|
2023-10-06 15:41:40,074 epoch 3 - iter 90/304 - loss 0.24835641 - time (sec): 32.38 - samples/sec: 265.71 - lr: 0.000137 - momentum: 0.000000 |
|
2023-10-06 15:41:51,842 epoch 3 - iter 120/304 - loss 0.24689103 - time (sec): 44.15 - samples/sec: 271.15 - lr: 0.000135 - momentum: 0.000000 |
|
2023-10-06 15:42:02,829 epoch 3 - iter 150/304 - loss 0.23461259 - time (sec): 55.14 - samples/sec: 270.08 - lr: 0.000134 - momentum: 0.000000 |
|
2023-10-06 15:42:14,182 epoch 3 - iter 180/304 - loss 0.23128760 - time (sec): 66.49 - samples/sec: 270.04 - lr: 0.000132 - momentum: 0.000000 |
|
2023-10-06 15:42:25,602 epoch 3 - iter 210/304 - loss 0.22156764 - time (sec): 77.91 - samples/sec: 270.77 - lr: 0.000130 - momentum: 0.000000 |
|
2023-10-06 15:42:37,128 epoch 3 - iter 240/304 - loss 0.21366743 - time (sec): 89.44 - samples/sec: 271.39 - lr: 0.000128 - momentum: 0.000000 |
|
2023-10-06 15:42:48,275 epoch 3 - iter 270/304 - loss 0.20493924 - time (sec): 100.59 - samples/sec: 271.47 - lr: 0.000127 - momentum: 0.000000 |
|
2023-10-06 15:42:59,999 epoch 3 - iter 300/304 - loss 0.19796880 - time (sec): 112.31 - samples/sec: 271.94 - lr: 0.000125 - momentum: 0.000000 |
|
2023-10-06 15:43:01,558 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:43:01,558 EPOCH 3 done: loss 0.1975 - lr: 0.000125 |
|
2023-10-06 15:43:08,750 DEV : loss 0.184345543384552 - f1-score (micro avg) 0.6873 |
|
2023-10-06 15:43:08,757 saving best model |
|
2023-10-06 15:43:13,063 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:43:24,872 epoch 4 - iter 30/304 - loss 0.15311988 - time (sec): 11.81 - samples/sec: 268.15 - lr: 0.000123 - momentum: 0.000000 |
|
2023-10-06 15:43:35,640 epoch 4 - iter 60/304 - loss 0.14346239 - time (sec): 22.57 - samples/sec: 262.24 - lr: 0.000121 - momentum: 0.000000 |
|
2023-10-06 15:43:46,925 epoch 4 - iter 90/304 - loss 0.13496503 - time (sec): 33.86 - samples/sec: 264.56 - lr: 0.000119 - momentum: 0.000000 |
|
2023-10-06 15:43:58,485 epoch 4 - iter 120/304 - loss 0.13053640 - time (sec): 45.42 - samples/sec: 264.62 - lr: 0.000118 - momentum: 0.000000 |
|
2023-10-06 15:44:10,259 epoch 4 - iter 150/304 - loss 0.12961165 - time (sec): 57.19 - samples/sec: 264.22 - lr: 0.000116 - momentum: 0.000000 |
|
2023-10-06 15:44:21,869 epoch 4 - iter 180/304 - loss 0.12156178 - time (sec): 68.80 - samples/sec: 264.02 - lr: 0.000114 - momentum: 0.000000 |
|
2023-10-06 15:44:33,279 epoch 4 - iter 210/304 - loss 0.11898423 - time (sec): 80.21 - samples/sec: 262.79 - lr: 0.000112 - momentum: 0.000000 |
|
2023-10-06 15:44:45,927 epoch 4 - iter 240/304 - loss 0.11525286 - time (sec): 92.86 - samples/sec: 265.18 - lr: 0.000111 - momentum: 0.000000 |
|
2023-10-06 15:44:58,587 epoch 4 - iter 270/304 - loss 0.11334956 - time (sec): 105.52 - samples/sec: 264.46 - lr: 0.000109 - momentum: 0.000000 |
|
2023-10-06 15:45:09,784 epoch 4 - iter 300/304 - loss 0.10875073 - time (sec): 116.72 - samples/sec: 262.60 - lr: 0.000107 - momentum: 0.000000 |
|
2023-10-06 15:45:11,182 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:45:11,183 EPOCH 4 done: loss 0.1081 - lr: 0.000107 |
|
2023-10-06 15:45:19,184 DEV : loss 0.1357320249080658 - f1-score (micro avg) 0.816 |
|
2023-10-06 15:45:19,192 saving best model |
|
2023-10-06 15:45:23,511 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:45:35,491 epoch 5 - iter 30/304 - loss 0.04930699 - time (sec): 11.98 - samples/sec: 256.70 - lr: 0.000105 - momentum: 0.000000 |
|
2023-10-06 15:45:47,151 epoch 5 - iter 60/304 - loss 0.07126682 - time (sec): 23.64 - samples/sec: 256.67 - lr: 0.000103 - momentum: 0.000000 |
|
2023-10-06 15:45:59,609 epoch 5 - iter 90/304 - loss 0.07174582 - time (sec): 36.10 - samples/sec: 256.34 - lr: 0.000102 - momentum: 0.000000 |
|
2023-10-06 15:46:11,788 epoch 5 - iter 120/304 - loss 0.06935646 - time (sec): 48.27 - samples/sec: 257.19 - lr: 0.000100 - momentum: 0.000000 |
|
2023-10-06 15:46:24,076 epoch 5 - iter 150/304 - loss 0.07373860 - time (sec): 60.56 - samples/sec: 257.80 - lr: 0.000098 - momentum: 0.000000 |
|
2023-10-06 15:46:35,831 epoch 5 - iter 180/304 - loss 0.06852216 - time (sec): 72.32 - samples/sec: 256.45 - lr: 0.000096 - momentum: 0.000000 |
|
2023-10-06 15:46:48,151 epoch 5 - iter 210/304 - loss 0.06505765 - time (sec): 84.64 - samples/sec: 256.89 - lr: 0.000094 - momentum: 0.000000 |
|
2023-10-06 15:47:00,070 epoch 5 - iter 240/304 - loss 0.06581915 - time (sec): 96.56 - samples/sec: 257.94 - lr: 0.000093 - momentum: 0.000000 |
|
2023-10-06 15:47:11,797 epoch 5 - iter 270/304 - loss 0.06376206 - time (sec): 108.28 - samples/sec: 257.20 - lr: 0.000091 - momentum: 0.000000 |
|
2023-10-06 15:47:23,426 epoch 5 - iter 300/304 - loss 0.06521827 - time (sec): 119.91 - samples/sec: 256.54 - lr: 0.000089 - momentum: 0.000000 |
|
2023-10-06 15:47:24,581 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:47:24,581 EPOCH 5 done: loss 0.0649 - lr: 0.000089 |
|
2023-10-06 15:47:32,312 DEV : loss 0.14172478020191193 - f1-score (micro avg) 0.7991 |
|
2023-10-06 15:47:32,318 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:47:44,395 epoch 6 - iter 30/304 - loss 0.06900995 - time (sec): 12.08 - samples/sec: 263.68 - lr: 0.000087 - momentum: 0.000000 |
|
2023-10-06 15:47:56,682 epoch 6 - iter 60/304 - loss 0.06261892 - time (sec): 24.36 - samples/sec: 264.26 - lr: 0.000085 - momentum: 0.000000 |
|
2023-10-06 15:48:09,164 epoch 6 - iter 90/304 - loss 0.05549560 - time (sec): 36.84 - samples/sec: 264.22 - lr: 0.000084 - momentum: 0.000000 |
|
2023-10-06 15:48:21,345 epoch 6 - iter 120/304 - loss 0.05286815 - time (sec): 49.03 - samples/sec: 261.99 - lr: 0.000082 - momentum: 0.000000 |
|
2023-10-06 15:48:33,228 epoch 6 - iter 150/304 - loss 0.04820134 - time (sec): 60.91 - samples/sec: 261.39 - lr: 0.000080 - momentum: 0.000000 |
|
2023-10-06 15:48:44,239 epoch 6 - iter 180/304 - loss 0.04438212 - time (sec): 71.92 - samples/sec: 258.41 - lr: 0.000078 - momentum: 0.000000 |
|
2023-10-06 15:48:56,652 epoch 6 - iter 210/304 - loss 0.04595305 - time (sec): 84.33 - samples/sec: 258.91 - lr: 0.000077 - momentum: 0.000000 |
|
2023-10-06 15:49:08,204 epoch 6 - iter 240/304 - loss 0.04631832 - time (sec): 95.88 - samples/sec: 257.53 - lr: 0.000075 - momentum: 0.000000 |
|
2023-10-06 15:49:20,203 epoch 6 - iter 270/304 - loss 0.04490989 - time (sec): 107.88 - samples/sec: 255.86 - lr: 0.000073 - momentum: 0.000000 |
|
2023-10-06 15:49:32,333 epoch 6 - iter 300/304 - loss 0.04721332 - time (sec): 120.01 - samples/sec: 255.61 - lr: 0.000071 - momentum: 0.000000 |
|
2023-10-06 15:49:33,641 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:49:33,641 EPOCH 6 done: loss 0.0484 - lr: 0.000071 |
|
2023-10-06 15:49:41,541 DEV : loss 0.14471642673015594 - f1-score (micro avg) 0.8208 |
|
2023-10-06 15:49:41,548 saving best model |
|
2023-10-06 15:49:45,868 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:49:57,395 epoch 7 - iter 30/304 - loss 0.03112703 - time (sec): 11.53 - samples/sec: 248.84 - lr: 0.000069 - momentum: 0.000000 |
|
2023-10-06 15:50:09,035 epoch 7 - iter 60/304 - loss 0.03500998 - time (sec): 23.16 - samples/sec: 248.39 - lr: 0.000068 - momentum: 0.000000 |
|
2023-10-06 15:50:21,004 epoch 7 - iter 90/304 - loss 0.02979522 - time (sec): 35.13 - samples/sec: 249.33 - lr: 0.000066 - momentum: 0.000000 |
|
2023-10-06 15:50:32,990 epoch 7 - iter 120/304 - loss 0.03007276 - time (sec): 47.12 - samples/sec: 251.04 - lr: 0.000064 - momentum: 0.000000 |
|
2023-10-06 15:50:44,973 epoch 7 - iter 150/304 - loss 0.02904381 - time (sec): 59.10 - samples/sec: 250.68 - lr: 0.000062 - momentum: 0.000000 |
|
2023-10-06 15:50:57,542 epoch 7 - iter 180/304 - loss 0.03211479 - time (sec): 71.67 - samples/sec: 252.55 - lr: 0.000061 - momentum: 0.000000 |
|
2023-10-06 15:51:09,551 epoch 7 - iter 210/304 - loss 0.03134543 - time (sec): 83.68 - samples/sec: 253.38 - lr: 0.000059 - momentum: 0.000000 |
|
2023-10-06 15:51:21,545 epoch 7 - iter 240/304 - loss 0.03216970 - time (sec): 95.67 - samples/sec: 252.58 - lr: 0.000057 - momentum: 0.000000 |
|
2023-10-06 15:51:34,108 epoch 7 - iter 270/304 - loss 0.03210527 - time (sec): 108.24 - samples/sec: 254.12 - lr: 0.000055 - momentum: 0.000000 |
|
2023-10-06 15:51:46,256 epoch 7 - iter 300/304 - loss 0.03929255 - time (sec): 120.39 - samples/sec: 254.28 - lr: 0.000054 - momentum: 0.000000 |
|
2023-10-06 15:51:47,718 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:51:47,719 EPOCH 7 done: loss 0.0388 - lr: 0.000054 |
|
2023-10-06 15:51:55,710 DEV : loss 0.15623363852500916 - f1-score (micro avg) 0.8316 |
|
2023-10-06 15:51:55,718 saving best model |
|
2023-10-06 15:52:00,038 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:52:12,017 epoch 8 - iter 30/304 - loss 0.04538668 - time (sec): 11.98 - samples/sec: 252.89 - lr: 0.000052 - momentum: 0.000000 |
|
2023-10-06 15:52:23,734 epoch 8 - iter 60/304 - loss 0.04340644 - time (sec): 23.69 - samples/sec: 252.12 - lr: 0.000050 - momentum: 0.000000 |
|
2023-10-06 15:52:35,070 epoch 8 - iter 90/304 - loss 0.03841906 - time (sec): 35.03 - samples/sec: 248.27 - lr: 0.000048 - momentum: 0.000000 |
|
2023-10-06 15:52:47,018 epoch 8 - iter 120/304 - loss 0.03922015 - time (sec): 46.98 - samples/sec: 250.61 - lr: 0.000046 - momentum: 0.000000 |
|
2023-10-06 15:52:59,419 epoch 8 - iter 150/304 - loss 0.03629845 - time (sec): 59.38 - samples/sec: 252.41 - lr: 0.000045 - momentum: 0.000000 |
|
2023-10-06 15:53:11,296 epoch 8 - iter 180/304 - loss 0.03414241 - time (sec): 71.26 - samples/sec: 251.94 - lr: 0.000043 - momentum: 0.000000 |
|
2023-10-06 15:53:23,608 epoch 8 - iter 210/304 - loss 0.03144849 - time (sec): 83.57 - samples/sec: 253.58 - lr: 0.000041 - momentum: 0.000000 |
|
2023-10-06 15:53:36,172 epoch 8 - iter 240/304 - loss 0.03112508 - time (sec): 96.13 - samples/sec: 255.89 - lr: 0.000039 - momentum: 0.000000 |
|
2023-10-06 15:53:48,275 epoch 8 - iter 270/304 - loss 0.03257122 - time (sec): 108.24 - samples/sec: 256.62 - lr: 0.000038 - momentum: 0.000000 |
|
2023-10-06 15:53:59,628 epoch 8 - iter 300/304 - loss 0.03300308 - time (sec): 119.59 - samples/sec: 255.66 - lr: 0.000036 - momentum: 0.000000 |
|
2023-10-06 15:54:01,265 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:54:01,266 EPOCH 8 done: loss 0.0329 - lr: 0.000036 |
|
2023-10-06 15:54:09,139 DEV : loss 0.15997229516506195 - f1-score (micro avg) 0.8379 |
|
2023-10-06 15:54:09,147 saving best model |
|
2023-10-06 15:54:13,485 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:54:25,850 epoch 9 - iter 30/304 - loss 0.03849688 - time (sec): 12.36 - samples/sec: 273.29 - lr: 0.000034 - momentum: 0.000000 |
|
2023-10-06 15:54:37,879 epoch 9 - iter 60/304 - loss 0.03428552 - time (sec): 24.39 - samples/sec: 269.06 - lr: 0.000032 - momentum: 0.000000 |
|
2023-10-06 15:54:49,432 epoch 9 - iter 90/304 - loss 0.03229835 - time (sec): 35.95 - samples/sec: 264.90 - lr: 0.000030 - momentum: 0.000000 |
|
2023-10-06 15:55:01,065 epoch 9 - iter 120/304 - loss 0.02975385 - time (sec): 47.58 - samples/sec: 260.62 - lr: 0.000029 - momentum: 0.000000 |
|
2023-10-06 15:55:13,022 epoch 9 - iter 150/304 - loss 0.02627761 - time (sec): 59.54 - samples/sec: 257.51 - lr: 0.000027 - momentum: 0.000000 |
|
2023-10-06 15:55:25,259 epoch 9 - iter 180/304 - loss 0.02710223 - time (sec): 71.77 - samples/sec: 257.30 - lr: 0.000025 - momentum: 0.000000 |
|
2023-10-06 15:55:37,771 epoch 9 - iter 210/304 - loss 0.02532150 - time (sec): 84.28 - samples/sec: 257.25 - lr: 0.000023 - momentum: 0.000000 |
|
2023-10-06 15:55:49,592 epoch 9 - iter 240/304 - loss 0.02867009 - time (sec): 96.11 - samples/sec: 255.81 - lr: 0.000022 - momentum: 0.000000 |
|
2023-10-06 15:56:01,310 epoch 9 - iter 270/304 - loss 0.02617301 - time (sec): 107.82 - samples/sec: 255.88 - lr: 0.000020 - momentum: 0.000000 |
|
2023-10-06 15:56:13,354 epoch 9 - iter 300/304 - loss 0.02569570 - time (sec): 119.87 - samples/sec: 255.38 - lr: 0.000018 - momentum: 0.000000 |
|
2023-10-06 15:56:14,735 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:56:14,736 EPOCH 9 done: loss 0.0263 - lr: 0.000018 |
|
2023-10-06 15:56:22,592 DEV : loss 0.16357897222042084 - f1-score (micro avg) 0.8443 |
|
2023-10-06 15:56:22,599 saving best model |
|
2023-10-06 15:56:26,917 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:56:38,528 epoch 10 - iter 30/304 - loss 0.03508030 - time (sec): 11.61 - samples/sec: 247.56 - lr: 0.000016 - momentum: 0.000000 |
|
2023-10-06 15:56:51,462 epoch 10 - iter 60/304 - loss 0.03055439 - time (sec): 24.54 - samples/sec: 259.78 - lr: 0.000014 - momentum: 0.000000 |
|
2023-10-06 15:57:03,159 epoch 10 - iter 90/304 - loss 0.02707793 - time (sec): 36.24 - samples/sec: 258.47 - lr: 0.000013 - momentum: 0.000000 |
|
2023-10-06 15:57:14,883 epoch 10 - iter 120/304 - loss 0.02382844 - time (sec): 47.96 - samples/sec: 257.65 - lr: 0.000011 - momentum: 0.000000 |
|
2023-10-06 15:57:26,168 epoch 10 - iter 150/304 - loss 0.02697956 - time (sec): 59.25 - samples/sec: 255.31 - lr: 0.000009 - momentum: 0.000000 |
|
2023-10-06 15:57:38,175 epoch 10 - iter 180/304 - loss 0.02428988 - time (sec): 71.26 - samples/sec: 256.39 - lr: 0.000007 - momentum: 0.000000 |
|
2023-10-06 15:57:49,550 epoch 10 - iter 210/304 - loss 0.02197756 - time (sec): 82.63 - samples/sec: 255.17 - lr: 0.000006 - momentum: 0.000000 |
|
2023-10-06 15:58:01,485 epoch 10 - iter 240/304 - loss 0.02195724 - time (sec): 94.57 - samples/sec: 257.10 - lr: 0.000004 - momentum: 0.000000 |
|
2023-10-06 15:58:13,044 epoch 10 - iter 270/304 - loss 0.02154227 - time (sec): 106.13 - samples/sec: 259.09 - lr: 0.000002 - momentum: 0.000000 |
|
2023-10-06 15:58:24,617 epoch 10 - iter 300/304 - loss 0.02330141 - time (sec): 117.70 - samples/sec: 261.07 - lr: 0.000000 - momentum: 0.000000 |
|
2023-10-06 15:58:25,767 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:58:25,767 EPOCH 10 done: loss 0.0240 - lr: 0.000000 |
|
2023-10-06 15:58:32,973 DEV : loss 0.16605204343795776 - f1-score (micro avg) 0.8426 |
|
2023-10-06 15:58:33,817 ---------------------------------------------------------------------------------------------------- |
|
2023-10-06 15:58:33,819 Loading model from best epoch ... |
|
2023-10-06 15:58:37,445 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-date, B-date, E-date, I-date, S-object, B-object, E-object, I-object |
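A short sketch of loading the saved checkpoint and tagging a sentence with it; the example sentence is purely illustrative, and the tag set is the BIOES dictionary printed above.

# Load the checkpoint selected on dev micro-F1 and tag one sentence.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "hmbench-ajmc/en-hmbyt5-preliminary/"
    "byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5/"
    "best-model.pt"
)

sentence = Sentence("Sophocles wrote the Ajax .")   # illustrative example only
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value, round(span.get_label("ner").score, 3))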
|
2023-10-06 15:58:44,032
Results:
- F-score (micro) 0.8
- F-score (macro) 0.6473
- Accuracy 0.6744

By class:
              precision    recall  f1-score   support

       scope     0.7372    0.7616    0.7492       151
        pers     0.8198    0.9479    0.8792        96
        work     0.7593    0.8632    0.8079        95
         loc     1.0000    0.6667    0.8000         3
        date     0.0000    0.0000    0.0000         3

   micro avg     0.7692    0.8333    0.8000       348
   macro avg     0.6633    0.6479    0.6473       348
weighted avg     0.7619    0.8333    0.7951       348
|
2023-10-06 15:58:44,032 ---------------------------------------------------------------------------------------------------- |
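The report above can be reproduced by evaluating the best checkpoint on the test split; a minimal sketch, assuming the tagger and corpus objects from the earlier sketches:

# Re-run the final test evaluation (micro-F1 0.80 above).
result = tagger.evaluate(
    corpus.test,
    gold_label_type="ner",
    mini_batch_size=4,
)
print(result.main_score)        # micro-averaged F1
print(result.detailed_results)  # per-class precision / recall / F1 table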
|
|