2023-10-06 13:49:38,621 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,623 Model: "SequenceTagger(
  (embeddings): ByT5Embeddings(
    (model): T5EncoderModel(
      (shared): Embedding(384, 1472)
      (encoder): T5Stack(
        (embed_tokens): Embedding(384, 1472)
        (block): ModuleList(
          (0): T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                  (relative_attention_bias): Embedding(32, 6)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
          (1-11): 11 x T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
        )
        (final_layer_norm): T5LayerNorm()
        (dropout): Dropout(p=0.1, inplace=False)
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1472, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-06 13:49:38,623 ----------------------------------------------------------------------------------------------------
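The module dump above can be reproduced roughly by the following minimal sketch of an equivalent tagger in Flair. This is an illustration, not the exact benchmark script: the checkpoint id is inferred from the training base path further below, the standard TransformerWordEmbeddings class stands in for the custom ByT5Embeddings wrapper shown in the dump, and the NER_HIPE_2022 constructor arguments and the "ner" label type are assumptions.

# Hedged sketch: assembling a comparable SequenceTagger in Flair (Python).
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# Load the AJMC English split of HIPE-2022 (arguments assumed).
corpus = NER_HIPE_2022(dataset_name="ajmc", language="en")
label_dictionary = corpus.make_label_dictionary(label_type="ner")

# Stand-in for the ByT5Embeddings wrapper: a ByT5 encoder used as word embeddings.
# The checkpoint id is inferred from the run name; "poolingfirst" and "layers-1"
# in that run name map to the pooling/layers arguments below.
embeddings = TransformerWordEmbeddings(
    model="hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# RNN and CRF disabled, matching the plain Linear(1472 -> 25) decoder and
# "crfFalse" in the run name; LockedDropout(p=0.5) is Flair's default.
tagger = SequenceTagger(
    embeddings=embeddings,
    tag_dictionary=label_dictionary,
    tag_type="ner",
    hidden_size=256,
    use_rnn=False,
    use_crf=False,
)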
2023-10-06 13:49:38,623 MultiCorpus: 1214 train + 266 dev + 251 test sentences
 - NER_HIPE_2022 Corpus: 1214 train + 266 dev + 251 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/ajmc/en/with_doc_seperator
2023-10-06 13:49:38,623 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,623 Train: 1214 sentences
2023-10-06 13:49:38,623 (train_with_dev=False, train_with_test=False)
2023-10-06 13:49:38,623 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,623 Training Params:
2023-10-06 13:49:38,623 - learning_rate: "0.00015"
2023-10-06 13:49:38,623 - mini_batch_size: "4"
2023-10-06 13:49:38,623 - max_epochs: "10"
2023-10-06 13:49:38,623 - shuffle: "True"
2023-10-06 13:49:38,624 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,624 Plugins:
2023-10-06 13:49:38,624 - TensorboardLogger
2023-10-06 13:49:38,624 - LinearScheduler | warmup_fraction: '0.1'
2023-10-06 13:49:38,624 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,624 Final evaluation on model from best epoch (best-model.pt)
2023-10-06 13:49:38,624 - metric: "('micro avg', 'f1-score')"
2023-10-06 13:49:38,624 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,624 Computation:
2023-10-06 13:49:38,624 - compute on device: cuda:0
2023-10-06 13:49:38,624 - embedding storage: none
2023-10-06 13:49:38,624 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,624 Model training base path: "hmbench-ajmc/en-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4"
2023-10-06 13:49:38,624 ----------------------------------------------------------------------------------------------------
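The hyperparameters, scheduler plugin, evaluation metric and base path logged above correspond to a Flair fine-tuning call roughly like the sketch below. It is a hedged reconstruction, not the benchmark's actual script: the keyword arguments reflect recent Flair versions, and the linear warmup schedule and TensorBoard logging are assumed to come from fine_tune's defaults and the benchmark harness, respectively.

# Hedged sketch: launching the run with Flair's ModelTrainer (Python).
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)  # tagger/corpus as in the sketch above

trainer.fine_tune(
    # Base path copied from the log; checkpoints such as best-model.pt land here.
    "hmbench-ajmc/en-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4",
    learning_rate=0.00015,
    mini_batch_size=4,
    max_epochs=10,
    shuffle=True,
    # Model selection uses micro-averaged F1 on the dev split, as logged above.
    main_evaluation_metric=("micro avg", "f1-score"),
    # Assumption: the LinearScheduler with warmup_fraction 0.1 seen in the log is
    # fine_tune's default linear schedule with 10% warmup; the TensorboardLogger
    # plugin is attached separately by the benchmark harness.
)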
2023-10-06 13:49:38,624 ----------------------------------------------------------------------------------------------------
2023-10-06 13:49:38,624 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-06 13:49:49,767 epoch 1 - iter 30/304 - loss 3.24818326 - time (sec): 11.14 - samples/sec: 275.00 - lr: 0.000014 - momentum: 0.000000
2023-10-06 13:50:01,244 epoch 1 - iter 60/304 - loss 3.23760645 - time (sec): 22.62 - samples/sec: 272.88 - lr: 0.000029 - momentum: 0.000000
2023-10-06 13:50:12,396 epoch 1 - iter 90/304 - loss 3.21498944 - time (sec): 33.77 - samples/sec: 271.13 - lr: 0.000044 - momentum: 0.000000
2023-10-06 13:50:23,740 epoch 1 - iter 120/304 - loss 3.14742015 - time (sec): 45.11 - samples/sec: 271.49 - lr: 0.000059 - momentum: 0.000000
2023-10-06 13:50:35,173 epoch 1 - iter 150/304 - loss 3.03700319 - time (sec): 56.55 - samples/sec: 274.39 - lr: 0.000074 - momentum: 0.000000
2023-10-06 13:50:46,446 epoch 1 - iter 180/304 - loss 2.92856705 - time (sec): 67.82 - samples/sec: 273.86 - lr: 0.000088 - momentum: 0.000000
2023-10-06 13:50:57,374 epoch 1 - iter 210/304 - loss 2.80752085 - time (sec): 78.75 - samples/sec: 273.01 - lr: 0.000103 - momentum: 0.000000
2023-10-06 13:51:08,505 epoch 1 - iter 240/304 - loss 2.68047368 - time (sec): 89.88 - samples/sec: 271.54 - lr: 0.000118 - momentum: 0.000000
2023-10-06 13:51:20,364 epoch 1 - iter 270/304 - loss 2.52753778 - time (sec): 101.74 - samples/sec: 272.48 - lr: 0.000133 - momentum: 0.000000
2023-10-06 13:51:31,231 epoch 1 - iter 300/304 - loss 2.39259859 - time (sec): 112.61 - samples/sec: 271.44 - lr: 0.000148 - momentum: 0.000000
2023-10-06 13:51:32,737 ----------------------------------------------------------------------------------------------------
2023-10-06 13:51:32,738 EPOCH 1 done: loss 2.3760 - lr: 0.000148
2023-10-06 13:51:39,775 DEV : loss 0.9394416809082031 - f1-score (micro avg) 0.0
2023-10-06 13:51:39,783 ----------------------------------------------------------------------------------------------------
2023-10-06 13:51:51,527 epoch 2 - iter 30/304 - loss 0.79917105 - time (sec): 11.74 - samples/sec: 274.22 - lr: 0.000148 - momentum: 0.000000
2023-10-06 13:52:02,875 epoch 2 - iter 60/304 - loss 0.72892232 - time (sec): 23.09 - samples/sec: 269.89 - lr: 0.000147 - momentum: 0.000000
2023-10-06 13:52:14,478 epoch 2 - iter 90/304 - loss 0.70892835 - time (sec): 34.69 - samples/sec: 267.51 - lr: 0.000145 - momentum: 0.000000
2023-10-06 13:52:26,343 epoch 2 - iter 120/304 - loss 0.67296797 - time (sec): 46.56 - samples/sec: 269.85 - lr: 0.000143 - momentum: 0.000000
2023-10-06 13:52:37,493 epoch 2 - iter 150/304 - loss 0.64302774 - time (sec): 57.71 - samples/sec: 269.06 - lr: 0.000142 - momentum: 0.000000
2023-10-06 13:52:49,097 epoch 2 - iter 180/304 - loss 0.60152073 - time (sec): 69.31 - samples/sec: 267.59 - lr: 0.000140 - momentum: 0.000000
2023-10-06 13:53:00,744 epoch 2 - iter 210/304 - loss 0.57720718 - time (sec): 80.96 - samples/sec: 266.86 - lr: 0.000139 - momentum: 0.000000
2023-10-06 13:53:12,134 epoch 2 - iter 240/304 - loss 0.54230696 - time (sec): 92.35 - samples/sec: 266.22 - lr: 0.000137 - momentum: 0.000000
2023-10-06 13:53:24,268 epoch 2 - iter 270/304 - loss 0.52150856 - time (sec): 104.48 - samples/sec: 266.97 - lr: 0.000135 - momentum: 0.000000
2023-10-06 13:53:35,561 epoch 2 - iter 300/304 - loss 0.49993589 - time (sec): 115.78 - samples/sec: 265.47 - lr: 0.000134 - momentum: 0.000000
2023-10-06 13:53:36,757 ----------------------------------------------------------------------------------------------------
2023-10-06 13:53:36,757 EPOCH 2 done: loss 0.4979 - lr: 0.000134
2023-10-06 13:53:44,573 DEV : loss 0.32948073744773865 - f1-score (micro avg) 0.4917
2023-10-06 13:53:44,579 saving best model
2023-10-06 13:53:45,420 ----------------------------------------------------------------------------------------------------
2023-10-06 13:53:57,176 epoch 3 - iter 30/304 - loss 0.22935182 - time (sec): 11.75 - samples/sec: 253.78 - lr: 0.000132 - momentum: 0.000000
2023-10-06 13:54:09,534 epoch 3 - iter 60/304 - loss 0.23267930 - time (sec): 24.11 - samples/sec: 256.05 - lr: 0.000130 - momentum: 0.000000
2023-10-06 13:54:20,893 epoch 3 - iter 90/304 - loss 0.23638818 - time (sec): 35.47 - samples/sec: 250.96 - lr: 0.000128 - momentum: 0.000000
2023-10-06 13:54:32,624 epoch 3 - iter 120/304 - loss 0.23744099 - time (sec): 47.20 - samples/sec: 249.95 - lr: 0.000127 - momentum: 0.000000
2023-10-06 13:54:45,316 epoch 3 - iter 150/304 - loss 0.22619796 - time (sec): 59.89 - samples/sec: 252.73 - lr: 0.000125 - momentum: 0.000000
2023-10-06 13:54:56,975 epoch 3 - iter 180/304 - loss 0.22219277 - time (sec): 71.55 - samples/sec: 251.95 - lr: 0.000124 - momentum: 0.000000
2023-10-06 13:55:09,166 epoch 3 - iter 210/304 - loss 0.21681192 - time (sec): 83.74 - samples/sec: 254.59 - lr: 0.000122 - momentum: 0.000000
2023-10-06 13:55:21,361 epoch 3 - iter 240/304 - loss 0.21443892 - time (sec): 95.94 - samples/sec: 254.72 - lr: 0.000120 - momentum: 0.000000
2023-10-06 13:55:33,623 epoch 3 - iter 270/304 - loss 0.20935067 - time (sec): 108.20 - samples/sec: 255.59 - lr: 0.000119 - momentum: 0.000000
2023-10-06 13:55:45,229 epoch 3 - iter 300/304 - loss 0.20574332 - time (sec): 119.81 - samples/sec: 254.96 - lr: 0.000117 - momentum: 0.000000
2023-10-06 13:55:46,883 ----------------------------------------------------------------------------------------------------
2023-10-06 13:55:46,883 EPOCH 3 done: loss 0.2055 - lr: 0.000117
2023-10-06 13:55:54,831 DEV : loss 0.19269458949565887 - f1-score (micro avg) 0.71
2023-10-06 13:55:54,840 saving best model
2023-10-06 13:55:59,154 ----------------------------------------------------------------------------------------------------
2023-10-06 13:56:10,711 epoch 4 - iter 30/304 - loss 0.13367057 - time (sec): 11.56 - samples/sec: 255.21 - lr: 0.000115 - momentum: 0.000000
2023-10-06 13:56:22,816 epoch 4 - iter 60/304 - loss 0.13262848 - time (sec): 23.66 - samples/sec: 259.68 - lr: 0.000113 - momentum: 0.000000
2023-10-06 13:56:34,277 epoch 4 - iter 90/304 - loss 0.12628124 - time (sec): 35.12 - samples/sec: 254.55 - lr: 0.000112 - momentum: 0.000000
2023-10-06 13:56:46,290 epoch 4 - iter 120/304 - loss 0.11896909 - time (sec): 47.13 - samples/sec: 252.60 - lr: 0.000110 - momentum: 0.000000
2023-10-06 13:56:58,170 epoch 4 - iter 150/304 - loss 0.11494074 - time (sec): 59.01 - samples/sec: 252.30 - lr: 0.000109 - momentum: 0.000000
2023-10-06 13:57:10,466 epoch 4 - iter 180/304 - loss 0.11586571 - time (sec): 71.31 - samples/sec: 252.74 - lr: 0.000107 - momentum: 0.000000
2023-10-06 13:57:22,759 epoch 4 - iter 210/304 - loss 0.11851043 - time (sec): 83.60 - samples/sec: 254.52 - lr: 0.000105 - momentum: 0.000000
2023-10-06 13:57:34,761 epoch 4 - iter 240/304 - loss 0.11817188 - time (sec): 95.60 - samples/sec: 255.13 - lr: 0.000104 - momentum: 0.000000
2023-10-06 13:57:47,095 epoch 4 - iter 270/304 - loss 0.11704453 - time (sec): 107.94 - samples/sec: 256.09 - lr: 0.000102 - momentum: 0.000000
2023-10-06 13:57:59,202 epoch 4 - iter 300/304 - loss 0.11282138 - time (sec): 120.05 - samples/sec: 256.09 - lr: 0.000100 - momentum: 0.000000
2023-10-06 13:58:00,407 ----------------------------------------------------------------------------------------------------
2023-10-06 13:58:00,407 EPOCH 4 done: loss 0.1126 - lr: 0.000100
2023-10-06 13:58:08,218 DEV : loss 0.15201449394226074 - f1-score (micro avg) 0.8214
2023-10-06 13:58:08,225 saving best model
2023-10-06 13:58:12,565 ----------------------------------------------------------------------------------------------------
2023-10-06 13:58:24,258 epoch 5 - iter 30/304 - loss 0.06621718 - time (sec): 11.69 - samples/sec: 253.78 - lr: 0.000098 - momentum: 0.000000
2023-10-06 13:58:36,672 epoch 5 - iter 60/304 - loss 0.06740122 - time (sec): 24.11 - samples/sec: 259.78 - lr: 0.000097 - momentum: 0.000000
2023-10-06 13:58:48,744 epoch 5 - iter 90/304 - loss 0.06811196 - time (sec): 36.18 - samples/sec: 259.80 - lr: 0.000095 - momentum: 0.000000
2023-10-06 13:59:00,235 epoch 5 - iter 120/304 - loss 0.06656845 - time (sec): 47.67 - samples/sec: 258.66 - lr: 0.000094 - momentum: 0.000000
2023-10-06 13:59:11,731 epoch 5 - iter 150/304 - loss 0.06460635 - time (sec): 59.16 - samples/sec: 259.65 - lr: 0.000092 - momentum: 0.000000
2023-10-06 13:59:23,761 epoch 5 - iter 180/304 - loss 0.06571929 - time (sec): 71.20 - samples/sec: 263.56 - lr: 0.000090 - momentum: 0.000000
2023-10-06 13:59:35,332 epoch 5 - iter 210/304 - loss 0.07334793 - time (sec): 82.77 - samples/sec: 265.85 - lr: 0.000089 - momentum: 0.000000
2023-10-06 13:59:46,411 epoch 5 - iter 240/304 - loss 0.07363936 - time (sec): 93.84 - samples/sec: 264.91 - lr: 0.000087 - momentum: 0.000000
2023-10-06 13:59:57,338 epoch 5 - iter 270/304 - loss 0.07040085 - time (sec): 104.77 - samples/sec: 265.87 - lr: 0.000085 - momentum: 0.000000
2023-10-06 14:00:08,091 epoch 5 - iter 300/304 - loss 0.06974441 - time (sec): 115.52 - samples/sec: 264.84 - lr: 0.000084 - momentum: 0.000000
2023-10-06 14:00:09,526 ----------------------------------------------------------------------------------------------------
2023-10-06 14:00:09,527 EPOCH 5 done: loss 0.0699 - lr: 0.000084
2023-10-06 14:00:16,696 DEV : loss 0.14341147243976593 - f1-score (micro avg) 0.8248
2023-10-06 14:00:16,704 saving best model
2023-10-06 14:00:21,047 ----------------------------------------------------------------------------------------------------
2023-10-06 14:00:32,265 epoch 6 - iter 30/304 - loss 0.02623539 - time (sec): 11.22 - samples/sec: 262.67 - lr: 0.000082 - momentum: 0.000000
2023-10-06 14:00:43,310 epoch 6 - iter 60/304 - loss 0.03962068 - time (sec): 22.26 - samples/sec: 262.25 - lr: 0.000080 - momentum: 0.000000
2023-10-06 14:00:54,301 epoch 6 - iter 90/304 - loss 0.03772800 - time (sec): 33.25 - samples/sec: 262.87 - lr: 0.000079 - momentum: 0.000000
2023-10-06 14:01:05,816 epoch 6 - iter 120/304 - loss 0.04396287 - time (sec): 44.77 - samples/sec: 265.85 - lr: 0.000077 - momentum: 0.000000
2023-10-06 14:01:17,276 epoch 6 - iter 150/304 - loss 0.04175421 - time (sec): 56.23 - samples/sec: 267.88 - lr: 0.000075 - momentum: 0.000000
2023-10-06 14:01:28,597 epoch 6 - iter 180/304 - loss 0.04470715 - time (sec): 67.55 - samples/sec: 269.24 - lr: 0.000074 - momentum: 0.000000
2023-10-06 14:01:39,991 epoch 6 - iter 210/304 - loss 0.05032634 - time (sec): 78.94 - samples/sec: 268.97 - lr: 0.000072 - momentum: 0.000000
2023-10-06 14:01:51,315 epoch 6 - iter 240/304 - loss 0.05207583 - time (sec): 90.27 - samples/sec: 268.03 - lr: 0.000070 - momentum: 0.000000
2023-10-06 14:02:02,641 epoch 6 - iter 270/304 - loss 0.05097501 - time (sec): 101.59 - samples/sec: 267.80 - lr: 0.000069 - momentum: 0.000000
2023-10-06 14:02:14,633 epoch 6 - iter 300/304 - loss 0.04938371 - time (sec): 113.58 - samples/sec: 268.22 - lr: 0.000067 - momentum: 0.000000
2023-10-06 14:02:16,312 ----------------------------------------------------------------------------------------------------
2023-10-06 14:02:16,312 EPOCH 6 done: loss 0.0509 - lr: 0.000067
2023-10-06 14:02:23,557 DEV : loss 0.15129701793193817 - f1-score (micro avg) 0.8168
2023-10-06 14:02:23,565 ----------------------------------------------------------------------------------------------------
2023-10-06 14:02:35,815 epoch 7 - iter 30/304 - loss 0.06289487 - time (sec): 12.25 - samples/sec: 282.64 - lr: 0.000065 - momentum: 0.000000
2023-10-06 14:02:47,326 epoch 7 - iter 60/304 - loss 0.04617587 - time (sec): 23.76 - samples/sec: 270.59 - lr: 0.000063 - momentum: 0.000000
2023-10-06 14:02:58,483 epoch 7 - iter 90/304 - loss 0.04472496 - time (sec): 34.92 - samples/sec: 271.39 - lr: 0.000062 - momentum: 0.000000
2023-10-06 14:03:09,386 epoch 7 - iter 120/304 - loss 0.04400412 - time (sec): 45.82 - samples/sec: 264.86 - lr: 0.000060 - momentum: 0.000000
2023-10-06 14:03:21,173 epoch 7 - iter 150/304 - loss 0.04300343 - time (sec): 57.61 - samples/sec: 266.06 - lr: 0.000059 - momentum: 0.000000
2023-10-06 14:03:32,273 epoch 7 - iter 180/304 - loss 0.04115047 - time (sec): 68.71 - samples/sec: 265.65 - lr: 0.000057 - momentum: 0.000000
2023-10-06 14:03:43,573 epoch 7 - iter 210/304 - loss 0.03805988 - time (sec): 80.01 - samples/sec: 266.25 - lr: 0.000055 - momentum: 0.000000
2023-10-06 14:03:54,556 epoch 7 - iter 240/304 - loss 0.04011790 - time (sec): 90.99 - samples/sec: 266.64 - lr: 0.000054 - momentum: 0.000000
2023-10-06 14:04:05,989 epoch 7 - iter 270/304 - loss 0.03779151 - time (sec): 102.42 - samples/sec: 268.40 - lr: 0.000052 - momentum: 0.000000
2023-10-06 14:04:17,392 epoch 7 - iter 300/304 - loss 0.04004340 - time (sec): 113.83 - samples/sec: 269.36 - lr: 0.000050 - momentum: 0.000000
2023-10-06 14:04:18,732 ----------------------------------------------------------------------------------------------------
2023-10-06 14:04:18,733 EPOCH 7 done: loss 0.0397 - lr: 0.000050
2023-10-06 14:04:25,703 DEV : loss 0.15146459639072418 - f1-score (micro avg) 0.8363
2023-10-06 14:04:25,709 saving best model
2023-10-06 14:04:30,644 ----------------------------------------------------------------------------------------------------
2023-10-06 14:04:41,940 epoch 8 - iter 30/304 - loss 0.02719666 - time (sec): 11.29 - samples/sec: 270.31 - lr: 0.000048 - momentum: 0.000000
2023-10-06 14:04:53,662 epoch 8 - iter 60/304 - loss 0.04801639 - time (sec): 23.02 - samples/sec: 270.94 - lr: 0.000047 - momentum: 0.000000
2023-10-06 14:05:05,160 epoch 8 - iter 90/304 - loss 0.04127764 - time (sec): 34.51 - samples/sec: 268.73 - lr: 0.000045 - momentum: 0.000000
2023-10-06 14:05:17,186 epoch 8 - iter 120/304 - loss 0.03670424 - time (sec): 46.54 - samples/sec: 271.67 - lr: 0.000044 - momentum: 0.000000
2023-10-06 14:05:28,726 epoch 8 - iter 150/304 - loss 0.03719557 - time (sec): 58.08 - samples/sec: 269.90 - lr: 0.000042 - momentum: 0.000000
2023-10-06 14:05:39,507 epoch 8 - iter 180/304 - loss 0.03578934 - time (sec): 68.86 - samples/sec: 267.58 - lr: 0.000040 - momentum: 0.000000
2023-10-06 14:05:50,867 epoch 8 - iter 210/304 - loss 0.03645765 - time (sec): 80.22 - samples/sec: 266.55 - lr: 0.000039 - momentum: 0.000000
2023-10-06 14:06:02,158 epoch 8 - iter 240/304 - loss 0.03520077 - time (sec): 91.51 - samples/sec: 265.68 - lr: 0.000037 - momentum: 0.000000
2023-10-06 14:06:13,630 epoch 8 - iter 270/304 - loss 0.03558991 - time (sec): 102.99 - samples/sec: 266.30 - lr: 0.000035 - momentum: 0.000000
2023-10-06 14:06:25,164 epoch 8 - iter 300/304 - loss 0.03379787 - time (sec): 114.52 - samples/sec: 267.12 - lr: 0.000034 - momentum: 0.000000
2023-10-06 14:06:26,625 ----------------------------------------------------------------------------------------------------
2023-10-06 14:06:26,626 EPOCH 8 done: loss 0.0334 - lr: 0.000034
2023-10-06 14:06:33,854 DEV : loss 0.15895652770996094 - f1-score (micro avg) 0.8394
2023-10-06 14:06:33,863 saving best model
2023-10-06 14:06:38,207 ----------------------------------------------------------------------------------------------------
2023-10-06 14:06:50,183 epoch 9 - iter 30/304 - loss 0.01941420 - time (sec): 11.97 - samples/sec: 277.16 - lr: 0.000032 - momentum: 0.000000
2023-10-06 14:07:01,965 epoch 9 - iter 60/304 - loss 0.01719760 - time (sec): 23.76 - samples/sec: 274.54 - lr: 0.000030 - momentum: 0.000000
2023-10-06 14:07:12,787 epoch 9 - iter 90/304 - loss 0.02224460 - time (sec): 34.58 - samples/sec: 269.35 - lr: 0.000029 - momentum: 0.000000
2023-10-06 14:07:24,398 epoch 9 - iter 120/304 - loss 0.02123394 - time (sec): 46.19 - samples/sec: 270.45 - lr: 0.000027 - momentum: 0.000000
2023-10-06 14:07:35,929 epoch 9 - iter 150/304 - loss 0.02662022 - time (sec): 57.72 - samples/sec: 271.77 - lr: 0.000025 - momentum: 0.000000
2023-10-06 14:07:47,314 epoch 9 - iter 180/304 - loss 0.02768240 - time (sec): 69.11 - samples/sec: 271.40 - lr: 0.000024 - momentum: 0.000000
2023-10-06 14:07:58,979 epoch 9 - iter 210/304 - loss 0.02858735 - time (sec): 80.77 - samples/sec: 271.52 - lr: 0.000022 - momentum: 0.000000
2023-10-06 14:08:09,996 epoch 9 - iter 240/304 - loss 0.02872271 - time (sec): 91.79 - samples/sec: 269.15 - lr: 0.000020 - momentum: 0.000000
2023-10-06 14:08:21,445 epoch 9 - iter 270/304 - loss 0.02886801 - time (sec): 103.24 - samples/sec: 268.59 - lr: 0.000019 - momentum: 0.000000
2023-10-06 14:08:32,602 epoch 9 - iter 300/304 - loss 0.02840959 - time (sec): 114.39 - samples/sec: 267.22 - lr: 0.000017 - momentum: 0.000000
2023-10-06 14:08:34,332 ----------------------------------------------------------------------------------------------------
2023-10-06 14:08:34,333 EPOCH 9 done: loss 0.0281 - lr: 0.000017
2023-10-06 14:08:41,592 DEV : loss 0.16345614194869995 - f1-score (micro avg) 0.8329
2023-10-06 14:08:41,600 ----------------------------------------------------------------------------------------------------
2023-10-06 14:08:52,926 epoch 10 - iter 30/304 - loss 0.02937562 - time (sec): 11.32 - samples/sec: 264.20 - lr: 0.000015 - momentum: 0.000000
2023-10-06 14:09:04,560 epoch 10 - iter 60/304 - loss 0.02590763 - time (sec): 22.96 - samples/sec: 263.30 - lr: 0.000014 - momentum: 0.000000
2023-10-06 14:09:16,048 epoch 10 - iter 90/304 - loss 0.02337875 - time (sec): 34.45 - samples/sec: 263.24 - lr: 0.000012 - momentum: 0.000000
2023-10-06 14:09:27,723 epoch 10 - iter 120/304 - loss 0.02330470 - time (sec): 46.12 - samples/sec: 260.35 - lr: 0.000010 - momentum: 0.000000
2023-10-06 14:09:39,207 epoch 10 - iter 150/304 - loss 0.02030588 - time (sec): 57.61 - samples/sec: 256.01 - lr: 0.000009 - momentum: 0.000000
2023-10-06 14:09:51,478 epoch 10 - iter 180/304 - loss 0.02550666 - time (sec): 69.88 - samples/sec: 256.38 - lr: 0.000007 - momentum: 0.000000
2023-10-06 14:10:03,234 epoch 10 - iter 210/304 - loss 0.02322805 - time (sec): 81.63 - samples/sec: 255.95 - lr: 0.000005 - momentum: 0.000000
2023-10-06 14:10:15,617 epoch 10 - iter 240/304 - loss 0.02352220 - time (sec): 94.02 - samples/sec: 256.63 - lr: 0.000004 - momentum: 0.000000
2023-10-06 14:10:27,850 epoch 10 - iter 270/304 - loss 0.02250839 - time (sec): 106.25 - samples/sec: 258.10 - lr: 0.000002 - momentum: 0.000000
2023-10-06 14:10:40,226 epoch 10 - iter 300/304 - loss 0.02307207 - time (sec): 118.63 - samples/sec: 258.27 - lr: 0.000000 - momentum: 0.000000
2023-10-06 14:10:41,654 ----------------------------------------------------------------------------------------------------
2023-10-06 14:10:41,655 EPOCH 10 done: loss 0.0243 - lr: 0.000000
2023-10-06 14:10:49,733 DEV : loss 0.166978120803833 - f1-score (micro avg) 0.8306
2023-10-06 14:10:50,632 ----------------------------------------------------------------------------------------------------
2023-10-06 14:10:50,634 Loading model from best epoch ...
2023-10-06 14:10:53,499 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-date, B-date, E-date, I-date, S-object, B-object, E-object, I-object
2023-10-06 14:11:00,820 
Results:
- F-score (micro) 0.8
- F-score (macro) 0.6446
- Accuracy 0.6712

By class:
              precision    recall  f1-score   support

       scope     0.7580    0.7881    0.7727       151
        pers     0.7712    0.9479    0.8505        96
        work     0.7455    0.8632    0.8000        95
         loc     1.0000    0.6667    0.8000         3
        date     0.0000    0.0000    0.0000         3

   micro avg     0.7597    0.8448    0.8000       348
   macro avg     0.6549    0.6532    0.6446       348
weighted avg     0.7537    0.8448    0.7952       348

2023-10-06 14:11:00,820 ----------------------------------------------------------------------------------------------------
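For reference, the saved best-model.pt evaluated above can be reloaded for inference with a few lines of Flair. This is a hedged sketch: the example sentence is invented, the label type is assumed to be "ner", and the path is assembled from the "Model training base path" logged earlier.

# Hedged sketch: loading the best checkpoint and tagging a sentence (Python).
from flair.data import Sentence
from flair.models import SequenceTagger

# Path taken from the training base path above, plus the checkpoint file name.
tagger = SequenceTagger.load(
    "hmbench-ajmc/en-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4/best-model.pt"
)

# Invented example input; the model predicts scope/pers/work/loc/date/object spans.
sentence = Sentence("Cf. Schol. ad Soph. Aj. 1.")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    print(span.text, "->", span.get_label("ner").value)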