2023-10-17 11:52:54,786 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,787 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 11:52:54,787 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,787 MultiCorpus: 14465 train + 1392 dev + 2432 test sentences
 - NER_HIPE_2022 Corpus: 14465 train + 1392 dev + 2432 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/letemps/fr/with_doc_seperator
2023-10-17 11:52:54,787 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,787 Train: 14465 sentences
2023-10-17 11:52:54,788 (train_with_dev=False, train_with_test=False)
2023-10-17 11:52:54,788 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,788 Training Params:
2023-10-17 11:52:54,788 - learning_rate: "5e-05"
2023-10-17 11:52:54,788 - mini_batch_size: "4"
2023-10-17 11:52:54,788 - max_epochs: "10"
2023-10-17 11:52:54,788 - shuffle: "True"
2023-10-17 11:52:54,788 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,788 Plugins:
2023-10-17 11:52:54,788 - TensorboardLogger
2023-10-17 11:52:54,788 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 11:52:54,788 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,788 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 11:52:54,788 - metric: "('micro avg', 'f1-score')"
2023-10-17 11:52:54,788 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,788 Computation:
2023-10-17 11:52:54,789 - compute on device: cuda:0
2023-10-17 11:52:54,789 - embedding storage: none
2023-10-17 11:52:54,789 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,789 Model training base path: "hmbench-letemps/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2"
2023-10-17 11:52:54,789 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,789 ----------------------------------------------------------------------------------------------------
2023-10-17 11:52:54,789 Logging anything other than scalars to TensorBoard is currently not supported.
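For readers who want to reproduce a run with this configuration, the following is a minimal Flair sketch assembled from the parameters logged above (batch size 4, 10 epochs, lr 5e-05, first-subtoken pooling over the last layer, no CRF). It is a sketch, not the exact training script: the corpus-loader arguments and the base checkpoint name "hmteams/teams-base-historic-multilingual-discriminator" are inferred from the paths in this log and may need adjusting for your Flair version.

# Minimal reproduction sketch, assembled from the logged configuration (assumptions noted above).
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# HIPE-2022 "letemps" French corpus with document separators, as in the log
corpus = NER_HIPE_2022(dataset_name="letemps", language="fr", add_document_separator=True)
label_dict = corpus.make_label_dictionary(label_type="ner")

# last transformer layer, first-subtoken pooling, fine-tuned end to end
embeddings = TransformerWordEmbeddings(
    model="hmteams/teams-base-historic-multilingual-discriminator",  # assumed base model
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# plain linear tag head on top of the embeddings: no RNN, no CRF (13-tag BIOES space)
tagger = SequenceTagger(
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_rnn=False,
    use_crf=False,
    reproject_embeddings=False,
)

# fine_tune() uses AdamW with a linear warmup/decay schedule by default,
# matching the LinearScheduler plugin (warmup_fraction 0.1) shown above
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-letemps/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2",
    learning_rate=5e-05,
    mini_batch_size=4,
    max_epochs=10,
)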
"hmbench-letemps/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2" 2023-10-17 11:52:54,789 ---------------------------------------------------------------------------------------------------- 2023-10-17 11:52:54,789 ---------------------------------------------------------------------------------------------------- 2023-10-17 11:52:54,789 Logging anything other than scalars to TensorBoard is currently not supported. 2023-10-17 11:53:17,101 epoch 1 - iter 361/3617 - loss 1.45041553 - time (sec): 22.31 - samples/sec: 1698.14 - lr: 0.000005 - momentum: 0.000000 2023-10-17 11:53:38,886 epoch 1 - iter 722/3617 - loss 0.83716082 - time (sec): 44.10 - samples/sec: 1686.06 - lr: 0.000010 - momentum: 0.000000 2023-10-17 11:54:02,640 epoch 1 - iter 1083/3617 - loss 0.59781487 - time (sec): 67.85 - samples/sec: 1681.62 - lr: 0.000015 - momentum: 0.000000 2023-10-17 11:54:24,936 epoch 1 - iter 1444/3617 - loss 0.48149327 - time (sec): 90.15 - samples/sec: 1694.00 - lr: 0.000020 - momentum: 0.000000 2023-10-17 11:54:47,775 epoch 1 - iter 1805/3617 - loss 0.41319463 - time (sec): 112.98 - samples/sec: 1690.70 - lr: 0.000025 - momentum: 0.000000 2023-10-17 11:55:11,378 epoch 1 - iter 2166/3617 - loss 0.36558875 - time (sec): 136.59 - samples/sec: 1672.75 - lr: 0.000030 - momentum: 0.000000 2023-10-17 11:55:35,040 epoch 1 - iter 2527/3617 - loss 0.32938220 - time (sec): 160.25 - samples/sec: 1663.32 - lr: 0.000035 - momentum: 0.000000 2023-10-17 11:55:59,195 epoch 1 - iter 2888/3617 - loss 0.30249835 - time (sec): 184.40 - samples/sec: 1658.74 - lr: 0.000040 - momentum: 0.000000 2023-10-17 11:56:22,092 epoch 1 - iter 3249/3617 - loss 0.28225761 - time (sec): 207.30 - samples/sec: 1657.19 - lr: 0.000045 - momentum: 0.000000 2023-10-17 11:56:45,459 epoch 1 - iter 3610/3617 - loss 0.26717410 - time (sec): 230.67 - samples/sec: 1644.02 - lr: 0.000050 - momentum: 0.000000 2023-10-17 11:56:45,909 ---------------------------------------------------------------------------------------------------- 2023-10-17 11:56:45,909 EPOCH 1 done: loss 0.2669 - lr: 0.000050 2023-10-17 11:56:51,334 DEV : loss 0.11786916851997375 - f1-score (micro avg) 0.5019 2023-10-17 11:56:51,374 saving best model 2023-10-17 11:56:51,877 ---------------------------------------------------------------------------------------------------- 2023-10-17 11:57:15,292 epoch 2 - iter 361/3617 - loss 0.10557221 - time (sec): 23.41 - samples/sec: 1662.15 - lr: 0.000049 - momentum: 0.000000 2023-10-17 11:57:37,011 epoch 2 - iter 722/3617 - loss 0.10278944 - time (sec): 45.13 - samples/sec: 1710.38 - lr: 0.000049 - momentum: 0.000000 2023-10-17 11:57:58,661 epoch 2 - iter 1083/3617 - loss 0.10655278 - time (sec): 66.78 - samples/sec: 1709.09 - lr: 0.000048 - momentum: 0.000000 2023-10-17 11:58:21,263 epoch 2 - iter 1444/3617 - loss 0.10643025 - time (sec): 89.38 - samples/sec: 1696.68 - lr: 0.000048 - momentum: 0.000000 2023-10-17 11:58:44,442 epoch 2 - iter 1805/3617 - loss 0.10457147 - time (sec): 112.56 - samples/sec: 1674.98 - lr: 0.000047 - momentum: 0.000000 2023-10-17 11:59:07,714 epoch 2 - iter 2166/3617 - loss 0.10556253 - time (sec): 135.84 - samples/sec: 1671.06 - lr: 0.000047 - momentum: 0.000000 2023-10-17 11:59:30,682 epoch 2 - iter 2527/3617 - loss 0.10440273 - time (sec): 158.80 - samples/sec: 1673.80 - lr: 0.000046 - momentum: 0.000000 2023-10-17 11:59:51,173 epoch 2 - iter 2888/3617 - loss 0.10398260 - time (sec): 179.29 - samples/sec: 1692.95 - lr: 0.000046 - momentum: 
2023-10-17 12:00:09,116 epoch 2 - iter 3249/3617 - loss 0.10460700 - time (sec): 197.24 - samples/sec: 1736.35 - lr: 0.000045 - momentum: 0.000000
2023-10-17 12:00:30,262 epoch 2 - iter 3610/3617 - loss 0.10618003 - time (sec): 218.38 - samples/sec: 1736.86 - lr: 0.000044 - momentum: 0.000000
2023-10-17 12:00:30,682 ----------------------------------------------------------------------------------------------------
2023-10-17 12:00:30,682 EPOCH 2 done: loss 0.1063 - lr: 0.000044
2023-10-17 12:00:37,743 DEV : loss 0.15698249638080597 - f1-score (micro avg) 0.6292
2023-10-17 12:00:37,792 saving best model
2023-10-17 12:00:38,379 ----------------------------------------------------------------------------------------------------
2023-10-17 12:01:00,850 epoch 3 - iter 361/3617 - loss 0.08022559 - time (sec): 22.47 - samples/sec: 1636.31 - lr: 0.000044 - momentum: 0.000000
2023-10-17 12:01:23,625 epoch 3 - iter 722/3617 - loss 0.08069320 - time (sec): 45.24 - samples/sec: 1663.23 - lr: 0.000043 - momentum: 0.000000
2023-10-17 12:01:46,101 epoch 3 - iter 1083/3617 - loss 0.08181083 - time (sec): 67.72 - samples/sec: 1672.77 - lr: 0.000043 - momentum: 0.000000
2023-10-17 12:02:09,383 epoch 3 - iter 1444/3617 - loss 0.08325852 - time (sec): 91.00 - samples/sec: 1661.25 - lr: 0.000042 - momentum: 0.000000
2023-10-17 12:02:27,763 epoch 3 - iter 1805/3617 - loss 0.08237466 - time (sec): 109.38 - samples/sec: 1726.56 - lr: 0.000042 - momentum: 0.000000
2023-10-17 12:02:49,091 epoch 3 - iter 2166/3617 - loss 0.08389650 - time (sec): 130.71 - samples/sec: 1745.30 - lr: 0.000041 - momentum: 0.000000
2023-10-17 12:03:11,251 epoch 3 - iter 2527/3617 - loss 0.08517465 - time (sec): 152.87 - samples/sec: 1731.56 - lr: 0.000041 - momentum: 0.000000
2023-10-17 12:03:33,655 epoch 3 - iter 2888/3617 - loss 0.08551924 - time (sec): 175.27 - samples/sec: 1728.77 - lr: 0.000040 - momentum: 0.000000
2023-10-17 12:03:56,927 epoch 3 - iter 3249/3617 - loss 0.08551615 - time (sec): 198.55 - samples/sec: 1722.72 - lr: 0.000039 - momentum: 0.000000
2023-10-17 12:04:19,219 epoch 3 - iter 3610/3617 - loss 0.08533098 - time (sec): 220.84 - samples/sec: 1717.15 - lr: 0.000039 - momentum: 0.000000
2023-10-17 12:04:19,653 ----------------------------------------------------------------------------------------------------
2023-10-17 12:04:19,654 EPOCH 3 done: loss 0.0853 - lr: 0.000039
2023-10-17 12:04:26,008 DEV : loss 0.20511414110660553 - f1-score (micro avg) 0.6161
2023-10-17 12:04:26,049 ----------------------------------------------------------------------------------------------------
2023-10-17 12:04:49,116 epoch 4 - iter 361/3617 - loss 0.05873537 - time (sec): 23.07 - samples/sec: 1674.62 - lr: 0.000038 - momentum: 0.000000
2023-10-17 12:05:12,376 epoch 4 - iter 722/3617 - loss 0.06004159 - time (sec): 46.33 - samples/sec: 1645.73 - lr: 0.000038 - momentum: 0.000000
2023-10-17 12:05:35,066 epoch 4 - iter 1083/3617 - loss 0.06324927 - time (sec): 69.02 - samples/sec: 1673.37 - lr: 0.000037 - momentum: 0.000000
2023-10-17 12:05:57,755 epoch 4 - iter 1444/3617 - loss 0.06295022 - time (sec): 91.70 - samples/sec: 1671.02 - lr: 0.000037 - momentum: 0.000000
2023-10-17 12:06:20,466 epoch 4 - iter 1805/3617 - loss 0.06317630 - time (sec): 114.42 - samples/sec: 1668.82 - lr: 0.000036 - momentum: 0.000000
2023-10-17 12:06:42,695 epoch 4 - iter 2166/3617 - loss 0.06379107 - time (sec): 136.64 - samples/sec: 1673.67 - lr: 0.000036 - momentum: 0.000000
2023-10-17 12:07:04,646 epoch 4 - iter 2527/3617 - loss 0.06397835 - time (sec): 158.60 - samples/sec: 1681.56 - lr: 0.000035 - momentum: 0.000000
2023-10-17 12:07:27,971 epoch 4 - iter 2888/3617 - loss 0.06348311 - time (sec): 181.92 - samples/sec: 1678.47 - lr: 0.000034 - momentum: 0.000000
2023-10-17 12:07:49,326 epoch 4 - iter 3249/3617 - loss 0.06347814 - time (sec): 203.28 - samples/sec: 1686.73 - lr: 0.000034 - momentum: 0.000000
2023-10-17 12:08:11,036 epoch 4 - iter 3610/3617 - loss 0.06347373 - time (sec): 224.98 - samples/sec: 1686.43 - lr: 0.000033 - momentum: 0.000000
2023-10-17 12:08:11,436 ----------------------------------------------------------------------------------------------------
2023-10-17 12:08:11,436 EPOCH 4 done: loss 0.0634 - lr: 0.000033
2023-10-17 12:08:18,597 DEV : loss 0.22120679914951324 - f1-score (micro avg) 0.6187
2023-10-17 12:08:18,639 ----------------------------------------------------------------------------------------------------
2023-10-17 12:08:42,405 epoch 5 - iter 361/3617 - loss 0.03776153 - time (sec): 23.77 - samples/sec: 1596.80 - lr: 0.000033 - momentum: 0.000000
2023-10-17 12:09:05,646 epoch 5 - iter 722/3617 - loss 0.04135391 - time (sec): 47.01 - samples/sec: 1629.07 - lr: 0.000032 - momentum: 0.000000
2023-10-17 12:09:28,421 epoch 5 - iter 1083/3617 - loss 0.04270991 - time (sec): 69.78 - samples/sec: 1637.87 - lr: 0.000032 - momentum: 0.000000
2023-10-17 12:09:51,769 epoch 5 - iter 1444/3617 - loss 0.04260540 - time (sec): 93.13 - samples/sec: 1635.70 - lr: 0.000031 - momentum: 0.000000
2023-10-17 12:10:14,299 epoch 5 - iter 1805/3617 - loss 0.04110131 - time (sec): 115.66 - samples/sec: 1656.86 - lr: 0.000031 - momentum: 0.000000
2023-10-17 12:10:36,976 epoch 5 - iter 2166/3617 - loss 0.04177018 - time (sec): 138.34 - samples/sec: 1668.53 - lr: 0.000030 - momentum: 0.000000
2023-10-17 12:10:59,264 epoch 5 - iter 2527/3617 - loss 0.04324074 - time (sec): 160.62 - samples/sec: 1663.73 - lr: 0.000029 - momentum: 0.000000
2023-10-17 12:11:21,101 epoch 5 - iter 2888/3617 - loss 0.04411956 - time (sec): 182.46 - samples/sec: 1658.59 - lr: 0.000029 - momentum: 0.000000
2023-10-17 12:11:45,148 epoch 5 - iter 3249/3617 - loss 0.04348395 - time (sec): 206.51 - samples/sec: 1652.71 - lr: 0.000028 - momentum: 0.000000
2023-10-17 12:12:07,783 epoch 5 - iter 3610/3617 - loss 0.04478114 - time (sec): 229.14 - samples/sec: 1654.40 - lr: 0.000028 - momentum: 0.000000
2023-10-17 12:12:08,198 ----------------------------------------------------------------------------------------------------
2023-10-17 12:12:08,198 EPOCH 5 done: loss 0.0447 - lr: 0.000028
2023-10-17 12:12:14,607 DEV : loss 0.2750839591026306 - f1-score (micro avg) 0.6418
2023-10-17 12:12:14,652 saving best model
2023-10-17 12:12:15,246 ----------------------------------------------------------------------------------------------------
2023-10-17 12:12:37,546 epoch 6 - iter 361/3617 - loss 0.02896062 - time (sec): 22.30 - samples/sec: 1678.63 - lr: 0.000027 - momentum: 0.000000
2023-10-17 12:13:01,122 epoch 6 - iter 722/3617 - loss 0.02877240 - time (sec): 45.87 - samples/sec: 1629.00 - lr: 0.000027 - momentum: 0.000000
2023-10-17 12:13:23,419 epoch 6 - iter 1083/3617 - loss 0.03088311 - time (sec): 68.17 - samples/sec: 1654.59 - lr: 0.000026 - momentum: 0.000000
2023-10-17 12:13:45,707 epoch 6 - iter 1444/3617 - loss 0.03181763 - time (sec): 90.46 - samples/sec: 1672.21 - lr: 0.000026 - momentum: 0.000000
2023-10-17 12:14:07,684 epoch 6 - iter 1805/3617 - loss 0.03280478 - time (sec): 112.44 - samples/sec: 1689.30 - lr: 0.000025 - momentum: 0.000000
2023-10-17 12:14:30,091 epoch 6 - iter 2166/3617 - loss 0.03257395 - time (sec): 134.84 - samples/sec: 1691.26 - lr: 0.000024 - momentum: 0.000000
2023-10-17 12:14:52,451 epoch 6 - iter 2527/3617 - loss 0.03273840 - time (sec): 157.20 - samples/sec: 1693.80 - lr: 0.000024 - momentum: 0.000000
2023-10-17 12:15:16,751 epoch 6 - iter 2888/3617 - loss 0.03337305 - time (sec): 181.50 - samples/sec: 1675.04 - lr: 0.000023 - momentum: 0.000000
2023-10-17 12:15:39,289 epoch 6 - iter 3249/3617 - loss 0.03410165 - time (sec): 204.04 - samples/sec: 1671.99 - lr: 0.000023 - momentum: 0.000000
2023-10-17 12:15:59,015 epoch 6 - iter 3610/3617 - loss 0.03367272 - time (sec): 223.77 - samples/sec: 1695.44 - lr: 0.000022 - momentum: 0.000000
2023-10-17 12:15:59,460 ----------------------------------------------------------------------------------------------------
2023-10-17 12:15:59,460 EPOCH 6 done: loss 0.0337 - lr: 0.000022
2023-10-17 12:16:06,723 DEV : loss 0.3224092423915863 - f1-score (micro avg) 0.6278
2023-10-17 12:16:06,764 ----------------------------------------------------------------------------------------------------
2023-10-17 12:16:30,012 epoch 7 - iter 361/3617 - loss 0.01207785 - time (sec): 23.25 - samples/sec: 1632.13 - lr: 0.000022 - momentum: 0.000000
2023-10-17 12:16:52,430 epoch 7 - iter 722/3617 - loss 0.01588108 - time (sec): 45.66 - samples/sec: 1649.57 - lr: 0.000021 - momentum: 0.000000
2023-10-17 12:17:14,472 epoch 7 - iter 1083/3617 - loss 0.01914681 - time (sec): 67.71 - samples/sec: 1671.24 - lr: 0.000021 - momentum: 0.000000
2023-10-17 12:17:38,387 epoch 7 - iter 1444/3617 - loss 0.02093559 - time (sec): 91.62 - samples/sec: 1648.96 - lr: 0.000020 - momentum: 0.000000
2023-10-17 12:18:00,039 epoch 7 - iter 1805/3617 - loss 0.02030467 - time (sec): 113.27 - samples/sec: 1670.47 - lr: 0.000019 - momentum: 0.000000
2023-10-17 12:18:22,176 epoch 7 - iter 2166/3617 - loss 0.02042856 - time (sec): 135.41 - samples/sec: 1675.92 - lr: 0.000019 - momentum: 0.000000
2023-10-17 12:18:45,119 epoch 7 - iter 2527/3617 - loss 0.01975550 - time (sec): 158.35 - samples/sec: 1676.74 - lr: 0.000018 - momentum: 0.000000
2023-10-17 12:19:04,588 epoch 7 - iter 2888/3617 - loss 0.01949421 - time (sec): 177.82 - samples/sec: 1705.58 - lr: 0.000018 - momentum: 0.000000
2023-10-17 12:19:22,591 epoch 7 - iter 3249/3617 - loss 0.01975215 - time (sec): 195.82 - samples/sec: 1746.76 - lr: 0.000017 - momentum: 0.000000
2023-10-17 12:19:44,496 epoch 7 - iter 3610/3617 - loss 0.02006605 - time (sec): 217.73 - samples/sec: 1742.14 - lr: 0.000017 - momentum: 0.000000
2023-10-17 12:19:44,901 ----------------------------------------------------------------------------------------------------
2023-10-17 12:19:44,901 EPOCH 7 done: loss 0.0202 - lr: 0.000017
2023-10-17 12:19:51,251 DEV : loss 0.317545086145401 - f1-score (micro avg) 0.6272
2023-10-17 12:19:51,295 ----------------------------------------------------------------------------------------------------
2023-10-17 12:20:13,563 epoch 8 - iter 361/3617 - loss 0.01412422 - time (sec): 22.27 - samples/sec: 1693.14 - lr: 0.000016 - momentum: 0.000000
2023-10-17 12:20:37,743 epoch 8 - iter 722/3617 - loss 0.01808065 - time (sec): 46.45 - samples/sec: 1612.50 - lr: 0.000016 - momentum: 0.000000
2023-10-17 12:21:02,137 epoch 8 - iter 1083/3617 - loss 0.01681718 - time (sec): 70.84 - samples/sec: 1590.05 - lr: 0.000015 - momentum: 0.000000
2023-10-17 12:21:26,146 epoch 8 - iter 1444/3617 - loss 0.01601246 - time (sec): 94.85 - samples/sec: 1600.06 - lr: 0.000014 - momentum: 0.000000
2023-10-17 12:21:49,292 epoch 8 - iter 1805/3617 - loss 0.01568364 - time (sec): 117.99 - samples/sec: 1600.39 - lr: 0.000014 - momentum: 0.000000
2023-10-17 12:22:11,848 epoch 8 - iter 2166/3617 - loss 0.01635340 - time (sec): 140.55 - samples/sec: 1609.35 - lr: 0.000013 - momentum: 0.000000
2023-10-17 12:22:33,797 epoch 8 - iter 2527/3617 - loss 0.01630735 - time (sec): 162.50 - samples/sec: 1622.79 - lr: 0.000013 - momentum: 0.000000
2023-10-17 12:22:55,815 epoch 8 - iter 2888/3617 - loss 0.01515567 - time (sec): 184.52 - samples/sec: 1637.90 - lr: 0.000012 - momentum: 0.000000
2023-10-17 12:23:18,342 epoch 8 - iter 3249/3617 - loss 0.01483412 - time (sec): 207.05 - samples/sec: 1648.57 - lr: 0.000012 - momentum: 0.000000
2023-10-17 12:23:41,314 epoch 8 - iter 3610/3617 - loss 0.01486930 - time (sec): 230.02 - samples/sec: 1648.51 - lr: 0.000011 - momentum: 0.000000
2023-10-17 12:23:41,781 ----------------------------------------------------------------------------------------------------
2023-10-17 12:23:41,781 EPOCH 8 done: loss 0.0148 - lr: 0.000011
2023-10-17 12:23:48,164 DEV : loss 0.3655739724636078 - f1-score (micro avg) 0.6602
2023-10-17 12:23:48,207 saving best model
2023-10-17 12:23:48,840 ----------------------------------------------------------------------------------------------------
2023-10-17 12:24:11,655 epoch 9 - iter 361/3617 - loss 0.00915400 - time (sec): 22.81 - samples/sec: 1667.62 - lr: 0.000011 - momentum: 0.000000
2023-10-17 12:24:34,394 epoch 9 - iter 722/3617 - loss 0.00828080 - time (sec): 45.55 - samples/sec: 1634.29 - lr: 0.000010 - momentum: 0.000000
2023-10-17 12:24:57,398 epoch 9 - iter 1083/3617 - loss 0.00838778 - time (sec): 68.56 - samples/sec: 1624.06 - lr: 0.000009 - momentum: 0.000000
2023-10-17 12:25:21,466 epoch 9 - iter 1444/3617 - loss 0.00854920 - time (sec): 92.62 - samples/sec: 1609.74 - lr: 0.000009 - momentum: 0.000000
2023-10-17 12:25:44,800 epoch 9 - iter 1805/3617 - loss 0.00847143 - time (sec): 115.96 - samples/sec: 1621.53 - lr: 0.000008 - momentum: 0.000000
2023-10-17 12:26:07,965 epoch 9 - iter 2166/3617 - loss 0.00871744 - time (sec): 139.12 - samples/sec: 1625.29 - lr: 0.000008 - momentum: 0.000000
2023-10-17 12:26:31,239 epoch 9 - iter 2527/3617 - loss 0.00823885 - time (sec): 162.40 - samples/sec: 1627.58 - lr: 0.000007 - momentum: 0.000000
2023-10-17 12:26:55,849 epoch 9 - iter 2888/3617 - loss 0.00773025 - time (sec): 187.01 - samples/sec: 1618.47 - lr: 0.000007 - momentum: 0.000000
2023-10-17 12:27:19,397 epoch 9 - iter 3249/3617 - loss 0.00760834 - time (sec): 210.55 - samples/sec: 1614.92 - lr: 0.000006 - momentum: 0.000000
2023-10-17 12:27:43,113 epoch 9 - iter 3610/3617 - loss 0.00769799 - time (sec): 234.27 - samples/sec: 1618.48 - lr: 0.000006 - momentum: 0.000000
2023-10-17 12:27:43,552 ----------------------------------------------------------------------------------------------------
2023-10-17 12:27:43,552 EPOCH 9 done: loss 0.0077 - lr: 0.000006
2023-10-17 12:27:49,962 DEV : loss 0.4163112938404083 - f1-score (micro avg) 0.6467
2023-10-17 12:27:50,004 ----------------------------------------------------------------------------------------------------
2023-10-17 12:28:13,013 epoch 10 - iter 361/3617 - loss 0.00413246 - time (sec): 23.01 - samples/sec: 1593.63 - lr: 0.000005 - momentum: 0.000000
2023-10-17 12:28:35,587 epoch 10 - iter 722/3617 - loss 0.00334843 - time (sec): 45.58 - samples/sec: 1657.68 - lr: 0.000004 - momentum: 0.000000
2023-10-17 12:28:58,768 epoch 10 - iter 1083/3617 - loss 0.00444223 - time (sec): 68.76 - samples/sec: 1625.52 - lr: 0.000004 - momentum: 0.000000
2023-10-17 12:29:22,388 epoch 10 - iter 1444/3617 - loss 0.00421888 - time (sec): 92.38 - samples/sec: 1628.53 - lr: 0.000003 - momentum: 0.000000
2023-10-17 12:29:45,256 epoch 10 - iter 1805/3617 - loss 0.00399761 - time (sec): 115.25 - samples/sec: 1631.71 - lr: 0.000003 - momentum: 0.000000
2023-10-17 12:30:07,935 epoch 10 - iter 2166/3617 - loss 0.00472994 - time (sec): 137.93 - samples/sec: 1641.01 - lr: 0.000002 - momentum: 0.000000
2023-10-17 12:30:31,147 epoch 10 - iter 2527/3617 - loss 0.00456748 - time (sec): 161.14 - samples/sec: 1631.95 - lr: 0.000002 - momentum: 0.000000
2023-10-17 12:30:54,698 epoch 10 - iter 2888/3617 - loss 0.00474495 - time (sec): 184.69 - samples/sec: 1636.72 - lr: 0.000001 - momentum: 0.000000
2023-10-17 12:31:16,709 epoch 10 - iter 3249/3617 - loss 0.00463207 - time (sec): 206.70 - samples/sec: 1652.97 - lr: 0.000001 - momentum: 0.000000
2023-10-17 12:31:40,217 epoch 10 - iter 3610/3617 - loss 0.00465381 - time (sec): 230.21 - samples/sec: 1647.46 - lr: 0.000000 - momentum: 0.000000
2023-10-17 12:31:40,665 ----------------------------------------------------------------------------------------------------
2023-10-17 12:31:40,665 EPOCH 10 done: loss 0.0046 - lr: 0.000000
2023-10-17 12:31:47,136 DEV : loss 0.43618330359458923 - f1-score (micro avg) 0.6512
2023-10-17 12:31:48,461 ----------------------------------------------------------------------------------------------------
2023-10-17 12:31:48,463 Loading model from best epoch ...
2023-10-17 12:31:50,256 SequenceTagger predicts: Dictionary with 13 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org
2023-10-17 12:31:58,239 Results:
- F-score (micro) 0.6485
- F-score (macro) 0.4908
- Accuracy 0.4915

By class:
              precision    recall  f1-score   support

         loc     0.6662    0.7733    0.7157       591
        pers     0.5293    0.7339    0.6150       357
         org     0.2353    0.1013    0.1416        79

   micro avg     0.5984    0.7079    0.6485      1027
   macro avg     0.4769    0.5361    0.4908      1027
weighted avg     0.5855    0.7079    0.6366      1027

2023-10-17 12:31:58,239 ----------------------------------------------------------------------------------------------------
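The final test scores above (micro F1 0.6485, with loc clearly strongest and org weakest at only 79 support) correspond to the best-epoch checkpoint saved as best-model.pt under the base path from this log. A minimal usage sketch for loading that checkpoint with Flair follows; the example sentence is made up and serves only to illustrate the call sequence.

# Minimal usage sketch: load the best checkpoint written by this run and tag one sentence.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "hmbench-letemps/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2/best-model.pt"
)

# made-up French example sentence
sentence = Sentence("M. Dufour est arrivé hier soir à Lausanne.")
tagger.predict(sentence)

# print predicted loc/pers/org spans with their confidence
for span in sentence.get_spans("ner"):
    label = span.get_label("ner")
    print(span.text, label.value, round(label.score, 2))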