2023-10-17 08:23:49,889 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,890 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 08:23:49,890 ----------------------------------------------------------------------------------------------------
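For reference, the architecture printed above (transformer embeddings feeding a LockedDropout and a single Linear(768, 25) head, with no RNN and no CRF) can be assembled in Flair roughly as follows. This is a minimal sketch, not taken from the training script: the embedding name is inferred from the model base path logged below, and the corpus arguments are assumptions.

```python
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# corpus arguments are assumptions based on the dataset path logged below
corpus = NER_HIPE_2022(dataset_name="ajmc", language="de", add_document_separator=True)
label_dictionary = corpus.make_label_dictionary(label_type="ner")

# "-1" layer and "first" subtoken pooling mirror "layers-1" / "poolingfirst" in the base path
embeddings = TransformerWordEmbeddings(
    model="hmteams/teams-base-historic-multilingual-discriminator",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# no RNN, no CRF and no reprojection leaves exactly the modules printed above:
# TransformerWordEmbeddings -> LockedDropout(0.5) -> Linear(768 -> 25) with CrossEntropyLoss
tagger = SequenceTagger(
    hidden_size=256,                 # unused when use_rnn=False
    embeddings=embeddings,
    tag_dictionary=label_dictionary,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)
```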
2023-10-17 08:23:49,891 MultiCorpus: 1100 train + 206 dev + 240 test sentences
 - NER_HIPE_2022 Corpus: 1100 train + 206 dev + 240 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/de/with_doc_seperator
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,891 Train: 1100 sentences
2023-10-17 08:23:49,891 (train_with_dev=False, train_with_test=False)
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,891 Training Params:
2023-10-17 08:23:49,891 - learning_rate: "3e-05"
2023-10-17 08:23:49,891 - mini_batch_size: "8"
2023-10-17 08:23:49,891 - max_epochs: "10"
2023-10-17 08:23:49,891 - shuffle: "True"
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,891 Plugins:
2023-10-17 08:23:49,891 - TensorboardLogger
2023-10-17 08:23:49,891 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,891 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 08:23:49,891 - metric: "('micro avg', 'f1-score')"
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,891 Computation:
2023-10-17 08:23:49,891 - compute on device: cuda:0
2023-10-17 08:23:49,891 - embedding storage: none
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,891 Model training base path: "hmbench-ajmc/de-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,891 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:49,892 Logging anything other than scalars to TensorBoard is currently not supported.
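The hyperparameters and plugins logged above correspond to a standard Flair fine-tuning call. Below is a minimal sketch, assuming the `tagger` and `corpus` objects from the previous sketch; the actual script that produced this log is not shown here. The LinearScheduler with warmup_fraction 0.1 and the "none" embedding storage mode are the `fine_tune` defaults, so they are not passed explicitly.

```python
from flair.trainers import ModelTrainer
from flair.trainers.plugins import TensorboardLogger

# `tagger` and `corpus` are the objects built in the previous sketch
trainer = ModelTrainer(tagger, corpus)

trainer.fine_tune(
    "hmbench-ajmc/de-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1",
    learning_rate=3e-05,
    mini_batch_size=8,
    max_epochs=10,
    shuffle=True,
    # fine_tune attaches the linear warmup schedule (warmup_fraction 0.1) seen in the log;
    # the TensorBoard plugin is added explicitly, with an arbitrary log directory
    plugins=[TensorboardLogger(log_dir="tensorboard")],
)
```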
2023-10-17 08:23:50,594 epoch 1 - iter 13/138 - loss 3.49809368 - time (sec): 0.70 - samples/sec: 2981.66 - lr: 0.000003 - momentum: 0.000000
2023-10-17 08:23:51,325 epoch 1 - iter 26/138 - loss 3.16010129 - time (sec): 1.43 - samples/sec: 2869.08 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:23:52,079 epoch 1 - iter 39/138 - loss 2.75377183 - time (sec): 2.19 - samples/sec: 2827.93 - lr: 0.000008 - momentum: 0.000000
2023-10-17 08:23:52,821 epoch 1 - iter 52/138 - loss 2.28687909 - time (sec): 2.93 - samples/sec: 2854.62 - lr: 0.000011 - momentum: 0.000000
2023-10-17 08:23:53,636 epoch 1 - iter 65/138 - loss 1.95114714 - time (sec): 3.74 - samples/sec: 2856.95 - lr: 0.000014 - momentum: 0.000000
2023-10-17 08:23:54,405 epoch 1 - iter 78/138 - loss 1.72964478 - time (sec): 4.51 - samples/sec: 2894.38 - lr: 0.000017 - momentum: 0.000000
2023-10-17 08:23:55,133 epoch 1 - iter 91/138 - loss 1.58551835 - time (sec): 5.24 - samples/sec: 2887.28 - lr: 0.000020 - momentum: 0.000000
2023-10-17 08:23:55,867 epoch 1 - iter 104/138 - loss 1.44460998 - time (sec): 5.97 - samples/sec: 2872.53 - lr: 0.000022 - momentum: 0.000000
2023-10-17 08:23:56,599 epoch 1 - iter 117/138 - loss 1.33447301 - time (sec): 6.71 - samples/sec: 2878.42 - lr: 0.000025 - momentum: 0.000000
2023-10-17 08:23:57,359 epoch 1 - iter 130/138 - loss 1.24505589 - time (sec): 7.47 - samples/sec: 2871.22 - lr: 0.000028 - momentum: 0.000000
2023-10-17 08:23:57,831 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:57,832 EPOCH 1 done: loss 1.1903 - lr: 0.000028
2023-10-17 08:23:58,635 DEV : loss 0.25952714681625366 - f1-score (micro avg) 0.5734
2023-10-17 08:23:58,640 saving best model
2023-10-17 08:23:58,991 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:59,776 epoch 2 - iter 13/138 - loss 0.36122040 - time (sec): 0.78 - samples/sec: 2940.77 - lr: 0.000030 - momentum: 0.000000
2023-10-17 08:24:00,551 epoch 2 - iter 26/138 - loss 0.29793619 - time (sec): 1.56 - samples/sec: 2803.04 - lr: 0.000029 - momentum: 0.000000
2023-10-17 08:24:01,345 epoch 2 - iter 39/138 - loss 0.29683545 - time (sec): 2.35 - samples/sec: 2834.85 - lr: 0.000029 - momentum: 0.000000
2023-10-17 08:24:02,059 epoch 2 - iter 52/138 - loss 0.27965672 - time (sec): 3.07 - samples/sec: 2887.46 - lr: 0.000029 - momentum: 0.000000
2023-10-17 08:24:02,814 epoch 2 - iter 65/138 - loss 0.26522882 - time (sec): 3.82 - samples/sec: 2822.61 - lr: 0.000028 - momentum: 0.000000
2023-10-17 08:24:03,584 epoch 2 - iter 78/138 - loss 0.25845854 - time (sec): 4.59 - samples/sec: 2842.78 - lr: 0.000028 - momentum: 0.000000
2023-10-17 08:24:04,310 epoch 2 - iter 91/138 - loss 0.24532746 - time (sec): 5.32 - samples/sec: 2837.29 - lr: 0.000028 - momentum: 0.000000
2023-10-17 08:24:05,035 epoch 2 - iter 104/138 - loss 0.23679523 - time (sec): 6.04 - samples/sec: 2817.38 - lr: 0.000028 - momentum: 0.000000
2023-10-17 08:24:05,793 epoch 2 - iter 117/138 - loss 0.22779001 - time (sec): 6.80 - samples/sec: 2820.76 - lr: 0.000027 - momentum: 0.000000
2023-10-17 08:24:06,552 epoch 2 - iter 130/138 - loss 0.21987489 - time (sec): 7.56 - samples/sec: 2844.58 - lr: 0.000027 - momentum: 0.000000
2023-10-17 08:24:06,990 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:06,990 EPOCH 2 done: loss 0.2117 - lr: 0.000027
2023-10-17 08:24:07,641 DEV : loss 0.1423654556274414 - f1-score (micro avg) 0.821
2023-10-17 08:24:07,646 saving best model
2023-10-17 08:24:08,104 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:08,847 epoch 3 - iter 13/138 - loss 0.11206862 - time (sec): 0.74 - samples/sec: 2751.90 - lr: 0.000026 - momentum: 0.000000
2023-10-17 08:24:09,603 epoch 3 - iter 26/138 - loss 0.12075396 - time (sec): 1.50 - samples/sec: 2966.69 - lr: 0.000026 - momentum: 0.000000
2023-10-17 08:24:10,359 epoch 3 - iter 39/138 - loss 0.11031964 - time (sec): 2.25 - samples/sec: 2838.36 - lr: 0.000026 - momentum: 0.000000
2023-10-17 08:24:11,070 epoch 3 - iter 52/138 - loss 0.10673496 - time (sec): 2.96 - samples/sec: 2812.99 - lr: 0.000025 - momentum: 0.000000
2023-10-17 08:24:11,916 epoch 3 - iter 65/138 - loss 0.10361395 - time (sec): 3.81 - samples/sec: 2804.42 - lr: 0.000025 - momentum: 0.000000
2023-10-17 08:24:12,631 epoch 3 - iter 78/138 - loss 0.10060112 - time (sec): 4.53 - samples/sec: 2797.79 - lr: 0.000025 - momentum: 0.000000
2023-10-17 08:24:13,374 epoch 3 - iter 91/138 - loss 0.10082389 - time (sec): 5.27 - samples/sec: 2851.39 - lr: 0.000025 - momentum: 0.000000
2023-10-17 08:24:14,117 epoch 3 - iter 104/138 - loss 0.10284591 - time (sec): 6.01 - samples/sec: 2865.92 - lr: 0.000024 - momentum: 0.000000
2023-10-17 08:24:14,844 epoch 3 - iter 117/138 - loss 0.10302089 - time (sec): 6.74 - samples/sec: 2870.62 - lr: 0.000024 - momentum: 0.000000
2023-10-17 08:24:15,564 epoch 3 - iter 130/138 - loss 0.11214884 - time (sec): 7.46 - samples/sec: 2894.73 - lr: 0.000024 - momentum: 0.000000
2023-10-17 08:24:16,008 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:16,009 EPOCH 3 done: loss 0.1101 - lr: 0.000024
2023-10-17 08:24:16,647 DEV : loss 0.12610985338687897 - f1-score (micro avg) 0.8235
2023-10-17 08:24:16,652 saving best model
2023-10-17 08:24:17,087 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:17,830 epoch 4 - iter 13/138 - loss 0.05881840 - time (sec): 0.74 - samples/sec: 3208.55 - lr: 0.000023 - momentum: 0.000000
2023-10-17 08:24:18,531 epoch 4 - iter 26/138 - loss 0.06629479 - time (sec): 1.44 - samples/sec: 3022.27 - lr: 0.000023 - momentum: 0.000000
2023-10-17 08:24:19,259 epoch 4 - iter 39/138 - loss 0.06677456 - time (sec): 2.17 - samples/sec: 2991.15 - lr: 0.000022 - momentum: 0.000000
2023-10-17 08:24:20,005 epoch 4 - iter 52/138 - loss 0.06676999 - time (sec): 2.91 - samples/sec: 2924.52 - lr: 0.000022 - momentum: 0.000000
2023-10-17 08:24:20,771 epoch 4 - iter 65/138 - loss 0.07977619 - time (sec): 3.68 - samples/sec: 2966.27 - lr: 0.000022 - momentum: 0.000000
2023-10-17 08:24:21,479 epoch 4 - iter 78/138 - loss 0.07637301 - time (sec): 4.39 - samples/sec: 2938.14 - lr: 0.000021 - momentum: 0.000000
2023-10-17 08:24:22,202 epoch 4 - iter 91/138 - loss 0.07559118 - time (sec): 5.11 - samples/sec: 2939.35 - lr: 0.000021 - momentum: 0.000000
2023-10-17 08:24:22,960 epoch 4 - iter 104/138 - loss 0.07595665 - time (sec): 5.87 - samples/sec: 2941.79 - lr: 0.000021 - momentum: 0.000000
2023-10-17 08:24:23,715 epoch 4 - iter 117/138 - loss 0.07666906 - time (sec): 6.62 - samples/sec: 2936.63 - lr: 0.000021 - momentum: 0.000000
2023-10-17 08:24:24,440 epoch 4 - iter 130/138 - loss 0.07499893 - time (sec): 7.35 - samples/sec: 2916.68 - lr: 0.000020 - momentum: 0.000000
2023-10-17 08:24:24,933 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:24,934 EPOCH 4 done: loss 0.0750 - lr: 0.000020
2023-10-17 08:24:25,568 DEV : loss 0.1342494636774063 - f1-score (micro avg) 0.8682
2023-10-17 08:24:25,573 saving best model
2023-10-17 08:24:26,043 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:26,805 epoch 5 - iter 13/138 - loss 0.08072435 - time (sec): 0.76 - samples/sec: 2888.21 - lr: 0.000020 - momentum: 0.000000
2023-10-17 08:24:27,541 epoch 5 - iter 26/138 - loss 0.07670776 - time (sec): 1.50 - samples/sec: 2979.35 - lr: 0.000019 - momentum: 0.000000
2023-10-17 08:24:28,299 epoch 5 - iter 39/138 - loss 0.05993711 - time (sec): 2.25 - samples/sec: 2912.87 - lr: 0.000019 - momentum: 0.000000
2023-10-17 08:24:29,007 epoch 5 - iter 52/138 - loss 0.07266534 - time (sec): 2.96 - samples/sec: 2929.15 - lr: 0.000019 - momentum: 0.000000
2023-10-17 08:24:29,736 epoch 5 - iter 65/138 - loss 0.06792529 - time (sec): 3.69 - samples/sec: 2885.36 - lr: 0.000018 - momentum: 0.000000
2023-10-17 08:24:30,479 epoch 5 - iter 78/138 - loss 0.06388406 - time (sec): 4.43 - samples/sec: 2845.55 - lr: 0.000018 - momentum: 0.000000
2023-10-17 08:24:31,200 epoch 5 - iter 91/138 - loss 0.06078525 - time (sec): 5.15 - samples/sec: 2884.15 - lr: 0.000018 - momentum: 0.000000
2023-10-17 08:24:31,989 epoch 5 - iter 104/138 - loss 0.05854605 - time (sec): 5.94 - samples/sec: 2885.72 - lr: 0.000018 - momentum: 0.000000
2023-10-17 08:24:32,737 epoch 5 - iter 117/138 - loss 0.06512071 - time (sec): 6.69 - samples/sec: 2915.97 - lr: 0.000017 - momentum: 0.000000
2023-10-17 08:24:33,453 epoch 5 - iter 130/138 - loss 0.06281187 - time (sec): 7.41 - samples/sec: 2911.31 - lr: 0.000017 - momentum: 0.000000
2023-10-17 08:24:33,933 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:33,934 EPOCH 5 done: loss 0.0607 - lr: 0.000017
2023-10-17 08:24:34,581 DEV : loss 0.14152704179286957 - f1-score (micro avg) 0.8568
2023-10-17 08:24:34,586 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:35,308 epoch 6 - iter 13/138 - loss 0.02968530 - time (sec): 0.72 - samples/sec: 2875.23 - lr: 0.000016 - momentum: 0.000000
2023-10-17 08:24:36,102 epoch 6 - iter 26/138 - loss 0.04733392 - time (sec): 1.51 - samples/sec: 3078.82 - lr: 0.000016 - momentum: 0.000000
2023-10-17 08:24:36,808 epoch 6 - iter 39/138 - loss 0.04456611 - time (sec): 2.22 - samples/sec: 3059.39 - lr: 0.000016 - momentum: 0.000000
2023-10-17 08:24:37,515 epoch 6 - iter 52/138 - loss 0.04191343 - time (sec): 2.93 - samples/sec: 2998.65 - lr: 0.000015 - momentum: 0.000000
2023-10-17 08:24:38,264 epoch 6 - iter 65/138 - loss 0.03634236 - time (sec): 3.68 - samples/sec: 2963.01 - lr: 0.000015 - momentum: 0.000000
2023-10-17 08:24:39,043 epoch 6 - iter 78/138 - loss 0.03869706 - time (sec): 4.46 - samples/sec: 2941.49 - lr: 0.000015 - momentum: 0.000000
2023-10-17 08:24:39,777 epoch 6 - iter 91/138 - loss 0.04330849 - time (sec): 5.19 - samples/sec: 2932.89 - lr: 0.000015 - momentum: 0.000000
2023-10-17 08:24:40,523 epoch 6 - iter 104/138 - loss 0.04643481 - time (sec): 5.94 - samples/sec: 2908.17 - lr: 0.000014 - momentum: 0.000000
2023-10-17 08:24:41,234 epoch 6 - iter 117/138 - loss 0.04267826 - time (sec): 6.65 - samples/sec: 2903.73 - lr: 0.000014 - momentum: 0.000000
2023-10-17 08:24:41,988 epoch 6 - iter 130/138 - loss 0.04498568 - time (sec): 7.40 - samples/sec: 2899.92 - lr: 0.000014 - momentum: 0.000000
2023-10-17 08:24:42,430 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:42,430 EPOCH 6 done: loss 0.0511 - lr: 0.000014
2023-10-17 08:24:43,072 DEV : loss 0.15315352380275726 - f1-score (micro avg) 0.8651
2023-10-17 08:24:43,077 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:43,776 epoch 7 - iter 13/138 - loss 0.02477330 - time (sec): 0.70 - samples/sec: 2750.58 - lr: 0.000013 - momentum: 0.000000
2023-10-17 08:24:44,504 epoch 7 - iter 26/138 - loss 0.03433648 - time (sec): 1.43 - samples/sec: 2827.47 - lr: 0.000013 - momentum: 0.000000
2023-10-17 08:24:45,246 epoch 7 - iter 39/138 - loss 0.05028591 - time (sec): 2.17 - samples/sec: 2834.75 - lr: 0.000012 - momentum: 0.000000
2023-10-17 08:24:46,031 epoch 7 - iter 52/138 - loss 0.04693637 - time (sec): 2.95 - samples/sec: 2792.13 - lr: 0.000012 - momentum: 0.000000
2023-10-17 08:24:46,811 epoch 7 - iter 65/138 - loss 0.04956954 - time (sec): 3.73 - samples/sec: 2844.28 - lr: 0.000012 - momentum: 0.000000
2023-10-17 08:24:47,575 epoch 7 - iter 78/138 - loss 0.04284061 - time (sec): 4.50 - samples/sec: 2816.74 - lr: 0.000012 - momentum: 0.000000
2023-10-17 08:24:48,289 epoch 7 - iter 91/138 - loss 0.04173745 - time (sec): 5.21 - samples/sec: 2829.99 - lr: 0.000011 - momentum: 0.000000
2023-10-17 08:24:49,019 epoch 7 - iter 104/138 - loss 0.03971604 - time (sec): 5.94 - samples/sec: 2850.45 - lr: 0.000011 - momentum: 0.000000
2023-10-17 08:24:49,809 epoch 7 - iter 117/138 - loss 0.03842371 - time (sec): 6.73 - samples/sec: 2864.69 - lr: 0.000011 - momentum: 0.000000
2023-10-17 08:24:50,545 epoch 7 - iter 130/138 - loss 0.03787472 - time (sec): 7.47 - samples/sec: 2872.71 - lr: 0.000010 - momentum: 0.000000
2023-10-17 08:24:51,067 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:51,067 EPOCH 7 done: loss 0.0371 - lr: 0.000010
2023-10-17 08:24:51,704 DEV : loss 0.16822752356529236 - f1-score (micro avg) 0.8691
2023-10-17 08:24:51,708 saving best model
2023-10-17 08:24:52,153 ----------------------------------------------------------------------------------------------------
2023-10-17 08:24:52,907 epoch 8 - iter 13/138 - loss 0.03009041 - time (sec): 0.75 - samples/sec: 2792.32 - lr: 0.000010 - momentum: 0.000000
2023-10-17 08:24:53,660 epoch 8 - iter 26/138 - loss 0.04413822 - time (sec): 1.50 - samples/sec: 2898.69 - lr: 0.000009 - momentum: 0.000000
2023-10-17 08:24:54,389 epoch 8 - iter 39/138 - loss 0.04619894 - time (sec): 2.23 - samples/sec: 2929.53 - lr: 0.000009 - momentum: 0.000000
2023-10-17 08:24:55,144 epoch 8 - iter 52/138 - loss 0.04070750 - time (sec): 2.99 - samples/sec: 2919.08 - lr: 0.000009 - momentum: 0.000000
2023-10-17 08:24:55,937 epoch 8 - iter 65/138 - loss 0.04008572 - time (sec): 3.78 - samples/sec: 2898.93 - lr: 0.000009 - momentum: 0.000000
2023-10-17 08:24:56,665 epoch 8 - iter 78/138 - loss 0.03593628 - time (sec): 4.51 - samples/sec: 2871.59 - lr: 0.000008 - momentum: 0.000000
2023-10-17 08:24:57,405 epoch 8 - iter 91/138 - loss 0.03219739 - time (sec): 5.25 - samples/sec: 2903.86 - lr: 0.000008 - momentum: 0.000000
2023-10-17 08:24:58,149 epoch 8 - iter 104/138 - loss 0.03225186 - time (sec): 5.99 - samples/sec: 2879.86 - lr: 0.000008 - momentum: 0.000000
2023-10-17 08:24:58,877 epoch 8 - iter 117/138 - loss 0.03519688 - time (sec): 6.72 - samples/sec: 2894.29 - lr: 0.000007 - momentum: 0.000000
2023-10-17 08:24:59,631 epoch 8 - iter 130/138 - loss 0.03625943 - time (sec): 7.48 - samples/sec: 2893.82 - lr: 0.000007 - momentum: 0.000000
2023-10-17 08:25:00,043 ----------------------------------------------------------------------------------------------------
2023-10-17 08:25:00,043 EPOCH 8 done: loss 0.0352 - lr: 0.000007
2023-10-17 08:25:00,676 DEV : loss 0.18648800253868103 - f1-score (micro avg) 0.8666
2023-10-17 08:25:00,680 ----------------------------------------------------------------------------------------------------
2023-10-17 08:25:01,408 epoch 9 - iter 13/138 - loss 0.03893965 - time (sec): 0.73 - samples/sec: 2862.38 - lr: 0.000006 - momentum: 0.000000
2023-10-17 08:25:02,179 epoch 9 - iter 26/138 - loss 0.04375172 - time (sec): 1.50 - samples/sec: 2692.68 - lr: 0.000006 - momentum: 0.000000
2023-10-17 08:25:02,936 epoch 9 - iter 39/138 - loss 0.03344957 - time (sec): 2.25 - samples/sec: 2740.30 - lr: 0.000006 - momentum: 0.000000
2023-10-17 08:25:03,732 epoch 9 - iter 52/138 - loss 0.03076279 - time (sec): 3.05 - samples/sec: 2744.75 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:25:04,507 epoch 9 - iter 65/138 - loss 0.02960535 - time (sec): 3.83 - samples/sec: 2753.12 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:25:05,304 epoch 9 - iter 78/138 - loss 0.02789208 - time (sec): 4.62 - samples/sec: 2804.12 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:25:06,032 epoch 9 - iter 91/138 - loss 0.02960035 - time (sec): 5.35 - samples/sec: 2832.72 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:25:06,736 epoch 9 - iter 104/138 - loss 0.02787373 - time (sec): 6.05 - samples/sec: 2801.84 - lr: 0.000004 - momentum: 0.000000
2023-10-17 08:25:07,466 epoch 9 - iter 117/138 - loss 0.02791744 - time (sec): 6.78 - samples/sec: 2790.48 - lr: 0.000004 - momentum: 0.000000
2023-10-17 08:25:08,219 epoch 9 - iter 130/138 - loss 0.02832701 - time (sec): 7.54 - samples/sec: 2807.81 - lr: 0.000004 - momentum: 0.000000
2023-10-17 08:25:08,691 ----------------------------------------------------------------------------------------------------
2023-10-17 08:25:08,692 EPOCH 9 done: loss 0.0290 - lr: 0.000004
2023-10-17 08:25:09,498 DEV : loss 0.1811751425266266 - f1-score (micro avg) 0.873
2023-10-17 08:25:09,503 saving best model
2023-10-17 08:25:09,958 ----------------------------------------------------------------------------------------------------
2023-10-17 08:25:10,708 epoch 10 - iter 13/138 - loss 0.02736013 - time (sec): 0.75 - samples/sec: 2783.95 - lr: 0.000003 - momentum: 0.000000
2023-10-17 08:25:11,467 epoch 10 - iter 26/138 - loss 0.01588076 - time (sec): 1.51 - samples/sec: 2687.45 - lr: 0.000003 - momentum: 0.000000
2023-10-17 08:25:12,193 epoch 10 - iter 39/138 - loss 0.02320217 - time (sec): 2.23 - samples/sec: 2756.03 - lr: 0.000002 - momentum: 0.000000
2023-10-17 08:25:12,995 epoch 10 - iter 52/138 - loss 0.01910415 - time (sec): 3.03 - samples/sec: 2712.81 - lr: 0.000002 - momentum: 0.000000
2023-10-17 08:25:13,769 epoch 10 - iter 65/138 - loss 0.01657925 - time (sec): 3.81 - samples/sec: 2753.39 - lr: 0.000002 - momentum: 0.000000
2023-10-17 08:25:14,521 epoch 10 - iter 78/138 - loss 0.02142734 - time (sec): 4.56 - samples/sec: 2764.95 - lr: 0.000002 - momentum: 0.000000
2023-10-17 08:25:15,307 epoch 10 - iter 91/138 - loss 0.02070130 - time (sec): 5.35 - samples/sec: 2793.49 - lr: 0.000001 - momentum: 0.000000
2023-10-17 08:25:16,062 epoch 10 - iter 104/138 - loss 0.02375901 - time (sec): 6.10 - samples/sec: 2805.40 - lr: 0.000001 - momentum: 0.000000
2023-10-17 08:25:16,787 epoch 10 - iter 117/138 - loss 0.02537797 - time (sec): 6.83 - samples/sec: 2837.20 - lr: 0.000001 - momentum: 0.000000
2023-10-17 08:25:17,515 epoch 10 - iter 130/138 - loss 0.02350727 - time (sec): 7.55 - samples/sec: 2868.95 - lr: 0.000000 - momentum: 0.000000
2023-10-17 08:25:17,944 ----------------------------------------------------------------------------------------------------
2023-10-17 08:25:17,944 EPOCH 10 done: loss 0.0236 - lr: 0.000000
2023-10-17 08:25:18,599 DEV : loss 0.18580147624015808 - f1-score (micro avg) 0.8696
2023-10-17 08:25:18,935 ----------------------------------------------------------------------------------------------------
2023-10-17 08:25:18,936 Loading model from best epoch ...
2023-10-17 08:25:20,230 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
2023-10-17 08:25:20,863
Results:
- F-score (micro) 0.9096
- F-score (macro) 0.7442
- Accuracy 0.8505

By class:
              precision    recall  f1-score   support

       scope     0.8977    0.8977    0.8977       176
        pers     0.9839    0.9531    0.9683       128
        work     0.8333    0.8784    0.8553        74
         loc     1.0000    1.0000    1.0000         2
      object     0.0000    0.0000    0.0000         2

   micro avg     0.9108    0.9084    0.9096       382
   macro avg     0.7430    0.7458    0.7442       382
weighted avg     0.9100    0.9084    0.9090       382

2023-10-17 08:25:20,863 ----------------------------------------------------------------------------------------------------
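Once training has finished, the saved checkpoint can be loaded and applied like any other Flair tagger. Below is a minimal usage sketch: the checkpoint path is assembled from the base path logged above, the example sentence is invented, and the label type is assumed to be "ner".

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# adjust the path to wherever best-model.pt was written
tagger = SequenceTagger.load(
    "hmbench-ajmc/de-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1/best-model.pt"
)

sentence = Sentence("Aeschylos, Agamemnon, erklärt von Karl Wilhelm Schneider.")
tagger.predict(sentence)

# spans are decoded from the BIOES tags listed above (scope, pers, work, loc, object, date)
for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value, round(span.get_label("ner").score, 3))
```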