2023-10-17 17:20:40,694 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,696 Model: "SequenceTagger(
(embeddings): TransformerWordEmbeddings(
(model): ElectraModel(
(embeddings): ElectraEmbeddings(
(word_embeddings): Embedding(32001, 768)
(position_embeddings): Embedding(512, 768)
(token_type_embeddings): Embedding(2, 768)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(encoder): ElectraEncoder(
(layer): ModuleList(
(0-11): 12 x ElectraLayer(
(attention): ElectraAttention(
(self): ElectraSelfAttention(
(query): Linear(in_features=768, out_features=768, bias=True)
(key): Linear(in_features=768, out_features=768, bias=True)
(value): Linear(in_features=768, out_features=768, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(output): ElectraSelfOutput(
(dense): Linear(in_features=768, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
(intermediate): ElectraIntermediate(
(dense): Linear(in_features=768, out_features=3072, bias=True)
(intermediate_act_fn): GELUActivation()
)
(output): ElectraOutput(
(dense): Linear(in_features=3072, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
)
)
)
(locked_dropout): LockedDropout(p=0.5)
(linear): Linear(in_features=768, out_features=21, bias=True)
(loss_function): CrossEntropyLoss()
)"
2023-10-17 17:20:40,696 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,696 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
- NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-10-17 17:20:40,697 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,697 Train: 3575 sentences
2023-10-17 17:20:40,697 (train_with_dev=False, train_with_test=False)
2023-10-17 17:20:40,697 ----------------------------------------------------------------------------------------------------
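The module printout above describes a plain token classifier: a fine-tunable ELECTRA encoder (the hmteams historic multilingual discriminator), locked dropout, and a single linear layer over 21 tags, with no RNN and no CRF, trained on the German HIPE-2020 split of the HIPE-2022 corpus. A minimal Flair sketch that builds an equivalent model is given below; the Hub id of the checkpoint and the NER_HIPE_2022 constructor arguments are assumptions inferred from the base path and dataset path in this log, not confirmed by it.

from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# German HIPE-2020 subset of the HIPE-2022 data (arguments are assumptions
# based on the dataset path logged above).
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner")

# Historic multilingual ELECTRA discriminator; the Hub id is assumed from the
# "teams-base-historic-multilingual-discriminator" base path.
embeddings = TransformerWordEmbeddings(
    model="hmteams/teams-base-historic-multilingual-discriminator",
    layers="-1",               # "layers-1" in the base path
    subtoken_pooling="first",  # "poolingfirst" in the base path
    fine_tune=True,
)

# No RNN, no CRF: matches the printed modules
# (locked_dropout -> linear -> CrossEntropyLoss).
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)
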
2023-10-17 17:20:40,697 Training Params:
2023-10-17 17:20:40,697 - learning_rate: "5e-05"
2023-10-17 17:20:40,697 - mini_batch_size: "4"
2023-10-17 17:20:40,697 - max_epochs: "10"
2023-10-17 17:20:40,697 - shuffle: "True"
2023-10-17 17:20:40,697 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,697 Plugins:
2023-10-17 17:20:40,697 - TensorboardLogger
2023-10-17 17:20:40,697 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 17:20:40,697 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,697 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 17:20:40,698 - metric: "('micro avg', 'f1-score')"
2023-10-17 17:20:40,698 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,698 Computation:
2023-10-17 17:20:40,698 - compute on device: cuda:0
2023-10-17 17:20:40,698 - embedding storage: none
2023-10-17 17:20:40,698 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,698 Model training base path: "hmbench-hipe2020/de-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-17 17:20:40,698 ----------------------------------------------------------------------------------------------------
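The training parameters, plugins, and base path above map onto Flair's fine-tuning entry point. Continuing the sketch above, a run with the same settings could look roughly like this; assuming a recent Flair version, fine_tune's default linear warmup (fraction 0.1) corresponds to the LinearScheduler line, and the TensorboardLogger plugin is omitted here.

from flair.trainers import ModelTrainer

# Continues the sketch above (tagger, corpus).
trainer = ModelTrainer(tagger, corpus)

trainer.fine_tune(
    "hmbench-hipe2020/de-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3",
    learning_rate=5e-05,
    mini_batch_size=4,
    max_epochs=10,
    # shuffle=True is the trainer default, matching the logged params
)
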
2023-10-17 17:20:40,698 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:40,698 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 17:20:47,605 epoch 1 - iter 89/894 - loss 3.03990857 - time (sec): 6.91 - samples/sec: 1132.15 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:20:54,757 epoch 1 - iter 178/894 - loss 1.78520040 - time (sec): 14.06 - samples/sec: 1212.12 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:21:01,831 epoch 1 - iter 267/894 - loss 1.32800676 - time (sec): 21.13 - samples/sec: 1221.44 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:21:08,879 epoch 1 - iter 356/894 - loss 1.08814028 - time (sec): 28.18 - samples/sec: 1205.04 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:21:15,816 epoch 1 - iter 445/894 - loss 0.93203560 - time (sec): 35.12 - samples/sec: 1211.74 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:21:23,001 epoch 1 - iter 534/894 - loss 0.79925903 - time (sec): 42.30 - samples/sec: 1240.41 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:21:30,033 epoch 1 - iter 623/894 - loss 0.72337780 - time (sec): 49.33 - samples/sec: 1236.78 - lr: 0.000035 - momentum: 0.000000
2023-10-17 17:21:36,976 epoch 1 - iter 712/894 - loss 0.66747105 - time (sec): 56.28 - samples/sec: 1228.23 - lr: 0.000040 - momentum: 0.000000
2023-10-17 17:21:44,256 epoch 1 - iter 801/894 - loss 0.61270239 - time (sec): 63.56 - samples/sec: 1231.53 - lr: 0.000045 - momentum: 0.000000
2023-10-17 17:21:51,315 epoch 1 - iter 890/894 - loss 0.57797410 - time (sec): 70.62 - samples/sec: 1218.76 - lr: 0.000050 - momentum: 0.000000
2023-10-17 17:21:51,635 ----------------------------------------------------------------------------------------------------
2023-10-17 17:21:51,636 EPOCH 1 done: loss 0.5755 - lr: 0.000050
2023-10-17 17:21:57,988 DEV : loss 0.19460029900074005 - f1-score (micro avg) 0.6137
2023-10-17 17:21:58,044 saving best model
2023-10-17 17:21:58,580 ----------------------------------------------------------------------------------------------------
2023-10-17 17:22:05,697 epoch 2 - iter 89/894 - loss 0.16003986 - time (sec): 7.12 - samples/sec: 1204.20 - lr: 0.000049 - momentum: 0.000000
2023-10-17 17:22:12,777 epoch 2 - iter 178/894 - loss 0.15424274 - time (sec): 14.20 - samples/sec: 1203.33 - lr: 0.000049 - momentum: 0.000000
2023-10-17 17:22:19,836 epoch 2 - iter 267/894 - loss 0.14896442 - time (sec): 21.25 - samples/sec: 1174.60 - lr: 0.000048 - momentum: 0.000000
2023-10-17 17:22:26,808 epoch 2 - iter 356/894 - loss 0.15842510 - time (sec): 28.23 - samples/sec: 1144.80 - lr: 0.000048 - momentum: 0.000000
2023-10-17 17:22:34,036 epoch 2 - iter 445/894 - loss 0.15722699 - time (sec): 35.45 - samples/sec: 1179.79 - lr: 0.000047 - momentum: 0.000000
2023-10-17 17:22:41,249 epoch 2 - iter 534/894 - loss 0.16060261 - time (sec): 42.67 - samples/sec: 1202.85 - lr: 0.000047 - momentum: 0.000000
2023-10-17 17:22:48,061 epoch 2 - iter 623/894 - loss 0.16329369 - time (sec): 49.48 - samples/sec: 1207.00 - lr: 0.000046 - momentum: 0.000000
2023-10-17 17:22:54,996 epoch 2 - iter 712/894 - loss 0.16030192 - time (sec): 56.41 - samples/sec: 1218.15 - lr: 0.000046 - momentum: 0.000000
2023-10-17 17:23:01,984 epoch 2 - iter 801/894 - loss 0.15853588 - time (sec): 63.40 - samples/sec: 1232.82 - lr: 0.000045 - momentum: 0.000000
2023-10-17 17:23:08,791 epoch 2 - iter 890/894 - loss 0.15689717 - time (sec): 70.21 - samples/sec: 1227.17 - lr: 0.000044 - momentum: 0.000000
2023-10-17 17:23:09,089 ----------------------------------------------------------------------------------------------------
2023-10-17 17:23:09,089 EPOCH 2 done: loss 0.1579 - lr: 0.000044
2023-10-17 17:23:20,463 DEV : loss 0.13591983914375305 - f1-score (micro avg) 0.7137
2023-10-17 17:23:20,518 saving best model
2023-10-17 17:23:21,901 ----------------------------------------------------------------------------------------------------
2023-10-17 17:23:28,588 epoch 3 - iter 89/894 - loss 0.10586379 - time (sec): 6.68 - samples/sec: 1291.96 - lr: 0.000044 - momentum: 0.000000
2023-10-17 17:23:34,965 epoch 3 - iter 178/894 - loss 0.09310855 - time (sec): 13.06 - samples/sec: 1355.00 - lr: 0.000043 - momentum: 0.000000
2023-10-17 17:23:41,263 epoch 3 - iter 267/894 - loss 0.08343594 - time (sec): 19.36 - samples/sec: 1365.90 - lr: 0.000043 - momentum: 0.000000
2023-10-17 17:23:47,523 epoch 3 - iter 356/894 - loss 0.09356011 - time (sec): 25.62 - samples/sec: 1338.16 - lr: 0.000042 - momentum: 0.000000
2023-10-17 17:23:53,919 epoch 3 - iter 445/894 - loss 0.09659849 - time (sec): 32.01 - samples/sec: 1347.39 - lr: 0.000042 - momentum: 0.000000
2023-10-17 17:24:01,292 epoch 3 - iter 534/894 - loss 0.09801984 - time (sec): 39.39 - samples/sec: 1326.81 - lr: 0.000041 - momentum: 0.000000
2023-10-17 17:24:08,278 epoch 3 - iter 623/894 - loss 0.09757948 - time (sec): 46.37 - samples/sec: 1308.60 - lr: 0.000041 - momentum: 0.000000
2023-10-17 17:24:15,229 epoch 3 - iter 712/894 - loss 0.09967059 - time (sec): 53.32 - samples/sec: 1299.91 - lr: 0.000040 - momentum: 0.000000
2023-10-17 17:24:22,205 epoch 3 - iter 801/894 - loss 0.09645513 - time (sec): 60.30 - samples/sec: 1297.95 - lr: 0.000039 - momentum: 0.000000
2023-10-17 17:24:29,257 epoch 3 - iter 890/894 - loss 0.09586911 - time (sec): 67.35 - samples/sec: 1279.48 - lr: 0.000039 - momentum: 0.000000
2023-10-17 17:24:29,568 ----------------------------------------------------------------------------------------------------
2023-10-17 17:24:29,569 EPOCH 3 done: loss 0.0958 - lr: 0.000039
2023-10-17 17:24:41,377 DEV : loss 0.2212122529745102 - f1-score (micro avg) 0.7327
2023-10-17 17:24:41,434 saving best model
2023-10-17 17:24:42,026 ----------------------------------------------------------------------------------------------------
2023-10-17 17:24:49,211 epoch 4 - iter 89/894 - loss 0.06634274 - time (sec): 7.18 - samples/sec: 1268.33 - lr: 0.000038 - momentum: 0.000000
2023-10-17 17:24:56,183 epoch 4 - iter 178/894 - loss 0.06540443 - time (sec): 14.15 - samples/sec: 1234.94 - lr: 0.000038 - momentum: 0.000000
2023-10-17 17:25:03,579 epoch 4 - iter 267/894 - loss 0.05971285 - time (sec): 21.55 - samples/sec: 1213.46 - lr: 0.000037 - momentum: 0.000000
2023-10-17 17:25:10,710 epoch 4 - iter 356/894 - loss 0.06243303 - time (sec): 28.68 - samples/sec: 1201.41 - lr: 0.000037 - momentum: 0.000000
2023-10-17 17:25:17,864 epoch 4 - iter 445/894 - loss 0.06441040 - time (sec): 35.84 - samples/sec: 1202.25 - lr: 0.000036 - momentum: 0.000000
2023-10-17 17:25:25,005 epoch 4 - iter 534/894 - loss 0.06566270 - time (sec): 42.98 - samples/sec: 1204.48 - lr: 0.000036 - momentum: 0.000000
2023-10-17 17:25:32,357 epoch 4 - iter 623/894 - loss 0.06750837 - time (sec): 50.33 - samples/sec: 1192.38 - lr: 0.000035 - momentum: 0.000000
2023-10-17 17:25:39,440 epoch 4 - iter 712/894 - loss 0.06895087 - time (sec): 57.41 - samples/sec: 1187.88 - lr: 0.000034 - momentum: 0.000000
2023-10-17 17:25:46,721 epoch 4 - iter 801/894 - loss 0.06819038 - time (sec): 64.69 - samples/sec: 1204.92 - lr: 0.000034 - momentum: 0.000000
2023-10-17 17:25:53,566 epoch 4 - iter 890/894 - loss 0.06982787 - time (sec): 71.54 - samples/sec: 1204.85 - lr: 0.000033 - momentum: 0.000000
2023-10-17 17:25:53,873 ----------------------------------------------------------------------------------------------------
2023-10-17 17:25:53,874 EPOCH 4 done: loss 0.0696 - lr: 0.000033
2023-10-17 17:26:05,574 DEV : loss 0.19206839799880981 - f1-score (micro avg) 0.7691
2023-10-17 17:26:05,632 saving best model
2023-10-17 17:26:07,033 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:14,126 epoch 5 - iter 89/894 - loss 0.02919367 - time (sec): 7.09 - samples/sec: 1173.67 - lr: 0.000033 - momentum: 0.000000
2023-10-17 17:26:21,132 epoch 5 - iter 178/894 - loss 0.03698239 - time (sec): 14.10 - samples/sec: 1227.83 - lr: 0.000032 - momentum: 0.000000
2023-10-17 17:26:28,344 epoch 5 - iter 267/894 - loss 0.03943436 - time (sec): 21.31 - samples/sec: 1265.01 - lr: 0.000032 - momentum: 0.000000
2023-10-17 17:26:35,428 epoch 5 - iter 356/894 - loss 0.03761578 - time (sec): 28.39 - samples/sec: 1224.66 - lr: 0.000031 - momentum: 0.000000
2023-10-17 17:26:42,329 epoch 5 - iter 445/894 - loss 0.03479110 - time (sec): 35.29 - samples/sec: 1246.89 - lr: 0.000031 - momentum: 0.000000
2023-10-17 17:26:49,445 epoch 5 - iter 534/894 - loss 0.03644592 - time (sec): 42.41 - samples/sec: 1241.59 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:26:56,505 epoch 5 - iter 623/894 - loss 0.03666781 - time (sec): 49.47 - samples/sec: 1231.49 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:27:03,948 epoch 5 - iter 712/894 - loss 0.03950081 - time (sec): 56.91 - samples/sec: 1211.83 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:27:11,156 epoch 5 - iter 801/894 - loss 0.04385206 - time (sec): 64.12 - samples/sec: 1215.42 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:27:18,349 epoch 5 - iter 890/894 - loss 0.04340555 - time (sec): 71.31 - samples/sec: 1208.53 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:27:18,668 ----------------------------------------------------------------------------------------------------
2023-10-17 17:27:18,668 EPOCH 5 done: loss 0.0433 - lr: 0.000028
2023-10-17 17:27:29,730 DEV : loss 0.20923025906085968 - f1-score (micro avg) 0.7762
2023-10-17 17:27:29,802 saving best model
2023-10-17 17:27:31,289 ----------------------------------------------------------------------------------------------------
2023-10-17 17:27:38,444 epoch 6 - iter 89/894 - loss 0.02754922 - time (sec): 7.15 - samples/sec: 1239.20 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:27:44,996 epoch 6 - iter 178/894 - loss 0.02795744 - time (sec): 13.70 - samples/sec: 1349.10 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:27:51,425 epoch 6 - iter 267/894 - loss 0.02495614 - time (sec): 20.13 - samples/sec: 1340.51 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:27:57,843 epoch 6 - iter 356/894 - loss 0.02469061 - time (sec): 26.55 - samples/sec: 1305.08 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:28:04,897 epoch 6 - iter 445/894 - loss 0.02518819 - time (sec): 33.60 - samples/sec: 1243.31 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:28:12,457 epoch 6 - iter 534/894 - loss 0.02419714 - time (sec): 41.16 - samples/sec: 1222.06 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:28:20,359 epoch 6 - iter 623/894 - loss 0.02647867 - time (sec): 49.07 - samples/sec: 1217.81 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:28:27,726 epoch 6 - iter 712/894 - loss 0.02578653 - time (sec): 56.43 - samples/sec: 1212.14 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:28:35,280 epoch 6 - iter 801/894 - loss 0.02692607 - time (sec): 63.99 - samples/sec: 1221.29 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:28:42,735 epoch 6 - iter 890/894 - loss 0.02725347 - time (sec): 71.44 - samples/sec: 1208.54 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:28:43,065 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:43,066 EPOCH 6 done: loss 0.0277 - lr: 0.000022
2023-10-17 17:28:54,470 DEV : loss 0.26610851287841797 - f1-score (micro avg) 0.7585
2023-10-17 17:28:54,554 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:02,164 epoch 7 - iter 89/894 - loss 0.01770895 - time (sec): 7.61 - samples/sec: 1218.30 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:29:09,554 epoch 7 - iter 178/894 - loss 0.01895195 - time (sec): 15.00 - samples/sec: 1160.73 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:29:16,836 epoch 7 - iter 267/894 - loss 0.01605232 - time (sec): 22.28 - samples/sec: 1176.64 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:29:23,965 epoch 7 - iter 356/894 - loss 0.01681043 - time (sec): 29.41 - samples/sec: 1160.67 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:29:30,951 epoch 7 - iter 445/894 - loss 0.01686728 - time (sec): 36.39 - samples/sec: 1160.61 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:29:38,090 epoch 7 - iter 534/894 - loss 0.01757958 - time (sec): 43.53 - samples/sec: 1172.66 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:29:45,052 epoch 7 - iter 623/894 - loss 0.01842906 - time (sec): 50.50 - samples/sec: 1175.54 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:29:52,094 epoch 7 - iter 712/894 - loss 0.01735328 - time (sec): 57.54 - samples/sec: 1182.60 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:29:59,028 epoch 7 - iter 801/894 - loss 0.01820784 - time (sec): 64.47 - samples/sec: 1187.92 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:30:06,149 epoch 7 - iter 890/894 - loss 0.01848393 - time (sec): 71.59 - samples/sec: 1203.67 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:30:06,468 ----------------------------------------------------------------------------------------------------
2023-10-17 17:30:06,468 EPOCH 7 done: loss 0.0185 - lr: 0.000017
2023-10-17 17:30:17,968 DEV : loss 0.2510668933391571 - f1-score (micro avg) 0.7811
2023-10-17 17:30:18,033 saving best model
2023-10-17 17:30:19,580 ----------------------------------------------------------------------------------------------------
2023-10-17 17:30:26,769 epoch 8 - iter 89/894 - loss 0.00642061 - time (sec): 7.18 - samples/sec: 1213.37 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:30:34,337 epoch 8 - iter 178/894 - loss 0.01382205 - time (sec): 14.75 - samples/sec: 1157.83 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:30:41,390 epoch 8 - iter 267/894 - loss 0.01323906 - time (sec): 21.81 - samples/sec: 1188.96 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:30:48,572 epoch 8 - iter 356/894 - loss 0.01361992 - time (sec): 28.99 - samples/sec: 1176.14 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:30:55,637 epoch 8 - iter 445/894 - loss 0.01168000 - time (sec): 36.05 - samples/sec: 1174.66 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:31:02,718 epoch 8 - iter 534/894 - loss 0.01271779 - time (sec): 43.13 - samples/sec: 1194.19 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:31:09,825 epoch 8 - iter 623/894 - loss 0.01196624 - time (sec): 50.24 - samples/sec: 1194.57 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:31:16,936 epoch 8 - iter 712/894 - loss 0.01341929 - time (sec): 57.35 - samples/sec: 1194.44 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:31:24,309 epoch 8 - iter 801/894 - loss 0.01292337 - time (sec): 64.72 - samples/sec: 1208.65 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:31:31,384 epoch 8 - iter 890/894 - loss 0.01284735 - time (sec): 71.80 - samples/sec: 1201.09 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:31:31,693 ----------------------------------------------------------------------------------------------------
2023-10-17 17:31:31,693 EPOCH 8 done: loss 0.0128 - lr: 0.000011
2023-10-17 17:31:43,205 DEV : loss 0.2856707274913788 - f1-score (micro avg) 0.7775
2023-10-17 17:31:43,272 ----------------------------------------------------------------------------------------------------
2023-10-17 17:31:50,628 epoch 9 - iter 89/894 - loss 0.00785870 - time (sec): 7.35 - samples/sec: 1221.70 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:31:57,992 epoch 9 - iter 178/894 - loss 0.00424210 - time (sec): 14.72 - samples/sec: 1303.06 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:32:05,110 epoch 9 - iter 267/894 - loss 0.00429337 - time (sec): 21.84 - samples/sec: 1258.65 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:32:12,161 epoch 9 - iter 356/894 - loss 0.00467458 - time (sec): 28.89 - samples/sec: 1234.06 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:32:19,435 epoch 9 - iter 445/894 - loss 0.00520006 - time (sec): 36.16 - samples/sec: 1237.67 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:32:26,684 epoch 9 - iter 534/894 - loss 0.00494801 - time (sec): 43.41 - samples/sec: 1235.71 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:32:33,721 epoch 9 - iter 623/894 - loss 0.00570335 - time (sec): 50.45 - samples/sec: 1228.77 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:32:40,703 epoch 9 - iter 712/894 - loss 0.00668224 - time (sec): 57.43 - samples/sec: 1219.17 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:32:47,813 epoch 9 - iter 801/894 - loss 0.00640754 - time (sec): 64.54 - samples/sec: 1211.90 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:32:54,862 epoch 9 - iter 890/894 - loss 0.00668945 - time (sec): 71.59 - samples/sec: 1203.41 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:32:55,172 ----------------------------------------------------------------------------------------------------
2023-10-17 17:32:55,173 EPOCH 9 done: loss 0.0067 - lr: 0.000006
2023-10-17 17:33:06,235 DEV : loss 0.2777999937534332 - f1-score (micro avg) 0.7784
2023-10-17 17:33:06,291 ----------------------------------------------------------------------------------------------------
2023-10-17 17:33:13,306 epoch 10 - iter 89/894 - loss 0.00657773 - time (sec): 7.01 - samples/sec: 1229.27 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:33:20,345 epoch 10 - iter 178/894 - loss 0.00391865 - time (sec): 14.05 - samples/sec: 1205.70 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:33:27,294 epoch 10 - iter 267/894 - loss 0.00273605 - time (sec): 21.00 - samples/sec: 1190.76 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:33:34,693 epoch 10 - iter 356/894 - loss 0.00243882 - time (sec): 28.40 - samples/sec: 1187.39 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:33:42,145 epoch 10 - iter 445/894 - loss 0.00353196 - time (sec): 35.85 - samples/sec: 1188.92 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:33:49,493 epoch 10 - iter 534/894 - loss 0.00395560 - time (sec): 43.20 - samples/sec: 1204.94 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:33:56,461 epoch 10 - iter 623/894 - loss 0.00434004 - time (sec): 50.17 - samples/sec: 1200.65 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:34:03,804 epoch 10 - iter 712/894 - loss 0.00458182 - time (sec): 57.51 - samples/sec: 1202.46 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:34:11,918 epoch 10 - iter 801/894 - loss 0.00468585 - time (sec): 65.62 - samples/sec: 1179.89 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:34:19,228 epoch 10 - iter 890/894 - loss 0.00452729 - time (sec): 72.94 - samples/sec: 1181.35 - lr: 0.000000 - momentum: 0.000000
2023-10-17 17:34:19,542 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:19,543 EPOCH 10 done: loss 0.0046 - lr: 0.000000
2023-10-17 17:34:30,411 DEV : loss 0.2798316776752472 - f1-score (micro avg) 0.7858
2023-10-17 17:34:30,474 saving best model
2023-10-17 17:34:32,485 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:32,487 Loading model from best epoch ...
2023-10-17 17:34:34,890 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-10-17 17:34:41,204
Results:
- F-score (micro) 0.7491
- F-score (macro) 0.6739
- Accuracy 0.6123
By class:
              precision    recall  f1-score   support

         loc     0.8522    0.8221    0.8369       596
        pers     0.7022    0.7508    0.7257       333
         org     0.5077    0.5000    0.5038       132
        prod     0.6383    0.4545    0.5310        66
        time     0.7500    0.7959    0.7723        49

   micro avg     0.7543    0.7440    0.7491      1176
   macro avg     0.6901    0.6647    0.6739      1176
weighted avg     0.7548    0.7440    0.7482      1176
2023-10-17 17:34:41,204 ----------------------------------------------------------------------------------------------------
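For reference, the best checkpoint saved by this run can be reloaded and applied to new text along the following lines; the example sentence is only an illustration.

from flair.data import Sentence
from flair.models import SequenceTagger

# Load the best checkpoint written during this run.
tagger = SequenceTagger.load(
    "hmbench-hipe2020/de-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3/best-model.pt"
)

sentence = Sentence("Der Congress von Wien wurde 1814 eröffnet .")
tagger.predict(sentence)

# Print predicted entity spans (loc, pers, org, prod, time).
for span in sentence.get_spans("ner"):
    print(span)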