Upload ./training.log with huggingface_hub
2023-10-23 15:45:20,702 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,703 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-23 15:45:20,703 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,703 MultiCorpus: 1100 train + 206 dev + 240 test sentences
 - NER_HIPE_2022 Corpus: 1100 train + 206 dev + 240 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/de/with_doc_seperator
2023-10-23 15:45:20,703 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,703 Train: 1100 sentences
2023-10-23 15:45:20,703 (train_with_dev=False, train_with_test=False)
2023-10-23 15:45:20,704 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,704 Training Params:
2023-10-23 15:45:20,704 - learning_rate: "3e-05"
2023-10-23 15:45:20,704 - mini_batch_size: "8"
2023-10-23 15:45:20,704 - max_epochs: "10"
2023-10-23 15:45:20,704 - shuffle: "True"
2023-10-23 15:45:20,704 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,704 Plugins:
2023-10-23 15:45:20,704 - TensorboardLogger
2023-10-23 15:45:20,704 - LinearScheduler | warmup_fraction: '0.1'
2023-10-23 15:45:20,704 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,704 Final evaluation on model from best epoch (best-model.pt)
2023-10-23 15:45:20,704 - metric: "('micro avg', 'f1-score')"
2023-10-23 15:45:20,704 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,704 Computation:
2023-10-23 15:45:20,704 - compute on device: cuda:0
2023-10-23 15:45:20,704 - embedding storage: none
2023-10-23 15:45:20,704 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,704 Model training base path: "hmbench-ajmc/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
2023-10-23 15:45:20,704 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,704 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:20,704 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-23 15:45:21,458 epoch 1 - iter 13/138 - loss 3.19724143 - time (sec): 0.75 - samples/sec: 2797.24 - lr: 0.000003 - momentum: 0.000000
2023-10-23 15:45:22,219 epoch 1 - iter 26/138 - loss 2.76719642 - time (sec): 1.51 - samples/sec: 2812.76 - lr: 0.000005 - momentum: 0.000000
2023-10-23 15:45:22,939 epoch 1 - iter 39/138 - loss 2.27990881 - time (sec): 2.23 - samples/sec: 2769.62 - lr: 0.000008 - momentum: 0.000000
2023-10-23 15:45:23,707 epoch 1 - iter 52/138 - loss 1.87919582 - time (sec): 3.00 - samples/sec: 2883.32 - lr: 0.000011 - momentum: 0.000000
2023-10-23 15:45:24,444 epoch 1 - iter 65/138 - loss 1.68296880 - time (sec): 3.74 - samples/sec: 2837.09 - lr: 0.000014 - momentum: 0.000000
2023-10-23 15:45:25,198 epoch 1 - iter 78/138 - loss 1.47784903 - time (sec): 4.49 - samples/sec: 2858.47 - lr: 0.000017 - momentum: 0.000000
2023-10-23 15:45:25,974 epoch 1 - iter 91/138 - loss 1.33890195 - time (sec): 5.27 - samples/sec: 2799.70 - lr: 0.000020 - momentum: 0.000000
2023-10-23 15:45:26,758 epoch 1 - iter 104/138 - loss 1.19646330 - time (sec): 6.05 - samples/sec: 2862.84 - lr: 0.000022 - momentum: 0.000000
2023-10-23 15:45:27,491 epoch 1 - iter 117/138 - loss 1.09649998 - time (sec): 6.79 - samples/sec: 2869.11 - lr: 0.000025 - momentum: 0.000000
2023-10-23 15:45:28,210 epoch 1 - iter 130/138 - loss 1.02207775 - time (sec): 7.50 - samples/sec: 2888.81 - lr: 0.000028 - momentum: 0.000000
2023-10-23 15:45:28,673 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:28,673 EPOCH 1 done: loss 0.9842 - lr: 0.000028
2023-10-23 15:45:29,262 DEV : loss 0.2637031376361847 - f1-score (micro avg)  0.6608
2023-10-23 15:45:29,268 saving best model
2023-10-23 15:45:29,668 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:30,399 epoch 2 - iter 13/138 - loss 0.28211232 - time (sec): 0.73 - samples/sec: 2905.37 - lr: 0.000030 - momentum: 0.000000
2023-10-23 15:45:31,138 epoch 2 - iter 26/138 - loss 0.23645117 - time (sec): 1.47 - samples/sec: 3020.01 - lr: 0.000029 - momentum: 0.000000
2023-10-23 15:45:31,865 epoch 2 - iter 39/138 - loss 0.22703358 - time (sec): 2.20 - samples/sec: 3079.58 - lr: 0.000029 - momentum: 0.000000
2023-10-23 15:45:32,595 epoch 2 - iter 52/138 - loss 0.20824501 - time (sec): 2.93 - samples/sec: 3097.73 - lr: 0.000029 - momentum: 0.000000
2023-10-23 15:45:33,325 epoch 2 - iter 65/138 - loss 0.20724269 - time (sec): 3.66 - samples/sec: 3049.60 - lr: 0.000028 - momentum: 0.000000
2023-10-23 15:45:34,054 epoch 2 - iter 78/138 - loss 0.19916937 - time (sec): 4.38 - samples/sec: 3051.76 - lr: 0.000028 - momentum: 0.000000
2023-10-23 15:45:34,793 epoch 2 - iter 91/138 - loss 0.19649440 - time (sec): 5.12 - samples/sec: 3032.09 - lr: 0.000028 - momentum: 0.000000
2023-10-23 15:45:35,531 epoch 2 - iter 104/138 - loss 0.19268046 - time (sec): 5.86 - samples/sec: 3006.97 - lr: 0.000028 - momentum: 0.000000
2023-10-23 15:45:36,270 epoch 2 - iter 117/138 - loss 0.18827503 - time (sec): 6.60 - samples/sec: 2961.56 - lr: 0.000027 - momentum: 0.000000
2023-10-23 15:45:36,989 epoch 2 - iter 130/138 - loss 0.18348349 - time (sec): 7.32 - samples/sec: 2951.24 - lr: 0.000027 - momentum: 0.000000
2023-10-23 15:45:37,429 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:37,429 EPOCH 2 done: loss 0.1787 - lr: 0.000027
2023-10-23 15:45:37,959 DEV : loss 0.11417855322360992 - f1-score (micro avg)  0.8383
2023-10-23 15:45:37,965 saving best model
2023-10-23 15:45:38,486 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:39,203 epoch 3 - iter 13/138 - loss 0.10319665 - time (sec): 0.72 - samples/sec: 2882.11 - lr: 0.000026 - momentum: 0.000000
2023-10-23 15:45:39,918 epoch 3 - iter 26/138 - loss 0.12345856 - time (sec): 1.43 - samples/sec: 2842.55 - lr: 0.000026 - momentum: 0.000000
2023-10-23 15:45:40,633 epoch 3 - iter 39/138 - loss 0.10479944 - time (sec): 2.15 - samples/sec: 2931.77 - lr: 0.000026 - momentum: 0.000000
2023-10-23 15:45:41,366 epoch 3 - iter 52/138 - loss 0.09914611 - time (sec): 2.88 - samples/sec: 2899.42 - lr: 0.000025 - momentum: 0.000000
2023-10-23 15:45:42,091 epoch 3 - iter 65/138 - loss 0.09899777 - time (sec): 3.60 - samples/sec: 2853.25 - lr: 0.000025 - momentum: 0.000000
2023-10-23 15:45:42,825 epoch 3 - iter 78/138 - loss 0.09442590 - time (sec): 4.34 - samples/sec: 2890.44 - lr: 0.000025 - momentum: 0.000000
2023-10-23 15:45:43,552 epoch 3 - iter 91/138 - loss 0.09932234 - time (sec): 5.06 - samples/sec: 2903.82 - lr: 0.000025 - momentum: 0.000000
2023-10-23 15:45:44,285 epoch 3 - iter 104/138 - loss 0.09677320 - time (sec): 5.80 - samples/sec: 2914.91 - lr: 0.000024 - momentum: 0.000000
2023-10-23 15:45:45,028 epoch 3 - iter 117/138 - loss 0.09735892 - time (sec): 6.54 - samples/sec: 2929.20 - lr: 0.000024 - momentum: 0.000000
2023-10-23 15:45:45,787 epoch 3 - iter 130/138 - loss 0.09842761 - time (sec): 7.30 - samples/sec: 2957.18 - lr: 0.000024 - momentum: 0.000000
2023-10-23 15:45:46,248 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:46,248 EPOCH 3 done: loss 0.0976 - lr: 0.000024
2023-10-23 15:45:46,785 DEV : loss 0.11052478104829788 - f1-score (micro avg)  0.8561
2023-10-23 15:45:46,791 saving best model
2023-10-23 15:45:47,338 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:48,055 epoch 4 - iter 13/138 - loss 0.05429172 - time (sec): 0.72 - samples/sec: 3108.04 - lr: 0.000023 - momentum: 0.000000
2023-10-23 15:45:48,769 epoch 4 - iter 26/138 - loss 0.06880590 - time (sec): 1.43 - samples/sec: 2926.31 - lr: 0.000023 - momentum: 0.000000
2023-10-23 15:45:49,509 epoch 4 - iter 39/138 - loss 0.06536197 - time (sec): 2.17 - samples/sec: 2878.70 - lr: 0.000022 - momentum: 0.000000
2023-10-23 15:45:50,329 epoch 4 - iter 52/138 - loss 0.06633934 - time (sec): 2.99 - samples/sec: 2785.88 - lr: 0.000022 - momentum: 0.000000
2023-10-23 15:45:51,093 epoch 4 - iter 65/138 - loss 0.05923637 - time (sec): 3.75 - samples/sec: 2732.95 - lr: 0.000022 - momentum: 0.000000
2023-10-23 15:45:51,893 epoch 4 - iter 78/138 - loss 0.06130330 - time (sec): 4.55 - samples/sec: 2740.28 - lr: 0.000021 - momentum: 0.000000
2023-10-23 15:45:52,664 epoch 4 - iter 91/138 - loss 0.06339566 - time (sec): 5.32 - samples/sec: 2772.24 - lr: 0.000021 - momentum: 0.000000
2023-10-23 15:45:53,410 epoch 4 - iter 104/138 - loss 0.06287406 - time (sec): 6.07 - samples/sec: 2804.47 - lr: 0.000021 - momentum: 0.000000
2023-10-23 15:45:54,195 epoch 4 - iter 117/138 - loss 0.06738814 - time (sec): 6.86 - samples/sec: 2815.45 - lr: 0.000021 - momentum: 0.000000
2023-10-23 15:45:54,976 epoch 4 - iter 130/138 - loss 0.06977085 - time (sec): 7.64 - samples/sec: 2801.72 - lr: 0.000020 - momentum: 0.000000
2023-10-23 15:45:55,461 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:55,462 EPOCH 4 done: loss 0.0682 - lr: 0.000020
2023-10-23 15:45:56,002 DEV : loss 0.12191484868526459 - f1-score (micro avg)  0.8525
2023-10-23 15:45:56,008 ----------------------------------------------------------------------------------------------------
2023-10-23 15:45:56,772 epoch 5 - iter 13/138 - loss 0.05204482 - time (sec): 0.76 - samples/sec: 2725.06 - lr: 0.000020 - momentum: 0.000000
2023-10-23 15:45:57,535 epoch 5 - iter 26/138 - loss 0.04270482 - time (sec): 1.53 - samples/sec: 2765.10 - lr: 0.000019 - momentum: 0.000000
2023-10-23 15:45:58,316 epoch 5 - iter 39/138 - loss 0.05189148 - time (sec): 2.31 - samples/sec: 2775.05 - lr: 0.000019 - momentum: 0.000000
2023-10-23 15:45:59,077 epoch 5 - iter 52/138 - loss 0.05066165 - time (sec): 3.07 - samples/sec: 2810.13 - lr: 0.000019 - momentum: 0.000000
2023-10-23 15:45:59,823 epoch 5 - iter 65/138 - loss 0.04849827 - time (sec): 3.81 - samples/sec: 2810.73 - lr: 0.000018 - momentum: 0.000000
2023-10-23 15:46:00,558 epoch 5 - iter 78/138 - loss 0.05075800 - time (sec): 4.55 - samples/sec: 2872.86 - lr: 0.000018 - momentum: 0.000000
2023-10-23 15:46:01,281 epoch 5 - iter 91/138 - loss 0.05255442 - time (sec): 5.27 - samples/sec: 2855.43 - lr: 0.000018 - momentum: 0.000000
2023-10-23 15:46:02,021 epoch 5 - iter 104/138 - loss 0.05109410 - time (sec): 6.01 - samples/sec: 2826.82 - lr: 0.000018 - momentum: 0.000000
2023-10-23 15:46:02,759 epoch 5 - iter 117/138 - loss 0.05205831 - time (sec): 6.75 - samples/sec: 2876.34 - lr: 0.000017 - momentum: 0.000000
2023-10-23 15:46:03,501 epoch 5 - iter 130/138 - loss 0.05049339 - time (sec): 7.49 - samples/sec: 2884.60 - lr: 0.000017 - momentum: 0.000000
2023-10-23 15:46:03,952 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:03,952 EPOCH 5 done: loss 0.0485 - lr: 0.000017
2023-10-23 15:46:04,497 DEV : loss 0.13699379563331604 - f1-score (micro avg)  0.8779
2023-10-23 15:46:04,503 saving best model
2023-10-23 15:46:05,045 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:05,792 epoch 6 - iter 13/138 - loss 0.02970998 - time (sec): 0.74 - samples/sec: 2833.02 - lr: 0.000016 - momentum: 0.000000
2023-10-23 15:46:06,563 epoch 6 - iter 26/138 - loss 0.04688615 - time (sec): 1.51 - samples/sec: 2901.61 - lr: 0.000016 - momentum: 0.000000
2023-10-23 15:46:07,321 epoch 6 - iter 39/138 - loss 0.04223240 - time (sec): 2.27 - samples/sec: 2860.93 - lr: 0.000016 - momentum: 0.000000
2023-10-23 15:46:08,072 epoch 6 - iter 52/138 - loss 0.03535856 - time (sec): 3.02 - samples/sec: 2838.96 - lr: 0.000015 - momentum: 0.000000
2023-10-23 15:46:08,804 epoch 6 - iter 65/138 - loss 0.03293734 - time (sec): 3.75 - samples/sec: 2851.14 - lr: 0.000015 - momentum: 0.000000
2023-10-23 15:46:09,564 epoch 6 - iter 78/138 - loss 0.03132917 - time (sec): 4.51 - samples/sec: 2937.99 - lr: 0.000015 - momentum: 0.000000
2023-10-23 15:46:10,329 epoch 6 - iter 91/138 - loss 0.03035412 - time (sec): 5.28 - samples/sec: 2925.95 - lr: 0.000015 - momentum: 0.000000
2023-10-23 15:46:11,059 epoch 6 - iter 104/138 - loss 0.03635085 - time (sec): 6.01 - samples/sec: 2897.63 - lr: 0.000014 - momentum: 0.000000
2023-10-23 15:46:11,799 epoch 6 - iter 117/138 - loss 0.03792811 - time (sec): 6.75 - samples/sec: 2906.81 - lr: 0.000014 - momentum: 0.000000
2023-10-23 15:46:12,534 epoch 6 - iter 130/138 - loss 0.03734866 - time (sec): 7.48 - samples/sec: 2868.22 - lr: 0.000014 - momentum: 0.000000
2023-10-23 15:46:12,987 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:12,988 EPOCH 6 done: loss 0.0372 - lr: 0.000014
2023-10-23 15:46:13,519 DEV : loss 0.139994814991951 - f1-score (micro avg)  0.8785
2023-10-23 15:46:13,524 saving best model
2023-10-23 15:46:14,054 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:14,817 epoch 7 - iter 13/138 - loss 0.00842643 - time (sec): 0.76 - samples/sec: 2797.23 - lr: 0.000013 - momentum: 0.000000
2023-10-23 15:46:15,575 epoch 7 - iter 26/138 - loss 0.01783786 - time (sec): 1.52 - samples/sec: 2846.49 - lr: 0.000013 - momentum: 0.000000
2023-10-23 15:46:16,341 epoch 7 - iter 39/138 - loss 0.04117613 - time (sec): 2.28 - samples/sec: 2841.39 - lr: 0.000012 - momentum: 0.000000
2023-10-23 15:46:17,120 epoch 7 - iter 52/138 - loss 0.03560924 - time (sec): 3.06 - samples/sec: 2927.84 - lr: 0.000012 - momentum: 0.000000
2023-10-23 15:46:17,877 epoch 7 - iter 65/138 - loss 0.03352044 - time (sec): 3.82 - samples/sec: 2870.05 - lr: 0.000012 - momentum: 0.000000
2023-10-23 15:46:18,622 epoch 7 - iter 78/138 - loss 0.03277726 - time (sec): 4.56 - samples/sec: 2886.77 - lr: 0.000012 - momentum: 0.000000
2023-10-23 15:46:19,352 epoch 7 - iter 91/138 - loss 0.03439841 - time (sec): 5.29 - samples/sec: 2866.31 - lr: 0.000011 - momentum: 0.000000
2023-10-23 15:46:20,108 epoch 7 - iter 104/138 - loss 0.03154978 - time (sec): 6.05 - samples/sec: 2891.32 - lr: 0.000011 - momentum: 0.000000
2023-10-23 15:46:20,868 epoch 7 - iter 117/138 - loss 0.03023976 - time (sec): 6.81 - samples/sec: 2891.55 - lr: 0.000011 - momentum: 0.000000
2023-10-23 15:46:21,610 epoch 7 - iter 130/138 - loss 0.02991660 - time (sec): 7.55 - samples/sec: 2868.59 - lr: 0.000010 - momentum: 0.000000
2023-10-23 15:46:22,079 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:22,079 EPOCH 7 done: loss 0.0299 - lr: 0.000010
2023-10-23 15:46:22,617 DEV : loss 0.14777213335037231 - f1-score (micro avg)  0.897
2023-10-23 15:46:22,623 saving best model
2023-10-23 15:46:23,152 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:23,905 epoch 8 - iter 13/138 - loss 0.00644670 - time (sec): 0.75 - samples/sec: 2716.21 - lr: 0.000010 - momentum: 0.000000
2023-10-23 15:46:24,659 epoch 8 - iter 26/138 - loss 0.01964341 - time (sec): 1.50 - samples/sec: 2951.47 - lr: 0.000009 - momentum: 0.000000
2023-10-23 15:46:25,389 epoch 8 - iter 39/138 - loss 0.01628448 - time (sec): 2.23 - samples/sec: 2947.00 - lr: 0.000009 - momentum: 0.000000
2023-10-23 15:46:26,144 epoch 8 - iter 52/138 - loss 0.02420344 - time (sec): 2.99 - samples/sec: 2926.35 - lr: 0.000009 - momentum: 0.000000
2023-10-23 15:46:26,896 epoch 8 - iter 65/138 - loss 0.02319288 - time (sec): 3.74 - samples/sec: 2863.46 - lr: 0.000009 - momentum: 0.000000
2023-10-23 15:46:27,676 epoch 8 - iter 78/138 - loss 0.02179369 - time (sec): 4.52 - samples/sec: 2866.48 - lr: 0.000008 - momentum: 0.000000
2023-10-23 15:46:28,411 epoch 8 - iter 91/138 - loss 0.02597878 - time (sec): 5.26 - samples/sec: 2886.46 - lr: 0.000008 - momentum: 0.000000
2023-10-23 15:46:29,155 epoch 8 - iter 104/138 - loss 0.02487449 - time (sec): 6.00 - samples/sec: 2852.95 - lr: 0.000008 - momentum: 0.000000
2023-10-23 15:46:29,879 epoch 8 - iter 117/138 - loss 0.02372089 - time (sec): 6.72 - samples/sec: 2861.21 - lr: 0.000007 - momentum: 0.000000
2023-10-23 15:46:30,603 epoch 8 - iter 130/138 - loss 0.02257563 - time (sec): 7.45 - samples/sec: 2886.61 - lr: 0.000007 - momentum: 0.000000
2023-10-23 15:46:31,039 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:31,039 EPOCH 8 done: loss 0.0222 - lr: 0.000007
2023-10-23 15:46:31,571 DEV : loss 0.15574227273464203 - f1-score (micro avg)  0.8838
2023-10-23 15:46:31,577 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:32,300 epoch 9 - iter 13/138 - loss 0.00660491 - time (sec): 0.72 - samples/sec: 2887.41 - lr: 0.000006 - momentum: 0.000000
2023-10-23 15:46:33,022 epoch 9 - iter 26/138 - loss 0.01041455 - time (sec): 1.44 - samples/sec: 2851.88 - lr: 0.000006 - momentum: 0.000000
2023-10-23 15:46:33,740 epoch 9 - iter 39/138 - loss 0.01005111 - time (sec): 2.16 - samples/sec: 2913.14 - lr: 0.000006 - momentum: 0.000000
2023-10-23 15:46:34,486 epoch 9 - iter 52/138 - loss 0.01334769 - time (sec): 2.91 - samples/sec: 2895.30 - lr: 0.000005 - momentum: 0.000000
2023-10-23 15:46:35,204 epoch 9 - iter 65/138 - loss 0.01186627 - time (sec): 3.63 - samples/sec: 2936.89 - lr: 0.000005 - momentum: 0.000000
2023-10-23 15:46:35,960 epoch 9 - iter 78/138 - loss 0.01168564 - time (sec): 4.38 - samples/sec: 2939.27 - lr: 0.000005 - momentum: 0.000000
2023-10-23 15:46:36,703 epoch 9 - iter 91/138 - loss 0.01330314 - time (sec): 5.13 - samples/sec: 2952.30 - lr: 0.000005 - momentum: 0.000000
2023-10-23 15:46:37,441 epoch 9 - iter 104/138 - loss 0.01202213 - time (sec): 5.86 - samples/sec: 2950.18 - lr: 0.000004 - momentum: 0.000000
2023-10-23 15:46:38,187 epoch 9 - iter 117/138 - loss 0.01286806 - time (sec): 6.61 - samples/sec: 2935.01 - lr: 0.000004 - momentum: 0.000000
2023-10-23 15:46:38,919 epoch 9 - iter 130/138 - loss 0.01425818 - time (sec): 7.34 - samples/sec: 2935.17 - lr: 0.000004 - momentum: 0.000000
2023-10-23 15:46:39,337 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:39,337 EPOCH 9 done: loss 0.0178 - lr: 0.000004
2023-10-23 15:46:39,872 DEV : loss 0.16283106803894043 - f1-score (micro avg)  0.8916
2023-10-23 15:46:39,877 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:40,602 epoch 10 - iter 13/138 - loss 0.00433738 - time (sec): 0.72 - samples/sec: 2968.37 - lr: 0.000003 - momentum: 0.000000
2023-10-23 15:46:41,333 epoch 10 - iter 26/138 - loss 0.00331970 - time (sec): 1.45 - samples/sec: 2916.40 - lr: 0.000003 - momentum: 0.000000
2023-10-23 15:46:42,080 epoch 10 - iter 39/138 - loss 0.00840588 - time (sec): 2.20 - samples/sec: 3008.39 - lr: 0.000002 - momentum: 0.000000
2023-10-23 15:46:42,816 epoch 10 - iter 52/138 - loss 0.00903499 - time (sec): 2.94 - samples/sec: 2948.34 - lr: 0.000002 - momentum: 0.000000
2023-10-23 15:46:43,536 epoch 10 - iter 65/138 - loss 0.01232935 - time (sec): 3.66 - samples/sec: 2906.92 - lr: 0.000002 - momentum: 0.000000
2023-10-23 15:46:44,266 epoch 10 - iter 78/138 - loss 0.01223517 - time (sec): 4.39 - samples/sec: 2933.48 - lr: 0.000002 - momentum: 0.000000
2023-10-23 15:46:44,999 epoch 10 - iter 91/138 - loss 0.01251053 - time (sec): 5.12 - samples/sec: 2922.86 - lr: 0.000001 - momentum: 0.000000
2023-10-23 15:46:45,750 epoch 10 - iter 104/138 - loss 0.01268933 - time (sec): 5.87 - samples/sec: 2919.33 - lr: 0.000001 - momentum: 0.000000
2023-10-23 15:46:46,455 epoch 10 - iter 117/138 - loss 0.01157601 - time (sec): 6.58 - samples/sec: 2928.72 - lr: 0.000001 - momentum: 0.000000
2023-10-23 15:46:47,182 epoch 10 - iter 130/138 - loss 0.01176484 - time (sec): 7.30 - samples/sec: 2944.52 - lr: 0.000000 - momentum: 0.000000
2023-10-23 15:46:47,618 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:47,619 EPOCH 10 done: loss 0.0140 - lr: 0.000000
2023-10-23 15:46:48,151 DEV : loss 0.16455598175525665 - f1-score (micro avg)  0.8926
2023-10-23 15:46:48,549 ----------------------------------------------------------------------------------------------------
2023-10-23 15:46:48,550 Loading model from best epoch ...
2023-10-23 15:46:50,228 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
2023-10-23 15:46:50,881
Results:
- F-score (micro) 0.9084
- F-score (macro) 0.8767
- Accuracy 0.8382

By class:
              precision    recall  f1-score   support

       scope     0.8771    0.8920    0.8845       176
        pers     0.9542    0.9766    0.9653       128
        work     0.8986    0.8378    0.8671        74
      object     1.0000    1.0000    1.0000         2
         loc     1.0000    0.5000    0.6667         2

   micro avg     0.9084    0.9084    0.9084       382
   macro avg     0.9460    0.8413    0.8767       382
weighted avg     0.9084    0.9084    0.9077       382

2023-10-23 15:46:50,881 ----------------------------------------------------------------------------------------------------
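As an aside, the per-epoch validation scores in a log like this can be pulled out programmatically. Below is a minimal sketch: the `dev_scores` helper and its regex are my own, not part of Flair or huggingface_hub; it only assumes the `DEV : loss ... f1-score (micro avg) ...` line format seen above.

```python
import re

# Hypothetical helper for a Flair-style training log: each epoch ends with a
# line of the form
#   <timestamp> DEV : loss <float> - f1-score (micro avg)  <float>
# so a single regex over the file recovers the dev micro-F1 per epoch.
DEV_RE = re.compile(r"DEV : loss [\d.]+ - f1-score \(micro avg\)\s+([\d.]+)")

def dev_scores(log_text: str) -> list[float]:
    """Return the dev micro-F1 reported after each epoch, in log order."""
    return [float(m.group(1)) for m in DEV_RE.finditer(log_text)]

# Two lines copied from the log above, as a smoke test.
sample = (
    "2023-10-23 15:45:29,262 DEV : loss 0.2637031376361847 - f1-score (micro avg)  0.6608\n"
    "2023-10-23 15:45:37,959 DEV : loss 0.11417855322360992 - f1-score (micro avg)  0.8383\n"
)
print(dev_scores(sample))  # [0.6608, 0.8383]
```

Run over the full log this yields one score per epoch (0.6608 through 0.8926), matching the "saving best model" decisions recorded after each DEV line.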