Upload folder using huggingface_hub
- best-model.pt +3 -0
- dev.tsv +0 -0
- loss.tsv +11 -0
- runs/events.out.tfevents.1697540060.bce904bcef33.2023.1 +3 -0
- test.tsv +0 -0
- training.log +237 -0
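The commit itself does not show the upload call, but a folder upload like this is typically done with `HfApi.upload_folder` from `huggingface_hub`. A minimal sketch, assuming a hypothetical local output directory and target repository:

```python
# Minimal sketch: pushing a local Flair training-output folder to the Hub.
# folder_path and repo_id are placeholders, not taken from this commit.
from huggingface_hub import HfApi

api = HfApi()  # picks up the token from `huggingface-cli login` by default
api.upload_folder(
    folder_path="./resources/taggers/ner-icdar-fr",  # contains best-model.pt, loss.tsv, training.log, runs/, ...
    repo_id="your-username/your-model-repo",
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)
```

Large binaries such as best-model.pt are stored via Git LFS, which is why only their LFS pointer text appears in the diffs below.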
best-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:680446c99fc3ed5ae28e1ca18d3d30019b8ce12d3e2590d5201b6fcaf9e59806
+size 440941957
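The three lines above are only the Git LFS pointer for the ~440 MB checkpoint, not the weights themselves. A sketch of fetching and using the checkpoint with Flair, assuming a placeholder repo_id and that the repository layout matches this commit:

```python
# Sketch: download best-model.pt from the Hub and tag one sentence with Flair.
# repo_id is a placeholder; the filename matches the file added in this commit.
from flair.data import Sentence
from flair.models import SequenceTagger
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(repo_id="your-username/your-model-repo", filename="best-model.pt")
tagger = SequenceTagger.load(model_path)

sentence = Sentence("Le maire de Paris a visité Marseille .")
tagger.predict(sentence)
for span in sentence.get_spans("ner"):
    print(span)
```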
dev.tsv
ADDED
The diff for this file is too large to render.
loss.tsv
ADDED
@@ -0,0 +1,11 @@
+EPOCH  TIMESTAMP  LEARNING_RATE  TRAIN_LOSS  DEV_LOSS  DEV_PRECISION  DEV_RECALL  DEV_F1  DEV_ACCURACY
+1      10:55:53   0.0000         0.3305      0.3863    0.0169         0.0023      0.0040  0.0020
+2      10:57:27   0.0000         0.1296      0.1098    0.7564         0.6991      0.7266  0.5830
+3      10:59:03   0.0000         0.0927      0.1182    0.7643         0.7704      0.7673  0.6418
+4      11:00:37   0.0000         0.0722      0.1527    0.7703         0.7624      0.7663  0.6358
+5      11:02:11   0.0000         0.0539      0.1992    0.7436         0.7579      0.7507  0.6198
+6      11:03:47   0.0000         0.0429      0.1953    0.7357         0.7839      0.7590  0.6317
+7      11:05:23   0.0000         0.0292      0.2106    0.7495         0.7817      0.7652  0.6386
+8      11:06:57   0.0000         0.0189      0.2515    0.7465         0.7828      0.7642  0.6337
+9      11:08:31   0.0000         0.0136      0.2389    0.7415         0.7885      0.7643  0.6354
+10     11:10:03   0.0000         0.0076      0.2602    0.7566         0.7805      0.7684  0.6389
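loss.tsv is the per-epoch metrics table written by Flair's trainer (tab-separated, one row per epoch). A quick way to inspect it, assuming pandas is available:

```python
# Sketch: load loss.tsv (tab-separated) and report the best dev-F1 epoch.
import pandas as pd

df = pd.read_csv("loss.tsv", sep="\t")
best = df.loc[df["DEV_F1"].idxmax()]
print(f"best dev F1 = {best['DEV_F1']:.4f} at epoch {int(best['EPOCH'])}")
```

For this run the dev F1 peaks at 0.7684 in epoch 10, which is the checkpoint saved as best-model.pt.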
runs/events.out.tfevents.1697540060.bce904bcef33.2023.1
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:52ea632b95cf1b2e6d58dd1a39bf0ad051a86a0a6cd194ee7ef8f9c7c23a1d95
+size 1108164
test.tsv
ADDED
The diff for this file is too large to render.
training.log
ADDED
@@ -0,0 +1,237 @@
+2023-10-17 10:54:20,438 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,439 Model: "SequenceTagger(
+  (embeddings): TransformerWordEmbeddings(
+    (model): ElectraModel(
+      (embeddings): ElectraEmbeddings(
+        (word_embeddings): Embedding(32001, 768)
+        (position_embeddings): Embedding(512, 768)
+        (token_type_embeddings): Embedding(2, 768)
+        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+      (encoder): ElectraEncoder(
+        (layer): ModuleList(
+          (0-11): 12 x ElectraLayer(
+            (attention): ElectraAttention(
+              (self): ElectraSelfAttention(
+                (query): Linear(in_features=768, out_features=768, bias=True)
+                (key): Linear(in_features=768, out_features=768, bias=True)
+                (value): Linear(in_features=768, out_features=768, bias=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (output): ElectraSelfOutput(
+                (dense): Linear(in_features=768, out_features=768, bias=True)
+                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+            (intermediate): ElectraIntermediate(
+              (dense): Linear(in_features=768, out_features=3072, bias=True)
+              (intermediate_act_fn): GELUActivation()
+            )
+            (output): ElectraOutput(
+              (dense): Linear(in_features=3072, out_features=768, bias=True)
+              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+        )
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=768, out_features=13, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-17 10:54:20,439 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 MultiCorpus: 7936 train + 992 dev + 992 test sentences
+ - NER_ICDAR_EUROPEANA Corpus: 7936 train + 992 dev + 992 test sentences - /root/.flair/datasets/ner_icdar_europeana/fr
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 Train: 7936 sentences
+2023-10-17 10:54:20,440 (train_with_dev=False, train_with_test=False)
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 Training Params:
+2023-10-17 10:54:20,440 - learning_rate: "5e-05"
+2023-10-17 10:54:20,440 - mini_batch_size: "4"
+2023-10-17 10:54:20,440 - max_epochs: "10"
+2023-10-17 10:54:20,440 - shuffle: "True"
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 Plugins:
+2023-10-17 10:54:20,440 - TensorboardLogger
+2023-10-17 10:54:20,440 - LinearScheduler | warmup_fraction: '0.1'
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 Final evaluation on model from best epoch (best-model.pt)
+2023-10-17 10:54:20,440 - metric: "('micro avg', 'f1-score')"
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 Computation:
+2023-10-17 10:54:20,440 - compute on device: cuda:0
+2023-10-17 10:54:20,440 - embedding storage: none
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 Model training base path: "hmbench-icdar/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:54:20,440 Logging anything other than scalars to TensorBoard is currently not supported.
+2023-10-17 10:54:28,907 epoch 1 - iter 198/1984 - loss 1.62211158 - time (sec): 8.47 - samples/sec: 1911.31 - lr: 0.000005 - momentum: 0.000000
+2023-10-17 10:54:37,933 epoch 1 - iter 396/1984 - loss 0.94619822 - time (sec): 17.49 - samples/sec: 1900.28 - lr: 0.000010 - momentum: 0.000000
+2023-10-17 10:54:47,160 epoch 1 - iter 594/1984 - loss 0.70711697 - time (sec): 26.72 - samples/sec: 1857.08 - lr: 0.000015 - momentum: 0.000000
+2023-10-17 10:54:56,036 epoch 1 - iter 792/1984 - loss 0.58185081 - time (sec): 35.59 - samples/sec: 1837.14 - lr: 0.000020 - momentum: 0.000000
+2023-10-17 10:55:04,916 epoch 1 - iter 990/1984 - loss 0.49792196 - time (sec): 44.47 - samples/sec: 1840.07 - lr: 0.000025 - momentum: 0.000000
+2023-10-17 10:55:14,026 epoch 1 - iter 1188/1984 - loss 0.43844628 - time (sec): 53.58 - samples/sec: 1840.99 - lr: 0.000030 - momentum: 0.000000
+2023-10-17 10:55:22,978 epoch 1 - iter 1386/1984 - loss 0.39736182 - time (sec): 62.54 - samples/sec: 1835.63 - lr: 0.000035 - momentum: 0.000000
+2023-10-17 10:55:32,299 epoch 1 - iter 1584/1984 - loss 0.36824360 - time (sec): 71.86 - samples/sec: 1827.19 - lr: 0.000040 - momentum: 0.000000
+2023-10-17 10:55:41,543 epoch 1 - iter 1782/1984 - loss 0.34485394 - time (sec): 81.10 - samples/sec: 1815.57 - lr: 0.000045 - momentum: 0.000000
+2023-10-17 10:55:50,632 epoch 1 - iter 1980/1984 - loss 0.33011193 - time (sec): 90.19 - samples/sec: 1814.56 - lr: 0.000050 - momentum: 0.000000
+2023-10-17 10:55:50,815 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:55:50,815 EPOCH 1 done: loss 0.3305 - lr: 0.000050
+2023-10-17 10:55:53,856 DEV : loss 0.3863277733325958 - f1-score (micro avg) 0.004
+2023-10-17 10:55:53,876 saving best model
+2023-10-17 10:55:54,244 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:56:03,257 epoch 2 - iter 198/1984 - loss 0.16096443 - time (sec): 9.01 - samples/sec: 1876.10 - lr: 0.000049 - momentum: 0.000000
+2023-10-17 10:56:11,979 epoch 2 - iter 396/1984 - loss 0.14525930 - time (sec): 17.73 - samples/sec: 1867.53 - lr: 0.000049 - momentum: 0.000000
+2023-10-17 10:56:20,750 epoch 2 - iter 594/1984 - loss 0.14335165 - time (sec): 26.50 - samples/sec: 1872.28 - lr: 0.000048 - momentum: 0.000000
+2023-10-17 10:56:29,643 epoch 2 - iter 792/1984 - loss 0.13927613 - time (sec): 35.40 - samples/sec: 1877.08 - lr: 0.000048 - momentum: 0.000000
+2023-10-17 10:56:38,576 epoch 2 - iter 990/1984 - loss 0.13757993 - time (sec): 44.33 - samples/sec: 1862.58 - lr: 0.000047 - momentum: 0.000000
+2023-10-17 10:56:47,747 epoch 2 - iter 1188/1984 - loss 0.13290183 - time (sec): 53.50 - samples/sec: 1856.82 - lr: 0.000047 - momentum: 0.000000
+2023-10-17 10:56:56,765 epoch 2 - iter 1386/1984 - loss 0.13087811 - time (sec): 62.52 - samples/sec: 1854.94 - lr: 0.000046 - momentum: 0.000000
+2023-10-17 10:57:05,839 epoch 2 - iter 1584/1984 - loss 0.13141431 - time (sec): 71.59 - samples/sec: 1839.27 - lr: 0.000046 - momentum: 0.000000
+2023-10-17 10:57:14,783 epoch 2 - iter 1782/1984 - loss 0.13018538 - time (sec): 80.54 - samples/sec: 1828.57 - lr: 0.000045 - momentum: 0.000000
+2023-10-17 10:57:23,835 epoch 2 - iter 1980/1984 - loss 0.12957544 - time (sec): 89.59 - samples/sec: 1826.70 - lr: 0.000044 - momentum: 0.000000
+2023-10-17 10:57:24,015 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:57:24,015 EPOCH 2 done: loss 0.1296 - lr: 0.000044
+2023-10-17 10:57:27,871 DEV : loss 0.10983909666538239 - f1-score (micro avg) 0.7266
+2023-10-17 10:57:27,892 saving best model
+2023-10-17 10:57:28,372 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:57:37,508 epoch 3 - iter 198/1984 - loss 0.09127543 - time (sec): 9.13 - samples/sec: 1814.20 - lr: 0.000044 - momentum: 0.000000
+2023-10-17 10:57:46,635 epoch 3 - iter 396/1984 - loss 0.09356135 - time (sec): 18.26 - samples/sec: 1799.61 - lr: 0.000043 - momentum: 0.000000
+2023-10-17 10:57:55,783 epoch 3 - iter 594/1984 - loss 0.09391553 - time (sec): 27.41 - samples/sec: 1810.02 - lr: 0.000043 - momentum: 0.000000
+2023-10-17 10:58:04,831 epoch 3 - iter 792/1984 - loss 0.09087245 - time (sec): 36.46 - samples/sec: 1821.93 - lr: 0.000042 - momentum: 0.000000
+2023-10-17 10:58:13,828 epoch 3 - iter 990/1984 - loss 0.09252505 - time (sec): 45.45 - samples/sec: 1819.54 - lr: 0.000042 - momentum: 0.000000
+2023-10-17 10:58:22,910 epoch 3 - iter 1188/1984 - loss 0.09198370 - time (sec): 54.54 - samples/sec: 1805.55 - lr: 0.000041 - momentum: 0.000000
+2023-10-17 10:58:31,977 epoch 3 - iter 1386/1984 - loss 0.09210720 - time (sec): 63.60 - samples/sec: 1804.93 - lr: 0.000041 - momentum: 0.000000
+2023-10-17 10:58:41,054 epoch 3 - iter 1584/1984 - loss 0.09115675 - time (sec): 72.68 - samples/sec: 1804.76 - lr: 0.000040 - momentum: 0.000000
+2023-10-17 10:58:50,159 epoch 3 - iter 1782/1984 - loss 0.09117253 - time (sec): 81.79 - samples/sec: 1808.36 - lr: 0.000039 - momentum: 0.000000
+2023-10-17 10:58:59,410 epoch 3 - iter 1980/1984 - loss 0.09250784 - time (sec): 91.04 - samples/sec: 1797.18 - lr: 0.000039 - momentum: 0.000000
+2023-10-17 10:58:59,604 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:58:59,604 EPOCH 3 done: loss 0.0927 - lr: 0.000039
+2023-10-17 10:59:03,046 DEV : loss 0.11815514415502548 - f1-score (micro avg) 0.7673
+2023-10-17 10:59:03,068 saving best model
+2023-10-17 10:59:03,547 ----------------------------------------------------------------------------------------------------
+2023-10-17 10:59:12,752 epoch 4 - iter 198/1984 - loss 0.06981980 - time (sec): 9.20 - samples/sec: 1783.67 - lr: 0.000038 - momentum: 0.000000
+2023-10-17 10:59:22,063 epoch 4 - iter 396/1984 - loss 0.07072403 - time (sec): 18.51 - samples/sec: 1844.43 - lr: 0.000038 - momentum: 0.000000
+2023-10-17 10:59:31,256 epoch 4 - iter 594/1984 - loss 0.07503217 - time (sec): 27.71 - samples/sec: 1838.25 - lr: 0.000037 - momentum: 0.000000
+2023-10-17 10:59:40,298 epoch 4 - iter 792/1984 - loss 0.07356156 - time (sec): 36.75 - samples/sec: 1835.12 - lr: 0.000037 - momentum: 0.000000
+2023-10-17 10:59:49,285 epoch 4 - iter 990/1984 - loss 0.07317536 - time (sec): 45.74 - samples/sec: 1829.88 - lr: 0.000036 - momentum: 0.000000
+2023-10-17 10:59:58,054 epoch 4 - iter 1188/1984 - loss 0.07519900 - time (sec): 54.51 - samples/sec: 1822.01 - lr: 0.000036 - momentum: 0.000000
+2023-10-17 11:00:06,621 epoch 4 - iter 1386/1984 - loss 0.07351265 - time (sec): 63.07 - samples/sec: 1822.06 - lr: 0.000035 - momentum: 0.000000
+2023-10-17 11:00:15,739 epoch 4 - iter 1584/1984 - loss 0.07234519 - time (sec): 72.19 - samples/sec: 1814.86 - lr: 0.000034 - momentum: 0.000000
+2023-10-17 11:00:24,737 epoch 4 - iter 1782/1984 - loss 0.07317391 - time (sec): 81.19 - samples/sec: 1819.38 - lr: 0.000034 - momentum: 0.000000
+2023-10-17 11:00:33,691 epoch 4 - iter 1980/1984 - loss 0.07222348 - time (sec): 90.14 - samples/sec: 1816.69 - lr: 0.000033 - momentum: 0.000000
+2023-10-17 11:00:33,875 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:00:33,875 EPOCH 4 done: loss 0.0722 - lr: 0.000033
+2023-10-17 11:00:37,278 DEV : loss 0.15266923606395721 - f1-score (micro avg) 0.7663
+2023-10-17 11:00:37,300 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:00:46,355 epoch 5 - iter 198/1984 - loss 0.05046205 - time (sec): 9.05 - samples/sec: 1795.87 - lr: 0.000033 - momentum: 0.000000
+2023-10-17 11:00:55,385 epoch 5 - iter 396/1984 - loss 0.05247464 - time (sec): 18.08 - samples/sec: 1852.85 - lr: 0.000032 - momentum: 0.000000
+2023-10-17 11:01:04,492 epoch 5 - iter 594/1984 - loss 0.05489266 - time (sec): 27.19 - samples/sec: 1846.51 - lr: 0.000032 - momentum: 0.000000
+2023-10-17 11:01:13,843 epoch 5 - iter 792/1984 - loss 0.05563933 - time (sec): 36.54 - samples/sec: 1853.29 - lr: 0.000031 - momentum: 0.000000
+2023-10-17 11:01:23,006 epoch 5 - iter 990/1984 - loss 0.05335869 - time (sec): 45.70 - samples/sec: 1841.01 - lr: 0.000031 - momentum: 0.000000
+2023-10-17 11:01:32,134 epoch 5 - iter 1188/1984 - loss 0.05269990 - time (sec): 54.83 - samples/sec: 1823.48 - lr: 0.000030 - momentum: 0.000000
+2023-10-17 11:01:41,271 epoch 5 - iter 1386/1984 - loss 0.05371472 - time (sec): 63.97 - samples/sec: 1819.47 - lr: 0.000029 - momentum: 0.000000
+2023-10-17 11:01:50,241 epoch 5 - iter 1584/1984 - loss 0.05415103 - time (sec): 72.94 - samples/sec: 1812.74 - lr: 0.000029 - momentum: 0.000000
+2023-10-17 11:01:59,308 epoch 5 - iter 1782/1984 - loss 0.05436597 - time (sec): 82.01 - samples/sec: 1807.24 - lr: 0.000028 - momentum: 0.000000
+2023-10-17 11:02:08,369 epoch 5 - iter 1980/1984 - loss 0.05389026 - time (sec): 91.07 - samples/sec: 1796.79 - lr: 0.000028 - momentum: 0.000000
+2023-10-17 11:02:08,550 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:02:08,551 EPOCH 5 done: loss 0.0539 - lr: 0.000028
+2023-10-17 11:02:11,912 DEV : loss 0.19919680058956146 - f1-score (micro avg) 0.7507
+2023-10-17 11:02:11,932 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:02:21,160 epoch 6 - iter 198/1984 - loss 0.04620617 - time (sec): 9.23 - samples/sec: 1782.45 - lr: 0.000027 - momentum: 0.000000
+2023-10-17 11:02:30,334 epoch 6 - iter 396/1984 - loss 0.04843778 - time (sec): 18.40 - samples/sec: 1774.97 - lr: 0.000027 - momentum: 0.000000
+2023-10-17 11:02:39,513 epoch 6 - iter 594/1984 - loss 0.04484302 - time (sec): 27.58 - samples/sec: 1822.04 - lr: 0.000026 - momentum: 0.000000
+2023-10-17 11:02:48,566 epoch 6 - iter 792/1984 - loss 0.04250884 - time (sec): 36.63 - samples/sec: 1818.99 - lr: 0.000026 - momentum: 0.000000
+2023-10-17 11:02:57,657 epoch 6 - iter 990/1984 - loss 0.04152118 - time (sec): 45.72 - samples/sec: 1824.08 - lr: 0.000025 - momentum: 0.000000
+2023-10-17 11:03:06,709 epoch 6 - iter 1188/1984 - loss 0.04219659 - time (sec): 54.78 - samples/sec: 1820.23 - lr: 0.000024 - momentum: 0.000000
+2023-10-17 11:03:15,840 epoch 6 - iter 1386/1984 - loss 0.04406527 - time (sec): 63.91 - samples/sec: 1804.54 - lr: 0.000024 - momentum: 0.000000
+2023-10-17 11:03:25,050 epoch 6 - iter 1584/1984 - loss 0.04345490 - time (sec): 73.12 - samples/sec: 1793.93 - lr: 0.000023 - momentum: 0.000000
+2023-10-17 11:03:34,090 epoch 6 - iter 1782/1984 - loss 0.04365630 - time (sec): 82.16 - samples/sec: 1792.95 - lr: 0.000023 - momentum: 0.000000
+2023-10-17 11:03:43,312 epoch 6 - iter 1980/1984 - loss 0.04303766 - time (sec): 91.38 - samples/sec: 1790.06 - lr: 0.000022 - momentum: 0.000000
+2023-10-17 11:03:43,494 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:03:43,494 EPOCH 6 done: loss 0.0429 - lr: 0.000022
+2023-10-17 11:03:46,982 DEV : loss 0.19526612758636475 - f1-score (micro avg) 0.759
+2023-10-17 11:03:47,005 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:03:56,272 epoch 7 - iter 198/1984 - loss 0.02150771 - time (sec): 9.27 - samples/sec: 1758.81 - lr: 0.000022 - momentum: 0.000000
+2023-10-17 11:04:05,509 epoch 7 - iter 396/1984 - loss 0.02251021 - time (sec): 18.50 - samples/sec: 1773.66 - lr: 0.000021 - momentum: 0.000000
+2023-10-17 11:04:14,821 epoch 7 - iter 594/1984 - loss 0.02431276 - time (sec): 27.82 - samples/sec: 1781.84 - lr: 0.000021 - momentum: 0.000000
+2023-10-17 11:04:23,989 epoch 7 - iter 792/1984 - loss 0.02510360 - time (sec): 36.98 - samples/sec: 1775.37 - lr: 0.000020 - momentum: 0.000000
+2023-10-17 11:04:33,324 epoch 7 - iter 990/1984 - loss 0.02564549 - time (sec): 46.32 - samples/sec: 1769.45 - lr: 0.000019 - momentum: 0.000000
+2023-10-17 11:04:42,517 epoch 7 - iter 1188/1984 - loss 0.02559000 - time (sec): 55.51 - samples/sec: 1773.24 - lr: 0.000019 - momentum: 0.000000
+2023-10-17 11:04:51,705 epoch 7 - iter 1386/1984 - loss 0.02670608 - time (sec): 64.70 - samples/sec: 1773.77 - lr: 0.000018 - momentum: 0.000000
+2023-10-17 11:05:00,894 epoch 7 - iter 1584/1984 - loss 0.02761748 - time (sec): 73.89 - samples/sec: 1763.66 - lr: 0.000018 - momentum: 0.000000
+2023-10-17 11:05:10,013 epoch 7 - iter 1782/1984 - loss 0.02914139 - time (sec): 83.01 - samples/sec: 1776.01 - lr: 0.000017 - momentum: 0.000000
+2023-10-17 11:05:19,040 epoch 7 - iter 1980/1984 - loss 0.02930128 - time (sec): 92.03 - samples/sec: 1778.63 - lr: 0.000017 - momentum: 0.000000
+2023-10-17 11:05:19,223 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:05:19,223 EPOCH 7 done: loss 0.0292 - lr: 0.000017
+2023-10-17 11:05:23,023 DEV : loss 0.2106274515390396 - f1-score (micro avg) 0.7652
+2023-10-17 11:05:23,044 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:05:32,176 epoch 8 - iter 198/1984 - loss 0.01532135 - time (sec): 9.13 - samples/sec: 1792.67 - lr: 0.000016 - momentum: 0.000000
+2023-10-17 11:05:41,290 epoch 8 - iter 396/1984 - loss 0.01844700 - time (sec): 18.24 - samples/sec: 1775.71 - lr: 0.000016 - momentum: 0.000000
+2023-10-17 11:05:50,631 epoch 8 - iter 594/1984 - loss 0.01680914 - time (sec): 27.59 - samples/sec: 1813.29 - lr: 0.000015 - momentum: 0.000000
+2023-10-17 11:05:59,572 epoch 8 - iter 792/1984 - loss 0.01532722 - time (sec): 36.53 - samples/sec: 1809.50 - lr: 0.000014 - momentum: 0.000000
+2023-10-17 11:06:08,828 epoch 8 - iter 990/1984 - loss 0.01651685 - time (sec): 45.78 - samples/sec: 1817.17 - lr: 0.000014 - momentum: 0.000000
+2023-10-17 11:06:17,849 epoch 8 - iter 1188/1984 - loss 0.01693879 - time (sec): 54.80 - samples/sec: 1823.06 - lr: 0.000013 - momentum: 0.000000
+2023-10-17 11:06:26,896 epoch 8 - iter 1386/1984 - loss 0.01791079 - time (sec): 63.85 - samples/sec: 1810.51 - lr: 0.000013 - momentum: 0.000000
+2023-10-17 11:06:35,832 epoch 8 - iter 1584/1984 - loss 0.01862913 - time (sec): 72.79 - samples/sec: 1794.24 - lr: 0.000012 - momentum: 0.000000
+2023-10-17 11:06:44,860 epoch 8 - iter 1782/1984 - loss 0.01920153 - time (sec): 81.81 - samples/sec: 1799.31 - lr: 0.000012 - momentum: 0.000000
+2023-10-17 11:06:53,979 epoch 8 - iter 1980/1984 - loss 0.01890437 - time (sec): 90.93 - samples/sec: 1799.51 - lr: 0.000011 - momentum: 0.000000
+2023-10-17 11:06:54,163 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:06:54,163 EPOCH 8 done: loss 0.0189 - lr: 0.000011
+2023-10-17 11:06:57,535 DEV : loss 0.25147587060928345 - f1-score (micro avg) 0.7642
+2023-10-17 11:06:57,556 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:07:06,501 epoch 9 - iter 198/1984 - loss 0.01800551 - time (sec): 8.94 - samples/sec: 1771.15 - lr: 0.000011 - momentum: 0.000000
+2023-10-17 11:07:15,377 epoch 9 - iter 396/1984 - loss 0.01378590 - time (sec): 17.82 - samples/sec: 1863.42 - lr: 0.000010 - momentum: 0.000000
+2023-10-17 11:07:24,608 epoch 9 - iter 594/1984 - loss 0.01314436 - time (sec): 27.05 - samples/sec: 1863.71 - lr: 0.000009 - momentum: 0.000000
+2023-10-17 11:07:33,694 epoch 9 - iter 792/1984 - loss 0.01248684 - time (sec): 36.14 - samples/sec: 1845.31 - lr: 0.000009 - momentum: 0.000000
+2023-10-17 11:07:42,786 epoch 9 - iter 990/1984 - loss 0.01186486 - time (sec): 45.23 - samples/sec: 1825.45 - lr: 0.000008 - momentum: 0.000000
+2023-10-17 11:07:51,933 epoch 9 - iter 1188/1984 - loss 0.01172450 - time (sec): 54.38 - samples/sec: 1815.57 - lr: 0.000008 - momentum: 0.000000
+2023-10-17 11:08:01,131 epoch 9 - iter 1386/1984 - loss 0.01171225 - time (sec): 63.57 - samples/sec: 1805.34 - lr: 0.000007 - momentum: 0.000000
+2023-10-17 11:08:10,295 epoch 9 - iter 1584/1984 - loss 0.01234481 - time (sec): 72.74 - samples/sec: 1808.45 - lr: 0.000007 - momentum: 0.000000
+2023-10-17 11:08:19,267 epoch 9 - iter 1782/1984 - loss 0.01340571 - time (sec): 81.71 - samples/sec: 1807.81 - lr: 0.000006 - momentum: 0.000000
+2023-10-17 11:08:28,340 epoch 9 - iter 1980/1984 - loss 0.01367155 - time (sec): 90.78 - samples/sec: 1803.06 - lr: 0.000006 - momentum: 0.000000
+2023-10-17 11:08:28,512 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:08:28,512 EPOCH 9 done: loss 0.0136 - lr: 0.000006
+2023-10-17 11:08:31,940 DEV : loss 0.23887786269187927 - f1-score (micro avg) 0.7643
+2023-10-17 11:08:31,962 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:08:40,850 epoch 10 - iter 198/1984 - loss 0.00470077 - time (sec): 8.89 - samples/sec: 1858.08 - lr: 0.000005 - momentum: 0.000000
+2023-10-17 11:08:49,392 epoch 10 - iter 396/1984 - loss 0.00422202 - time (sec): 17.43 - samples/sec: 1844.72 - lr: 0.000004 - momentum: 0.000000
+2023-10-17 11:08:58,007 epoch 10 - iter 594/1984 - loss 0.00543694 - time (sec): 26.04 - samples/sec: 1866.92 - lr: 0.000004 - momentum: 0.000000
+2023-10-17 11:09:06,736 epoch 10 - iter 792/1984 - loss 0.00646607 - time (sec): 34.77 - samples/sec: 1863.80 - lr: 0.000003 - momentum: 0.000000
+2023-10-17 11:09:15,340 epoch 10 - iter 990/1984 - loss 0.00650601 - time (sec): 43.38 - samples/sec: 1857.52 - lr: 0.000003 - momentum: 0.000000
+2023-10-17 11:09:23,989 epoch 10 - iter 1188/1984 - loss 0.00670870 - time (sec): 52.03 - samples/sec: 1873.09 - lr: 0.000002 - momentum: 0.000000
+2023-10-17 11:09:32,785 epoch 10 - iter 1386/1984 - loss 0.00698514 - time (sec): 60.82 - samples/sec: 1884.69 - lr: 0.000002 - momentum: 0.000000
+2023-10-17 11:09:41,466 epoch 10 - iter 1584/1984 - loss 0.00711906 - time (sec): 69.50 - samples/sec: 1886.06 - lr: 0.000001 - momentum: 0.000000
+2023-10-17 11:09:50,532 epoch 10 - iter 1782/1984 - loss 0.00702130 - time (sec): 78.57 - samples/sec: 1870.40 - lr: 0.000001 - momentum: 0.000000
+2023-10-17 11:09:59,828 epoch 10 - iter 1980/1984 - loss 0.00764656 - time (sec): 87.86 - samples/sec: 1863.19 - lr: 0.000000 - momentum: 0.000000
+2023-10-17 11:10:00,016 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:10:00,016 EPOCH 10 done: loss 0.0076 - lr: 0.000000
+2023-10-17 11:10:03,438 DEV : loss 0.2601605951786041 - f1-score (micro avg) 0.7684
+2023-10-17 11:10:03,459 saving best model
+2023-10-17 11:10:04,382 ----------------------------------------------------------------------------------------------------
+2023-10-17 11:10:04,383 Loading model from best epoch ...
+2023-10-17 11:10:05,772 SequenceTagger predicts: Dictionary with 13 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG
+2023-10-17 11:10:08,979
+Results:
+- F-score (micro) 0.7792
+- F-score (macro) 0.6974
+- Accuracy 0.6658
+
+By class:
+              precision    recall  f1-score   support
+
+         LOC     0.8303    0.8443    0.8372       655
+         PER     0.7028    0.7848    0.7415       223
+         ORG     0.6000    0.4488    0.5135       127
+
+   micro avg     0.7772    0.7811    0.7792      1005
+   macro avg     0.7110    0.6926    0.6974      1005
+weighted avg     0.7729    0.7811    0.7751      1005
+
+2023-10-17 11:10:08,979 ----------------------------------------------------------------------------------------------------
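The log above pins down the training setup: the French split of NER_ICDAR_EUROPEANA, a fine-tuned transformer word embedding (first-subtoken pooling, last layer only, no CRF, per the base path), learning rate 5e-05, mini-batch size 4, 10 epochs, and a linear schedule with 0.1 warmup. The following is a minimal Flair sketch consistent with those settings; the backbone checkpoint name is inferred from the logged base path and should be treated as an assumption, and the TensorBoard plugin is omitted.

```python
# Sketch of a Flair fine-tuning run matching the hyperparameters in training.log.
# The embedding checkpoint is inferred from the logged base path; treat it as an assumption.
from flair.datasets import NER_ICDAR_EUROPEANA
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

corpus = NER_ICDAR_EUROPEANA(language="fr")  # 7936 train / 992 dev / 992 test sentences
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings(
    model="hmteams/teams-base-historic-multilingual-discriminator",  # assumed backbone
    layers="-1",               # last layer only ("layers-1" in the base path)
    subtoken_pooling="first",  # "poolingfirst"
    fine_tune=True,
)

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,   # "crfFalse"
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "resources/taggers/ner-icdar-fr",  # hypothetical output path
    learning_rate=5e-05,
    mini_batch_size=4,
    max_epochs=10,
)
```

`fine_tune` defaults to a linear learning-rate schedule with warmup, consistent with the `LinearScheduler | warmup_fraction: '0.1'` plugin logged above. In this run the best dev epoch (epoch 10) was saved to best-model.pt and then evaluated on the test split, giving the micro F1 of 0.7792 reported in the final results.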