haryoaw committed on
Commit
31cec78
1 Parent(s): b4dbe69

Initial Commit

Files changed (5)
  1. README.md +169 -0
  2. config.json +46 -0
  3. eval_result_ner.json +1 -0
  4. model.safetensors +3 -0
  5. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,169 @@
+ ---
+ library_name: transformers
+ license: mit
+ base_model: FacebookAI/xlm-roberta-base
+ tags:
+ - generated_from_trainer
+ metrics:
+ - precision
+ - recall
+ - f1
+ - accuracy
+ model-index:
+ - name: scenario-non-kd-scr-ner-full-xlmr_data-univner_full66
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # scenario-non-kd-scr-ner-full-xlmr_data-univner_full66
+
+ This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the univner_full dataset (inferred from the model name).
+ It achieves the following results on the evaluation set:
+ - Loss: 0.3820
+ - Precision: 0.5809
+ - Recall: 0.5858
+ - F1: 0.5833
+ - Accuracy: 0.9601
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
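A hedged inference sketch for this checkpoint. The repo id below is inferred from the committer's username and the model-index name, and predictions come back as the generic tags LABEL_0–LABEL_6 because config.json was saved without human-readable label names:

```python
from transformers import pipeline

# Token-classification pipeline over the (assumed) hub id of this repo.
ner = pipeline(
    "token-classification",
    model="haryoaw/scenario-non-kd-scr-ner-full-xlmr_data-univner_full66",
)

# Each predicted token carries one of the generic tags LABEL_0 .. LABEL_6.
print(ner("Angela Merkel visited Jakarta in 2015."))
```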
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
+ - learning_rate: 3e-05
+ - train_batch_size: 32
+ - eval_batch_size: 32
+ - seed: 66
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 30
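The same settings expressed as `transformers.TrainingArguments`; `output_dir` is an assumption, everything else mirrors the list above:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above.
args = TrainingArguments(
    output_dir="scenario-non-kd-scr-ner-full-xlmr_data-univner_full66",  # assumption
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=66,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```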
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
+ |:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
+ | 0.34 | 0.2910 | 500 | 0.2769 | 0.3871 | 0.1554 | 0.2218 | 0.9300 |
+ | 0.262 | 0.5821 | 1000 | 0.2652 | 0.4057 | 0.1607 | 0.2302 | 0.9331 |
+ | 0.2326 | 0.8731 | 1500 | 0.2329 | 0.3531 | 0.2306 | 0.2790 | 0.9365 |
+ | 0.2066 | 1.1641 | 2000 | 0.2247 | 0.3635 | 0.2685 | 0.3089 | 0.9390 |
+ | 0.184 | 1.4552 | 2500 | 0.2202 | 0.3699 | 0.3363 | 0.3523 | 0.9406 |
+ | 0.1791 | 1.7462 | 3000 | 0.1999 | 0.4042 | 0.3679 | 0.3852 | 0.9427 |
+ | 0.1649 | 2.0373 | 3500 | 0.2060 | 0.3968 | 0.3718 | 0.3839 | 0.9433 |
+ | 0.1415 | 2.3283 | 4000 | 0.2008 | 0.4114 | 0.3821 | 0.3962 | 0.9437 |
+ | 0.1371 | 2.6193 | 4500 | 0.1949 | 0.4085 | 0.4199 | 0.4141 | 0.9456 |
+ | 0.1305 | 2.9104 | 5000 | 0.1824 | 0.4429 | 0.4344 | 0.4386 | 0.9480 |
+ | 0.1052 | 3.2014 | 5500 | 0.1943 | 0.4586 | 0.4768 | 0.4676 | 0.9493 |
+ | 0.0953 | 3.4924 | 6000 | 0.1851 | 0.4925 | 0.4718 | 0.4819 | 0.9518 |
+ | 0.0886 | 3.7835 | 6500 | 0.1959 | 0.5058 | 0.4686 | 0.4865 | 0.9527 |
+ | 0.0831 | 4.0745 | 7000 | 0.1918 | 0.5094 | 0.4719 | 0.4900 | 0.9526 |
+ | 0.0635 | 4.3655 | 7500 | 0.1910 | 0.5111 | 0.4933 | 0.5020 | 0.9542 |
+ | 0.064 | 4.6566 | 8000 | 0.1974 | 0.4897 | 0.5436 | 0.5153 | 0.9537 |
+ | 0.0631 | 4.9476 | 8500 | 0.1992 | 0.5019 | 0.5278 | 0.5145 | 0.9546 |
+ | 0.0481 | 5.2386 | 9000 | 0.2012 | 0.5285 | 0.5311 | 0.5298 | 0.9562 |
+ | 0.0456 | 5.5297 | 9500 | 0.1995 | 0.5149 | 0.5315 | 0.5231 | 0.9552 |
+ | 0.0438 | 5.8207 | 10000 | 0.1989 | 0.5366 | 0.5302 | 0.5334 | 0.9558 |
+ | 0.0406 | 6.1118 | 10500 | 0.2214 | 0.5217 | 0.5162 | 0.5190 | 0.9553 |
+ | 0.034 | 6.4028 | 11000 | 0.2221 | 0.5344 | 0.5373 | 0.5358 | 0.9567 |
+ | 0.0308 | 6.6938 | 11500 | 0.2225 | 0.5279 | 0.5483 | 0.5379 | 0.9568 |
+ | 0.0336 | 6.9849 | 12000 | 0.2288 | 0.5269 | 0.5321 | 0.5295 | 0.9563 |
+ | 0.0225 | 7.2759 | 12500 | 0.2539 | 0.5378 | 0.5220 | 0.5298 | 0.9564 |
+ | 0.0234 | 7.5669 | 13000 | 0.2340 | 0.5374 | 0.5465 | 0.5419 | 0.9571 |
+ | 0.0246 | 7.8580 | 13500 | 0.2355 | 0.5212 | 0.5757 | 0.5471 | 0.9560 |
+ | 0.0206 | 8.1490 | 14000 | 0.2446 | 0.5437 | 0.5638 | 0.5536 | 0.9571 |
+ | 0.0171 | 8.4400 | 14500 | 0.2516 | 0.5265 | 0.5752 | 0.5498 | 0.9563 |
+ | 0.018 | 8.7311 | 15000 | 0.2469 | 0.5417 | 0.5487 | 0.5452 | 0.9569 |
+ | 0.018 | 9.0221 | 15500 | 0.2513 | 0.5410 | 0.5575 | 0.5491 | 0.9575 |
+ | 0.0136 | 9.3132 | 16000 | 0.2708 | 0.5230 | 0.5763 | 0.5484 | 0.9571 |
+ | 0.0136 | 9.6042 | 16500 | 0.2635 | 0.5683 | 0.5460 | 0.5569 | 0.9584 |
+ | 0.0137 | 9.8952 | 17000 | 0.2587 | 0.5343 | 0.5812 | 0.5567 | 0.9569 |
+ | 0.0115 | 10.1863 | 17500 | 0.2885 | 0.5519 | 0.5653 | 0.5585 | 0.9587 |
+ | 0.0106 | 10.4773 | 18000 | 0.2769 | 0.5681 | 0.5517 | 0.5598 | 0.9580 |
+ | 0.0109 | 10.7683 | 18500 | 0.2836 | 0.5461 | 0.5654 | 0.5556 | 0.9578 |
+ | 0.0106 | 11.0594 | 19000 | 0.2841 | 0.5425 | 0.5774 | 0.5594 | 0.9575 |
+ | 0.0082 | 11.3504 | 19500 | 0.2925 | 0.5424 | 0.5582 | 0.5502 | 0.9573 |
+ | 0.0087 | 11.6414 | 20000 | 0.2806 | 0.5493 | 0.5763 | 0.5625 | 0.9580 |
+ | 0.0086 | 11.9325 | 20500 | 0.2804 | 0.5515 | 0.5651 | 0.5582 | 0.9580 |
+ | 0.0071 | 12.2235 | 21000 | 0.3015 | 0.5552 | 0.5705 | 0.5627 | 0.9584 |
+ | 0.007 | 12.5146 | 21500 | 0.2978 | 0.5362 | 0.5819 | 0.5581 | 0.9577 |
+ | 0.0068 | 12.8056 | 22000 | 0.3067 | 0.5625 | 0.5576 | 0.5601 | 0.9580 |
+ | 0.0062 | 13.0966 | 22500 | 0.3102 | 0.5730 | 0.5599 | 0.5664 | 0.9589 |
+ | 0.0055 | 13.3877 | 23000 | 0.3033 | 0.5454 | 0.5921 | 0.5678 | 0.9586 |
+ | 0.0054 | 13.6787 | 23500 | 0.3034 | 0.5593 | 0.5576 | 0.5584 | 0.9583 |
+ | 0.0056 | 13.9697 | 24000 | 0.3187 | 0.5604 | 0.5612 | 0.5608 | 0.9583 |
+ | 0.0042 | 14.2608 | 24500 | 0.3102 | 0.5450 | 0.5835 | 0.5636 | 0.9585 |
+ | 0.0046 | 14.5518 | 25000 | 0.3209 | 0.5698 | 0.5711 | 0.5704 | 0.9591 |
+ | 0.0048 | 14.8428 | 25500 | 0.3407 | 0.5631 | 0.5397 | 0.5512 | 0.9585 |
+ | 0.004 | 15.1339 | 26000 | 0.3197 | 0.5740 | 0.5594 | 0.5666 | 0.9592 |
+ | 0.0039 | 15.4249 | 26500 | 0.3220 | 0.5440 | 0.5990 | 0.5702 | 0.9587 |
+ | 0.0038 | 15.7159 | 27000 | 0.3225 | 0.5630 | 0.5631 | 0.5631 | 0.9587 |
+ | 0.0039 | 16.0070 | 27500 | 0.3288 | 0.5788 | 0.5562 | 0.5673 | 0.9593 |
+ | 0.0028 | 16.2980 | 28000 | 0.3311 | 0.5698 | 0.5692 | 0.5695 | 0.9595 |
+ | 0.0034 | 16.5891 | 28500 | 0.3389 | 0.5937 | 0.5438 | 0.5677 | 0.9594 |
+ | 0.0032 | 16.8801 | 29000 | 0.3249 | 0.5422 | 0.6142 | 0.5760 | 0.9589 |
+ | 0.0027 | 17.1711 | 29500 | 0.3392 | 0.5613 | 0.5711 | 0.5662 | 0.9589 |
+ | 0.0029 | 17.4622 | 30000 | 0.3180 | 0.5611 | 0.5900 | 0.5751 | 0.9585 |
+ | 0.0025 | 17.7532 | 30500 | 0.3388 | 0.5599 | 0.5706 | 0.5652 | 0.9586 |
+ | 0.0027 | 18.0442 | 31000 | 0.3437 | 0.5580 | 0.5861 | 0.5717 | 0.9593 |
+ | 0.002 | 18.3353 | 31500 | 0.3504 | 0.5648 | 0.5702 | 0.5675 | 0.9592 |
+ | 0.0022 | 18.6263 | 32000 | 0.3490 | 0.5535 | 0.5825 | 0.5676 | 0.9592 |
+ | 0.0019 | 18.9173 | 32500 | 0.3495 | 0.5703 | 0.5605 | 0.5654 | 0.9590 |
+ | 0.0018 | 19.2084 | 33000 | 0.3542 | 0.5714 | 0.5715 | 0.5714 | 0.9597 |
+ | 0.0021 | 19.4994 | 33500 | 0.3521 | 0.5493 | 0.5923 | 0.5700 | 0.9590 |
+ | 0.0016 | 19.7905 | 34000 | 0.3561 | 0.5556 | 0.5718 | 0.5636 | 0.9588 |
+ | 0.0017 | 20.0815 | 34500 | 0.3437 | 0.5460 | 0.6053 | 0.5741 | 0.9579 |
+ | 0.0015 | 20.3725 | 35000 | 0.3630 | 0.5679 | 0.5813 | 0.5745 | 0.9597 |
+ | 0.0017 | 20.6636 | 35500 | 0.3583 | 0.5742 | 0.5767 | 0.5754 | 0.9592 |
+ | 0.0015 | 20.9546 | 36000 | 0.3575 | 0.5774 | 0.5683 | 0.5728 | 0.9596 |
+ | 0.0013 | 21.2456 | 36500 | 0.3620 | 0.5588 | 0.5680 | 0.5634 | 0.9590 |
+ | 0.0012 | 21.5367 | 37000 | 0.3587 | 0.5657 | 0.5845 | 0.5749 | 0.9592 |
+ | 0.0012 | 21.8277 | 37500 | 0.3605 | 0.5531 | 0.5953 | 0.5734 | 0.9588 |
+ | 0.001 | 22.1187 | 38000 | 0.3645 | 0.5801 | 0.5682 | 0.5741 | 0.9595 |
+ | 0.0011 | 22.4098 | 38500 | 0.3617 | 0.5528 | 0.6022 | 0.5765 | 0.9590 |
+ | 0.0011 | 22.7008 | 39000 | 0.3671 | 0.5752 | 0.5826 | 0.5789 | 0.9595 |
+ | 0.0013 | 22.9919 | 39500 | 0.3622 | 0.5577 | 0.5859 | 0.5714 | 0.9590 |
+ | 0.0008 | 23.2829 | 40000 | 0.3748 | 0.5666 | 0.5739 | 0.5702 | 0.9592 |
+ | 0.001 | 23.5739 | 40500 | 0.3616 | 0.5653 | 0.5874 | 0.5761 | 0.9593 |
+ | 0.0009 | 23.8650 | 41000 | 0.3691 | 0.5644 | 0.5901 | 0.5770 | 0.9593 |
+ | 0.0008 | 24.1560 | 41500 | 0.3630 | 0.5673 | 0.5892 | 0.5781 | 0.9594 |
+ | 0.0007 | 24.4470 | 42000 | 0.3613 | 0.5645 | 0.5889 | 0.5765 | 0.9592 |
+ | 0.0007 | 24.7381 | 42500 | 0.3723 | 0.5636 | 0.5788 | 0.5711 | 0.9595 |
+ | 0.0007 | 25.0291 | 43000 | 0.3728 | 0.5786 | 0.5725 | 0.5755 | 0.9594 |
+ | 0.0005 | 25.3201 | 43500 | 0.3715 | 0.5707 | 0.5764 | 0.5735 | 0.9594 |
+ | 0.0007 | 25.6112 | 44000 | 0.3723 | 0.5657 | 0.5728 | 0.5692 | 0.9591 |
+ | 0.0007 | 25.9022 | 44500 | 0.3739 | 0.5980 | 0.5605 | 0.5786 | 0.9599 |
+ | 0.0005 | 26.1932 | 45000 | 0.3736 | 0.5862 | 0.5748 | 0.5805 | 0.9595 |
+ | 0.0005 | 26.4843 | 45500 | 0.3863 | 0.5784 | 0.5806 | 0.5795 | 0.9597 |
+ | 0.0005 | 26.7753 | 46000 | 0.3847 | 0.5956 | 0.5625 | 0.5786 | 0.9600 |
+ | 0.0006 | 27.0664 | 46500 | 0.3769 | 0.5831 | 0.5836 | 0.5834 | 0.9599 |
+ | 0.0005 | 27.3574 | 47000 | 0.3823 | 0.5931 | 0.5617 | 0.5770 | 0.9598 |
+ | 0.0004 | 27.6484 | 47500 | 0.3755 | 0.5733 | 0.5905 | 0.5818 | 0.9596 |
+ | 0.0004 | 27.9395 | 48000 | 0.3819 | 0.5950 | 0.5696 | 0.5820 | 0.9599 |
+ | 0.0003 | 28.2305 | 48500 | 0.3820 | 0.5851 | 0.5800 | 0.5825 | 0.9601 |
+ | 0.0003 | 28.5215 | 49000 | 0.3846 | 0.5907 | 0.5748 | 0.5826 | 0.9600 |
+ | 0.0004 | 28.8126 | 49500 | 0.3833 | 0.5840 | 0.5825 | 0.5832 | 0.9601 |
+ | 0.0003 | 29.1036 | 50000 | 0.3827 | 0.5786 | 0.5862 | 0.5824 | 0.9600 |
+ | 0.0003 | 29.3946 | 50500 | 0.3808 | 0.5793 | 0.5871 | 0.5832 | 0.9600 |
+ | 0.0003 | 29.6857 | 51000 | 0.3814 | 0.5805 | 0.5868 | 0.5836 | 0.9601 |
+ | 0.0004 | 29.9767 | 51500 | 0.3820 | 0.5809 | 0.5858 | 0.5833 | 0.9601 |
+
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.1.1+cu121
+ - Datasets 2.14.5
+ - Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,46 @@
+ {
+   "_name_or_path": "FacebookAI/xlm-roberta-base",
+   "architectures": [
+     "XLMRobertaForTokenClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "eos_token_id": 2,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0",
+     "1": "LABEL_1",
+     "2": "LABEL_2",
+     "3": "LABEL_3",
+     "4": "LABEL_4",
+     "5": "LABEL_5",
+     "6": "LABEL_6"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0,
+     "LABEL_1": 1,
+     "LABEL_2": 2,
+     "LABEL_3": 3,
+     "LABEL_4": 4,
+     "LABEL_5": 5,
+     "LABEL_6": 6
+   },
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 514,
+   "model_type": "xlm-roberta",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 6,
+   "output_past": true,
+   "pad_token_id": 1,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 1,
+   "use_cache": true,
+   "vocab_size": 250002
+ }
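Two details in this config are worth flagging: `num_hidden_layers` is 6, half the depth of stock xlm-roberta-base, and `id2label` holds 7 generic tags, which in a BIO scheme would cover O plus B-/I- for three entity types (an inference, not stated in the repo). A hedged sketch that rebuilds the architecture from this file alone, with a parameter-count cross-check against the safetensors file below:

```python
from transformers import XLMRobertaConfig, XLMRobertaForTokenClassification

# Rebuild the (randomly initialised) architecture from config.json;
# the local file path is an assumption.
config = XLMRobertaConfig.from_json_file("config.json")
model = XLMRobertaForTokenClassification(config)

n_params = sum(p.numel() for p in model.parameters())
# ~235M parameters at 6 layers, consistent with the 939,737,140-byte
# float32 model.safetensors in this commit (4 bytes per parameter).
print(f"{n_params / 1e6:.0f}M parameters")
```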
eval_result_ner.json ADDED
@@ -0,0 +1 @@
+ {"ceb_gja": {"precision": 0.3146067415730337, "recall": 0.5714285714285714, "f1": 0.4057971014492754, "accuracy": 0.9359073359073359}, "en_pud": {"precision": 0.4729011689691817, "recall": 0.413953488372093, "f1": 0.4414682539682539, "accuracy": 0.9487155270117114}, "de_pud": {"precision": 0.12669126691266913, "recall": 0.29740134744947067, "f1": 0.17768832662449685, "accuracy": 0.8449205381838638}, "pt_pud": {"precision": 0.5472547254725473, "recall": 0.5532302092811647, "f1": 0.5502262443438914, "accuracy": 0.960097406758662}, "ru_pud": {"precision": 0.018953323903818955, "recall": 0.06467181467181467, "f1": 0.029315248304528547, "accuracy": 0.6805476621028158}, "sv_pud": {"precision": 0.5162241887905604, "recall": 0.3401360544217687, "f1": 0.41007615700058575, "accuracy": 0.9478402180750681}, "tl_trg": {"precision": 0.21568627450980393, "recall": 0.4782608695652174, "f1": 0.2972972972972973, "accuracy": 0.9400544959128065}, "tl_ugnayan": {"precision": 0.0, "recall": 0.0, "f1": 0.0, "accuracy": 0.8997265268915223}, "zh_gsd": {"precision": 0.49029754204398446, "recall": 0.4941329856584094, "f1": 0.4922077922077922, "accuracy": 0.9301531801531802}, "zh_gsdsimp": {"precision": 0.45966709346991036, "recall": 0.4705111402359109, "f1": 0.4650259067357513, "accuracy": 0.9294039294039294}, "hr_set": {"precision": 0.724113475177305, "recall": 0.7277263007840342, "f1": 0.7259153928190543, "accuracy": 0.9687963726298434}, "da_ddt": {"precision": 0.6175771971496437, "recall": 0.5816554809843401, "f1": 0.5990783410138248, "accuracy": 0.970767235358675}, "en_ewt": {"precision": 0.6004296455424275, "recall": 0.5137867647058824, "f1": 0.5537394749876177, "accuracy": 0.9587998565565605}, "pt_bosque": {"precision": 0.5860979462875198, "recall": 0.6106995884773663, "f1": 0.5981459089076986, "accuracy": 0.9654760179684104}, "sr_set": {"precision": 0.7671394799054374, "recall": 0.7662337662337663, "f1": 0.7666863555818075, "accuracy": 0.9664652832501532}, "sk_snk": {"precision": 0.397341211225997, "recall": 0.2939890710382514, "f1": 0.3379396984924623, "accuracy": 0.9176350502512562}, "sv_talbanken": {"precision": 0.7065868263473054, "recall": 0.6020408163265306, "f1": 0.650137741046832, "accuracy": 0.9940619325710359}}
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a2a4de204f547d99ffc718d2efbd438cd9185bd268da7ec2fcd13ca9604c8c6a
+ size 939737140
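model.safetensors is stored as a Git LFS pointer: the repo tracks only the sha256 oid and byte size, while the ~940 MB weight file lives in LFS storage. A hedged sketch for verifying a downloaded copy against the pointer:

```python
import hashlib
from pathlib import Path

# Stream-hash the downloaded weights and compare with the LFS pointer above.
path = Path("model.safetensors")  # assumed local download location
sha = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha.update(chunk)

assert path.stat().st_size == 939737140
assert sha.hexdigest() == "a2a4de204f547d99ffc718d2efbd438cd9185bd268da7ec2fcd13ca9604c8c6a"
```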
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5280869f6fb66192c2598bdbe1a99e44c217e52f661a4f0d003c48a6fbae37cf
+ size 5304
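training_args.bin is likewise an LFS pointer; the underlying 5.3 KB file is a pickled `TrainingArguments` object, so it can be restored for inspection. A hedged sketch (the file is a pickle, so `weights_only=False` is required on recent torch, and it should only be loaded from a trusted source with transformers installed):

```python
import torch

# Inspect the serialized TrainingArguments (pickled object, trusted source assumed).
args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.seed, args.num_train_epochs)
```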