haryoaw committed on
Commit
c6fdd64
1 Parent(s): c54e332

Initial Commit

Files changed (5)
  1. README.md +169 -0
  2. config.json +46 -0
  3. eval_result_ner.json +1 -0
  4. model.safetensors +3 -0
  5. training_args.bin +3 -0
README.md ADDED
@@ -0,0 +1,169 @@
---
library_name: transformers
license: mit
base_model: FacebookAI/xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: scenario-non-kd-scr-ner-full-xlmr_data-univner_full44
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# scenario-non-kd-scr-ner-full-xlmr_data-univner_full44

This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the univner_full dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3854
- Precision: 0.5785
- Recall: 0.5835
- F1: 0.5810
- Accuracy: 0.9605

## Model description

More information needed

## Intended uses & limitations

More information needed

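Pending a fuller card, here is a minimal inference sketch. It assumes the checkpoint is published as `haryoaw/scenario-non-kd-scr-ner-full-xlmr_data-univner_full44` (repository id inferred from the uploader and model name on this page), and that you will map the generic `LABEL_*` outputs to your own tag set:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Repository id inferred from the uploader and model name on this page.
model_id = "haryoaw/scenario-non-kd-scr-ner-full-xlmr_data-univner_full44"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# "simple" aggregation merges subword pieces back into whole-entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Entities come back tagged LABEL_0..LABEL_6 because the config does not
# name its classes; map them to the real tag set before downstream use.
print(ner("Barack Obama visited Jakarta in 2010."))
```
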
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 44
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

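For reproduction, these settings translate to `transformers.TrainingArguments` roughly as below. This is a sketch: the 500-step evaluation cadence is inferred from the step column of the results table that follows, the `output_dir` name is a placeholder, and dataset preparation plus the `Trainer` call are omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-non-kd-scr-ner-full-xlmr_data-univner_full44",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=44,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    eval_strategy="steps",  # named evaluation_strategy before transformers 4.41
    eval_steps=500,         # inferred from the results table, not stated in the card
    logging_steps=500,
)
```
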
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.338 | 0.2910 | 500 | 0.2757 | 0.375 | 0.1450 | 0.2091 | 0.9304 |
| 0.2636 | 0.5821 | 1000 | 0.2477 | 0.2864 | 0.2401 | 0.2612 | 0.9323 |
| 0.2294 | 0.8731 | 1500 | 0.2256 | 0.3407 | 0.2946 | 0.3160 | 0.9369 |
| 0.2017 | 1.1641 | 2000 | 0.2149 | 0.3468 | 0.3505 | 0.3486 | 0.9379 |
| 0.1872 | 1.4552 | 2500 | 0.2112 | 0.3721 | 0.3305 | 0.3501 | 0.9387 |
| 0.1753 | 1.7462 | 3000 | 0.2087 | 0.3904 | 0.3083 | 0.3445 | 0.9423 |
| 0.1623 | 2.0373 | 3500 | 0.1986 | 0.3818 | 0.3929 | 0.3873 | 0.9417 |
| 0.1324 | 2.3283 | 4000 | 0.1986 | 0.4314 | 0.4066 | 0.4186 | 0.9458 |
| 0.1307 | 2.6193 | 4500 | 0.1901 | 0.4309 | 0.4376 | 0.4342 | 0.9474 |
| 0.1199 | 2.9104 | 5000 | 0.1789 | 0.4454 | 0.4336 | 0.4394 | 0.9485 |
| 0.0995 | 3.2014 | 5500 | 0.1831 | 0.4761 | 0.4709 | 0.4735 | 0.9514 |
| 0.0895 | 3.4924 | 6000 | 0.1899 | 0.4646 | 0.5120 | 0.4872 | 0.9515 |
| 0.0827 | 3.7835 | 6500 | 0.1757 | 0.5053 | 0.5158 | 0.5105 | 0.9535 |
| 0.0761 | 4.0745 | 7000 | 0.2069 | 0.5092 | 0.4826 | 0.4956 | 0.9539 |
| 0.0616 | 4.3655 | 7500 | 0.1927 | 0.5246 | 0.5012 | 0.5127 | 0.9548 |
| 0.0609 | 4.6566 | 8000 | 0.1891 | 0.5251 | 0.4911 | 0.5076 | 0.9547 |
| 0.0577 | 4.9476 | 8500 | 0.1873 | 0.4978 | 0.5341 | 0.5153 | 0.9546 |
| 0.0451 | 5.2386 | 9000 | 0.2040 | 0.5239 | 0.5302 | 0.5270 | 0.9555 |
| 0.0419 | 5.5297 | 9500 | 0.2065 | 0.5249 | 0.5195 | 0.5222 | 0.9555 |
| 0.0433 | 5.8207 | 10000 | 0.2139 | 0.5171 | 0.5493 | 0.5327 | 0.9556 |
| 0.0392 | 6.1118 | 10500 | 0.2184 | 0.5140 | 0.5578 | 0.5350 | 0.9557 |
| 0.0308 | 6.4028 | 11000 | 0.2159 | 0.5110 | 0.5582 | 0.5335 | 0.9559 |
| 0.0312 | 6.6938 | 11500 | 0.2202 | 0.4900 | 0.5969 | 0.5382 | 0.9541 |
| 0.0296 | 6.9849 | 12000 | 0.2288 | 0.5260 | 0.5260 | 0.5260 | 0.9567 |
| 0.024 | 7.2759 | 12500 | 0.2368 | 0.5330 | 0.5667 | 0.5493 | 0.9572 |
| 0.0232 | 7.5669 | 13000 | 0.2438 | 0.5247 | 0.5399 | 0.5322 | 0.9565 |
| 0.0222 | 7.8580 | 13500 | 0.2483 | 0.5643 | 0.5266 | 0.5448 | 0.9573 |
| 0.0187 | 8.1490 | 14000 | 0.2476 | 0.5615 | 0.5333 | 0.5470 | 0.9567 |
| 0.0176 | 8.4400 | 14500 | 0.2473 | 0.5494 | 0.5445 | 0.5470 | 0.9571 |
| 0.0176 | 8.7311 | 15000 | 0.2452 | 0.5346 | 0.5715 | 0.5524 | 0.9570 |
| 0.0159 | 9.0221 | 15500 | 0.2794 | 0.5324 | 0.5569 | 0.5444 | 0.9577 |
| 0.0131 | 9.3132 | 16000 | 0.2672 | 0.5424 | 0.5917 | 0.5660 | 0.9582 |
| 0.0132 | 9.6042 | 16500 | 0.2716 | 0.5287 | 0.5566 | 0.5423 | 0.9572 |
| 0.0133 | 9.8952 | 17000 | 0.2668 | 0.5267 | 0.5669 | 0.5461 | 0.9569 |
| 0.0113 | 10.1863 | 17500 | 0.2779 | 0.5369 | 0.5770 | 0.5562 | 0.9578 |
| 0.0094 | 10.4773 | 18000 | 0.2717 | 0.5380 | 0.5881 | 0.5619 | 0.9578 |
| 0.0111 | 10.7683 | 18500 | 0.2861 | 0.5582 | 0.5465 | 0.5523 | 0.9587 |
| 0.0094 | 11.0594 | 19000 | 0.2803 | 0.5365 | 0.5833 | 0.5589 | 0.9582 |
| 0.008 | 11.3504 | 19500 | 0.2853 | 0.5262 | 0.5755 | 0.5498 | 0.9575 |
| 0.0077 | 11.6414 | 20000 | 0.2893 | 0.5366 | 0.5806 | 0.5577 | 0.9579 |
| 0.0083 | 11.9325 | 20500 | 0.2898 | 0.5415 | 0.5923 | 0.5657 | 0.9584 |
| 0.0067 | 12.2235 | 21000 | 0.3000 | 0.5635 | 0.5419 | 0.5525 | 0.9582 |
| 0.0066 | 12.5146 | 21500 | 0.3046 | 0.5574 | 0.5643 | 0.5608 | 0.9587 |
| 0.0065 | 12.8056 | 22000 | 0.3063 | 0.5495 | 0.5748 | 0.5619 | 0.9587 |
| 0.0062 | 13.0966 | 22500 | 0.3147 | 0.5619 | 0.5575 | 0.5597 | 0.9585 |
| 0.0056 | 13.3877 | 23000 | 0.3033 | 0.5440 | 0.5836 | 0.5631 | 0.9586 |
| 0.005 | 13.6787 | 23500 | 0.3083 | 0.5567 | 0.5741 | 0.5653 | 0.9585 |
| 0.0051 | 13.9697 | 24000 | 0.3201 | 0.5510 | 0.5891 | 0.5694 | 0.9592 |
| 0.0041 | 14.2608 | 24500 | 0.3265 | 0.5445 | 0.5687 | 0.5563 | 0.9586 |
| 0.0043 | 14.5518 | 25000 | 0.3202 | 0.5634 | 0.5641 | 0.5638 | 0.9586 |
| 0.0045 | 14.8428 | 25500 | 0.3200 | 0.5704 | 0.5677 | 0.5691 | 0.9597 |
| 0.0042 | 15.1339 | 26000 | 0.3285 | 0.5651 | 0.5770 | 0.5710 | 0.9595 |
| 0.0035 | 15.4249 | 26500 | 0.3259 | 0.5575 | 0.5846 | 0.5707 | 0.9590 |
| 0.0038 | 15.7159 | 27000 | 0.3300 | 0.5711 | 0.5685 | 0.5698 | 0.9595 |
| 0.0034 | 16.0070 | 27500 | 0.3244 | 0.5552 | 0.5771 | 0.5660 | 0.9586 |
| 0.003 | 16.2980 | 28000 | 0.3310 | 0.5683 | 0.5778 | 0.5730 | 0.9596 |
| 0.0031 | 16.5891 | 28500 | 0.3295 | 0.5629 | 0.5800 | 0.5713 | 0.9595 |
| 0.0029 | 16.8801 | 29000 | 0.3302 | 0.5396 | 0.5992 | 0.5679 | 0.9584 |
| 0.0028 | 17.1711 | 29500 | 0.3350 | 0.5519 | 0.5826 | 0.5668 | 0.9592 |
| 0.0029 | 17.4622 | 30000 | 0.3269 | 0.5360 | 0.6158 | 0.5732 | 0.9580 |
| 0.0023 | 17.7532 | 30500 | 0.3400 | 0.5731 | 0.5801 | 0.5766 | 0.9598 |
| 0.0022 | 18.0442 | 31000 | 0.3345 | 0.5716 | 0.5649 | 0.5682 | 0.9593 |
| 0.0022 | 18.3353 | 31500 | 0.3301 | 0.5589 | 0.5966 | 0.5771 | 0.9594 |
| 0.002 | 18.6263 | 32000 | 0.3406 | 0.5702 | 0.5774 | 0.5738 | 0.9596 |
| 0.0025 | 18.9173 | 32500 | 0.3422 | 0.5943 | 0.5380 | 0.5647 | 0.9595 |
| 0.0019 | 19.2084 | 33000 | 0.3476 | 0.5783 | 0.5728 | 0.5755 | 0.9600 |
| 0.0019 | 19.4994 | 33500 | 0.3449 | 0.5620 | 0.5910 | 0.5761 | 0.9596 |
| 0.0016 | 19.7905 | 34000 | 0.3518 | 0.5634 | 0.5926 | 0.5776 | 0.9595 |
| 0.0014 | 20.0815 | 34500 | 0.3522 | 0.5633 | 0.5820 | 0.5725 | 0.9598 |
| 0.0014 | 20.3725 | 35000 | 0.3486 | 0.5744 | 0.5739 | 0.5742 | 0.9598 |
| 0.0014 | 20.6636 | 35500 | 0.3513 | 0.5762 | 0.5737 | 0.5749 | 0.9596 |
| 0.0013 | 20.9546 | 36000 | 0.3608 | 0.5406 | 0.5851 | 0.5619 | 0.9586 |
| 0.0015 | 21.2456 | 36500 | 0.3548 | 0.5814 | 0.5696 | 0.5754 | 0.9602 |
| 0.0011 | 21.5367 | 37000 | 0.3552 | 0.5829 | 0.5700 | 0.5764 | 0.9600 |
| 0.0013 | 21.8277 | 37500 | 0.3565 | 0.5623 | 0.5797 | 0.5709 | 0.9593 |
| 0.0014 | 22.1187 | 38000 | 0.3608 | 0.5791 | 0.5693 | 0.5742 | 0.9603 |
| 0.001 | 22.4098 | 38500 | 0.3534 | 0.5706 | 0.5914 | 0.5808 | 0.9601 |
| 0.001 | 22.7008 | 39000 | 0.3639 | 0.5887 | 0.5719 | 0.5802 | 0.9603 |
| 0.0009 | 22.9919 | 39500 | 0.3650 | 0.5679 | 0.5898 | 0.5787 | 0.9597 |
| 0.0008 | 23.2829 | 40000 | 0.3676 | 0.5815 | 0.5744 | 0.5779 | 0.9602 |
| 0.0007 | 23.5739 | 40500 | 0.3738 | 0.5944 | 0.5680 | 0.5809 | 0.9606 |
| 0.001 | 23.8650 | 41000 | 0.3700 | 0.5804 | 0.5735 | 0.5769 | 0.9597 |
| 0.0009 | 24.1560 | 41500 | 0.3706 | 0.5774 | 0.5768 | 0.5771 | 0.9601 |
| 0.0008 | 24.4470 | 42000 | 0.3696 | 0.5838 | 0.5731 | 0.5784 | 0.9604 |
| 0.0007 | 24.7381 | 42500 | 0.3737 | 0.5656 | 0.5862 | 0.5757 | 0.9598 |
| 0.0007 | 25.0291 | 43000 | 0.3756 | 0.5617 | 0.5871 | 0.5741 | 0.9594 |
| 0.0006 | 25.3201 | 43500 | 0.3757 | 0.5668 | 0.5885 | 0.5774 | 0.9593 |
| 0.0004 | 25.6112 | 44000 | 0.3783 | 0.5865 | 0.5708 | 0.5785 | 0.9602 |
| 0.0006 | 25.9022 | 44500 | 0.3688 | 0.5724 | 0.5902 | 0.5812 | 0.9600 |
| 0.0004 | 26.1932 | 45000 | 0.3783 | 0.5851 | 0.5787 | 0.5819 | 0.9605 |
| 0.0004 | 26.4843 | 45500 | 0.3809 | 0.5773 | 0.5780 | 0.5776 | 0.9601 |
| 0.0004 | 26.7753 | 46000 | 0.3816 | 0.5823 | 0.5803 | 0.5813 | 0.9605 |
| 0.0004 | 27.0664 | 46500 | 0.3887 | 0.5762 | 0.5820 | 0.5791 | 0.9601 |
| 0.0003 | 27.3574 | 47000 | 0.3833 | 0.5834 | 0.5869 | 0.5852 | 0.9605 |
| 0.0004 | 27.6484 | 47500 | 0.3867 | 0.5791 | 0.5891 | 0.5840 | 0.9605 |
| 0.0004 | 27.9395 | 48000 | 0.3876 | 0.5814 | 0.5856 | 0.5835 | 0.9605 |
| 0.0003 | 28.2305 | 48500 | 0.3918 | 0.5810 | 0.5742 | 0.5776 | 0.9605 |
| 0.0003 | 28.5215 | 49000 | 0.3869 | 0.5804 | 0.5826 | 0.5815 | 0.9606 |
| 0.0003 | 28.8126 | 49500 | 0.3864 | 0.5760 | 0.5871 | 0.5815 | 0.9604 |
| 0.0004 | 29.1036 | 50000 | 0.3840 | 0.5732 | 0.5887 | 0.5808 | 0.9604 |
| 0.0003 | 29.3946 | 50500 | 0.3864 | 0.5833 | 0.5784 | 0.5808 | 0.9606 |
| 0.0002 | 29.6857 | 51000 | 0.3852 | 0.5781 | 0.5842 | 0.5811 | 0.9604 |
| 0.0002 | 29.9767 | 51500 | 0.3854 | 0.5785 | 0.5835 | 0.5810 | 0.9605 |


### Framework versions

- Transformers 4.44.2
- PyTorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,46 @@
{
  "_name_or_path": "FacebookAI/xlm-roberta-base",
  "architectures": [
    "XLMRobertaForTokenClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0",
    "1": "LABEL_1",
    "2": "LABEL_2",
    "3": "LABEL_3",
    "4": "LABEL_4",
    "5": "LABEL_5",
    "6": "LABEL_6"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0,
    "LABEL_1": 1,
    "LABEL_2": 2,
    "LABEL_3": 3,
    "LABEL_4": 4,
    "LABEL_5": 5,
    "LABEL_6": 6
  },
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "xlm-roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 6,
  "output_past": true,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 250002
}
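Note that the config specifies `num_hidden_layers: 6` (half the depth of stock XLM-R base) and only generic `LABEL_0`–`LABEL_6` class names. The label maps can be overridden at load time; a sketch, where the tag set is a hypothetical placeholder since the true mapping is not recorded in this commit:

```python
from transformers import AutoConfig, AutoModelForTokenClassification

model_id = "haryoaw/scenario-non-kd-scr-ner-full-xlmr_data-univner_full44"

# Hypothetical tag set: seven classes would fit BIO tags over PER/ORG/LOC
# plus "O", but the true mapping is not stated anywhere in these files.
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

config = AutoConfig.from_pretrained(model_id)
config.id2label = dict(enumerate(labels))
config.label2id = {label: i for i, label in enumerate(labels)}

model = AutoModelForTokenClassification.from_pretrained(model_id, config=config)
```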
eval_result_ner.json ADDED
@@ -0,0 +1 @@
{"ceb_gja": {"precision": 0.34146341463414637, "recall": 0.5714285714285714, "f1": 0.42748091603053434, "accuracy": 0.9420849420849421}, "en_pud": {"precision": 0.47854077253218885, "recall": 0.41488372093023257, "f1": 0.4444444444444445, "accuracy": 0.947912731394031}, "de_pud": {"precision": 0.10958395245170877, "recall": 0.28392685274302215, "f1": 0.1581345483784508, "accuracy": 0.8299657775069148}, "pt_pud": {"precision": 0.5428571428571428, "recall": 0.5359417652411284, "f1": 0.5393772893772893, "accuracy": 0.9595420173452386}, "ru_pud": {"precision": 0.017678927858568578, "recall": 0.059845559845559844, "f1": 0.027294739159145938, "accuracy": 0.6978041849651253}, "sv_pud": {"precision": 0.5131195335276968, "recall": 0.34207968901846453, "f1": 0.41049562682215746, "accuracy": 0.9459530299853218}, "tl_trg": {"precision": 0.26785714285714285, "recall": 0.6521739130434783, "f1": 0.379746835443038, "accuracy": 0.9359673024523161}, "tl_ugnayan": {"precision": 0.06349206349206349, "recall": 0.12121212121212122, "f1": 0.08333333333333333, "accuracy": 0.9115770282588879}, "zh_gsd": {"precision": 0.4869451697127937, "recall": 0.4863102998696219, "f1": 0.4866275277234181, "accuracy": 0.9309024309024309}, "zh_gsdsimp": {"precision": 0.5097024579560155, "recall": 0.5163826998689384, "f1": 0.5130208333333334, "accuracy": 0.933982683982684}, "hr_set": {"precision": 0.7109428768066071, "recall": 0.7362794012829651, "f1": 0.7233893557422969, "accuracy": 0.9689200329760923}, "da_ddt": {"precision": 0.6278481012658228, "recall": 0.5548098434004475, "f1": 0.5890736342042755, "accuracy": 0.9703681532475307}, "en_ewt": {"precision": 0.5821989528795811, "recall": 0.5110294117647058, "f1": 0.544297601566324, "accuracy": 0.9587600111567119}, "pt_bosque": {"precision": 0.5980629539951574, "recall": 0.6098765432098765, "f1": 0.6039119804400979, "accuracy": 0.964606578756702}, "sr_set": {"precision": 0.7525655644241733, "recall": 0.7792207792207793, "f1": 0.7656612529002321, "accuracy": 0.9677786533578496}, "sk_snk": {"precision": 0.3911719939117199, "recall": 0.28087431693989073, "f1": 0.3269720101781171, "accuracy": 0.9172424623115578}, "sv_talbanken": {"precision": 0.7073170731707317, "recall": 0.5918367346938775, "f1": 0.6444444444444445, "accuracy": 0.9937184080090298}}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e1029fb696455d98f4156018d8aec8b4e8e023ca82e935f9c5a76697968021f6
size 939737140
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:bd5a637004b1fdc6106de5654b411ea7fe154bfd70a1ff35346e629d601c78b4
size 5304