haryoaw committed on
Commit 6c7c418
1 Parent(s): 7820fc7

Initial Commit

Files changed (4)
  1. README.md +64 -59
  2. eval_result_ner.json +1 -1
  3. model.safetensors +1 -1
  4. training_args.bin +1 -1
README.md CHANGED
@@ -1,14 +1,14 @@
 ---
-base_model: FacebookAI/xlm-roberta-base
 library_name: transformers
 license: mit
+base_model: FacebookAI/xlm-roberta-base
+tags:
+- generated_from_trainer
 metrics:
 - precision
 - recall
 - f1
 - accuracy
-tags:
-- generated_from_trainer
 model-index:
 - name: scenario-kd-pre-ner-full_data-univner_full55
   results: []
@@ -21,11 +21,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4361
-- Precision: 0.8002
-- Recall: 0.7742
-- F1: 0.7870
-- Accuracy: 0.9784
+- Loss: 0.4412
+- Precision: 0.8211
+- Recall: 0.8091
+- F1: 0.8151
+- Accuracy: 0.9808
 
 ## Model description
 
@@ -56,57 +56,62 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
-| 1.491 | 0.5828 | 500 | 0.8224 | 0.7083 | 0.6591 | 0.6828 | 0.9695 |
-| 0.7215 | 1.1655 | 1000 | 0.7072 | 0.6822 | 0.7477 | 0.7134 | 0.9723 |
-| 0.5905 | 1.7483 | 1500 | 0.6551 | 0.7162 | 0.7595 | 0.7372 | 0.9742 |
-| 0.4917 | 2.3310 | 2000 | 0.6188 | 0.7497 | 0.7477 | 0.7487 | 0.9745 |
-| 0.4427 | 2.9138 | 2500 | 0.5807 | 0.7540 | 0.7404 | 0.7472 | 0.9746 |
-| 0.3833 | 3.4965 | 3000 | 0.5559 | 0.7668 | 0.7389 | 0.7526 | 0.9758 |
-| 0.3593 | 4.0793 | 3500 | 0.5443 | 0.7843 | 0.7374 | 0.7601 | 0.9758 |
-| 0.3224 | 4.6620 | 4000 | 0.5436 | 0.7700 | 0.7593 | 0.7646 | 0.9762 |
-| 0.3045 | 5.2448 | 4500 | 0.5263 | 0.7678 | 0.7402 | 0.7537 | 0.9759 |
-| 0.2861 | 5.8275 | 5000 | 0.5152 | 0.7741 | 0.7667 | 0.7704 | 0.9767 |
-| 0.2655 | 6.4103 | 5500 | 0.5137 | 0.7786 | 0.7748 | 0.7767 | 0.9774 |
-| 0.2565 | 6.9930 | 6000 | 0.5072 | 0.7848 | 0.7579 | 0.7711 | 0.9770 |
-| 0.2374 | 7.5758 | 6500 | 0.5084 | 0.7926 | 0.7505 | 0.7710 | 0.9767 |
-| 0.2347 | 8.1585 | 7000 | 0.5015 | 0.7884 | 0.7474 | 0.7674 | 0.9769 |
-| 0.2233 | 8.7413 | 7500 | 0.4902 | 0.7811 | 0.7696 | 0.7753 | 0.9772 |
-| 0.2111 | 9.3240 | 8000 | 0.4921 | 0.7836 | 0.7582 | 0.7707 | 0.9770 |
-| 0.2081 | 9.9068 | 8500 | 0.4856 | 0.7826 | 0.7700 | 0.7762 | 0.9777 |
-| 0.1987 | 10.4895 | 9000 | 0.4814 | 0.7796 | 0.7756 | 0.7776 | 0.9775 |
-| 0.1945 | 11.0723 | 9500 | 0.4864 | 0.7816 | 0.7668 | 0.7742 | 0.9773 |
-| 0.1867 | 11.6550 | 10000 | 0.4797 | 0.7925 | 0.7676 | 0.7798 | 0.9775 |
-| 0.1804 | 12.2378 | 10500 | 0.4711 | 0.7826 | 0.7810 | 0.7818 | 0.9779 |
-| 0.1775 | 12.8205 | 11000 | 0.4759 | 0.7808 | 0.7713 | 0.7760 | 0.9773 |
-| 0.1697 | 13.4033 | 11500 | 0.4675 | 0.7824 | 0.7801 | 0.7812 | 0.9774 |
-| 0.1704 | 13.9860 | 12000 | 0.4587 | 0.7937 | 0.7769 | 0.7852 | 0.9780 |
-| 0.1625 | 14.5688 | 12500 | 0.4680 | 0.7943 | 0.7748 | 0.7844 | 0.9782 |
-| 0.1606 | 15.1515 | 13000 | 0.4604 | 0.7932 | 0.7719 | 0.7824 | 0.9777 |
-| 0.1563 | 15.7343 | 13500 | 0.4633 | 0.7981 | 0.7699 | 0.7837 | 0.9779 |
-| 0.1528 | 16.3170 | 14000 | 0.4614 | 0.7987 | 0.7775 | 0.7880 | 0.9781 |
-| 0.1503 | 16.8998 | 14500 | 0.4547 | 0.7915 | 0.7749 | 0.7831 | 0.9782 |
-| 0.1473 | 17.4825 | 15000 | 0.4631 | 0.8024 | 0.7664 | 0.7840 | 0.9776 |
-| 0.146 | 18.0653 | 15500 | 0.4630 | 0.7970 | 0.7579 | 0.7770 | 0.9775 |
-| 0.1431 | 18.6480 | 16000 | 0.4498 | 0.7916 | 0.7749 | 0.7832 | 0.9778 |
-| 0.14 | 19.2308 | 16500 | 0.4518 | 0.7990 | 0.7676 | 0.7830 | 0.9779 |
-| 0.1377 | 19.8135 | 17000 | 0.4508 | 0.7922 | 0.7794 | 0.7857 | 0.9779 |
-| 0.1356 | 20.3963 | 17500 | 0.4538 | 0.8019 | 0.7664 | 0.7838 | 0.9775 |
-| 0.1351 | 20.9790 | 18000 | 0.4567 | 0.7865 | 0.7765 | 0.7815 | 0.9779 |
-| 0.1325 | 21.5618 | 18500 | 0.4516 | 0.7987 | 0.7705 | 0.7843 | 0.9779 |
-| 0.1305 | 22.1445 | 19000 | 0.4468 | 0.8033 | 0.7735 | 0.7881 | 0.9784 |
-| 0.1298 | 22.7273 | 19500 | 0.4453 | 0.8065 | 0.7651 | 0.7853 | 0.9777 |
-| 0.1274 | 23.3100 | 20000 | 0.4471 | 0.8013 | 0.7726 | 0.7867 | 0.9783 |
-| 0.1271 | 23.8928 | 20500 | 0.4441 | 0.7957 | 0.7813 | 0.7884 | 0.9781 |
-| 0.1257 | 24.4755 | 21000 | 0.4431 | 0.8030 | 0.7768 | 0.7897 | 0.9784 |
-| 0.1247 | 25.0583 | 21500 | 0.4459 | 0.7954 | 0.7741 | 0.7846 | 0.9784 |
-| 0.1232 | 25.6410 | 22000 | 0.4471 | 0.8013 | 0.7710 | 0.7859 | 0.9781 |
-| 0.1225 | 26.2238 | 22500 | 0.4436 | 0.7931 | 0.7706 | 0.7817 | 0.9776 |
-| 0.1225 | 26.8065 | 23000 | 0.4394 | 0.7926 | 0.7716 | 0.7820 | 0.9781 |
-| 0.1211 | 27.3893 | 23500 | 0.4427 | 0.8010 | 0.7710 | 0.7857 | 0.9779 |
-| 0.1199 | 27.9720 | 24000 | 0.4377 | 0.8042 | 0.7739 | 0.7888 | 0.9783 |
-| 0.1204 | 28.5548 | 24500 | 0.4408 | 0.8048 | 0.7777 | 0.7910 | 0.9785 |
-| 0.1191 | 29.1375 | 25000 | 0.4390 | 0.8003 | 0.7692 | 0.7844 | 0.9782 |
-| 0.1187 | 29.7203 | 25500 | 0.4361 | 0.8002 | 0.7742 | 0.7870 | 0.9784 |
+| 1.3575 | 0.2910 | 500 | 0.9103 | 0.6705 | 0.6653 | 0.6679 | 0.9689 |
+| 0.7321 | 0.5821 | 1000 | 0.7344 | 0.7247 | 0.7273 | 0.7260 | 0.9738 |
+| 0.6472 | 0.8731 | 1500 | 0.6643 | 0.7405 | 0.7642 | 0.7522 | 0.9759 |
+| 0.5635 | 1.1641 | 2000 | 0.6244 | 0.7627 | 0.7720 | 0.7673 | 0.9775 |
+| 0.4932 | 1.4552 | 2500 | 0.6102 | 0.7445 | 0.7855 | 0.7644 | 0.9760 |
+| 0.4871 | 1.7462 | 3000 | 0.5773 | 0.7682 | 0.7847 | 0.7764 | 0.9778 |
+| 0.4543 | 2.0373 | 3500 | 0.5692 | 0.7888 | 0.7834 | 0.7861 | 0.9786 |
+| 0.4077 | 2.3283 | 4000 | 0.5501 | 0.7671 | 0.8003 | 0.7834 | 0.9785 |
+| 0.3882 | 2.6193 | 4500 | 0.5512 | 0.7822 | 0.7831 | 0.7827 | 0.9784 |
+| 0.3826 | 2.9104 | 5000 | 0.5284 | 0.7860 | 0.7934 | 0.7897 | 0.9789 |
+| 0.3527 | 3.2014 | 5500 | 0.5283 | 0.7854 | 0.7984 | 0.7919 | 0.9793 |
+| 0.3353 | 3.4924 | 6000 | 0.5180 | 0.7964 | 0.8023 | 0.7993 | 0.9794 |
+| 0.3336 | 3.7835 | 6500 | 0.5079 | 0.7831 | 0.8042 | 0.7935 | 0.9792 |
+| 0.3176 | 4.0745 | 7000 | 0.4999 | 0.7927 | 0.8140 | 0.8032 | 0.9798 |
+| 0.2974 | 4.3655 | 7500 | 0.4975 | 0.8068 | 0.8044 | 0.8056 | 0.9797 |
+| 0.2932 | 4.6566 | 8000 | 0.5007 | 0.7983 | 0.7917 | 0.7950 | 0.9792 |
+| 0.291 | 4.9476 | 8500 | 0.5011 | 0.7919 | 0.7979 | 0.7949 | 0.9788 |
+| 0.2684 | 5.2386 | 9000 | 0.5011 | 0.8014 | 0.8032 | 0.8023 | 0.9801 |
+| 0.2636 | 5.5297 | 9500 | 0.4938 | 0.8079 | 0.7943 | 0.8010 | 0.9796 |
+| 0.2636 | 5.8207 | 10000 | 0.4924 | 0.8067 | 0.8009 | 0.8038 | 0.9800 |
+| 0.255 | 6.1118 | 10500 | 0.4796 | 0.7997 | 0.8075 | 0.8036 | 0.9804 |
+| 0.2417 | 6.4028 | 11000 | 0.4982 | 0.8030 | 0.7990 | 0.8010 | 0.9796 |
+| 0.2423 | 6.6938 | 11500 | 0.4827 | 0.7932 | 0.8129 | 0.8029 | 0.9797 |
+| 0.2377 | 6.9849 | 12000 | 0.4774 | 0.8135 | 0.8080 | 0.8107 | 0.9805 |
+| 0.2208 | 7.2759 | 12500 | 0.4759 | 0.8157 | 0.8078 | 0.8117 | 0.9809 |
+| 0.2228 | 7.5669 | 13000 | 0.4669 | 0.8140 | 0.8139 | 0.8139 | 0.9808 |
+| 0.2224 | 7.8580 | 13500 | 0.4762 | 0.8111 | 0.8088 | 0.8099 | 0.9806 |
+| 0.2154 | 8.1490 | 14000 | 0.4756 | 0.8163 | 0.8085 | 0.8124 | 0.9806 |
+| 0.2057 | 8.4400 | 14500 | 0.4751 | 0.8127 | 0.8097 | 0.8112 | 0.9805 |
+| 0.2072 | 8.7311 | 15000 | 0.4678 | 0.8035 | 0.8146 | 0.8090 | 0.9803 |
+| 0.2023 | 9.0221 | 15500 | 0.4678 | 0.8213 | 0.8065 | 0.8139 | 0.9805 |
+| 0.1951 | 9.3132 | 16000 | 0.4665 | 0.7996 | 0.8096 | 0.8046 | 0.9802 |
+| 0.1928 | 9.6042 | 16500 | 0.4695 | 0.8157 | 0.8106 | 0.8131 | 0.9805 |
+| 0.1925 | 9.8952 | 17000 | 0.4607 | 0.8112 | 0.8127 | 0.8120 | 0.9805 |
+| 0.1876 | 10.1863 | 17500 | 0.4573 | 0.8087 | 0.8247 | 0.8166 | 0.9811 |
+| 0.1825 | 10.4773 | 18000 | 0.4520 | 0.8147 | 0.8293 | 0.8220 | 0.9817 |
+| 0.1796 | 10.7683 | 18500 | 0.4566 | 0.8137 | 0.8146 | 0.8141 | 0.9807 |
+| 0.1809 | 11.0594 | 19000 | 0.4524 | 0.8231 | 0.8137 | 0.8184 | 0.9810 |
+| 0.1704 | 11.3504 | 19500 | 0.4593 | 0.8130 | 0.8156 | 0.8143 | 0.9809 |
+| 0.1729 | 11.6414 | 20000 | 0.4549 | 0.8225 | 0.8075 | 0.8149 | 0.9809 |
+| 0.173 | 11.9325 | 20500 | 0.4620 | 0.8166 | 0.8166 | 0.8166 | 0.9809 |
+| 0.1656 | 12.2235 | 21000 | 0.4467 | 0.8015 | 0.8070 | 0.8042 | 0.9804 |
+| 0.1623 | 12.5146 | 21500 | 0.4504 | 0.8139 | 0.8247 | 0.8193 | 0.9813 |
+| 0.1651 | 12.8056 | 22000 | 0.4496 | 0.8208 | 0.8142 | 0.8175 | 0.9809 |
+| 0.1595 | 13.0966 | 22500 | 0.4448 | 0.8141 | 0.8172 | 0.8157 | 0.9810 |
+| 0.1561 | 13.3877 | 23000 | 0.4496 | 0.8187 | 0.8162 | 0.8174 | 0.9811 |
+| 0.1576 | 13.6787 | 23500 | 0.4509 | 0.8198 | 0.8124 | 0.8161 | 0.9810 |
+| 0.1563 | 13.9697 | 24000 | 0.4445 | 0.8205 | 0.8119 | 0.8162 | 0.9809 |
+| 0.15 | 14.2608 | 24500 | 0.4398 | 0.8179 | 0.8152 | 0.8165 | 0.9812 |
+| 0.153 | 14.5518 | 25000 | 0.4460 | 0.8281 | 0.8071 | 0.8175 | 0.9811 |
+| 0.1482 | 14.8428 | 25500 | 0.4480 | 0.8246 | 0.8145 | 0.8195 | 0.9809 |
+| 0.1485 | 15.1339 | 26000 | 0.4438 | 0.8199 | 0.8175 | 0.8187 | 0.9810 |
+| 0.1449 | 15.4249 | 26500 | 0.4423 | 0.8216 | 0.8106 | 0.8160 | 0.9808 |
+| 0.1455 | 15.7159 | 27000 | 0.4440 | 0.8181 | 0.8078 | 0.8129 | 0.9807 |
+| 0.1438 | 16.0070 | 27500 | 0.4437 | 0.8298 | 0.8119 | 0.8207 | 0.9812 |
+| 0.1396 | 16.2980 | 28000 | 0.4412 | 0.8211 | 0.8091 | 0.8151 | 0.9808 |
 
 
 ### Framework versions
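The card above describes a token-classification (NER) checkpoint distilled from XLM-R. A minimal inference sketch, assuming the checkpoint loads with the stock `AutoModelForTokenClassification` head and that the Hub repo id matches the model-index name (neither is confirmed by this diff):

```python
# Hypothetical usage sketch -- repo id and standard NER head are assumptions.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="haryoaw/scenario-kd-pre-ner-full_data-univner_full55",
    aggregation_strategy="simple",  # merge subword pieces into whole entity spans
)

print(ner("Angela Merkel visited Jakarta in 2015."))
# Each hit is a dict with entity_group, score, word, start, end.
```

If the run used a custom distillation wrapper rather than the stock head, loading would instead go through the project's own model class.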
eval_result_ner.json CHANGED
@@ -1 +1 @@
-{"ceb_gja": {"precision": 0.5, "recall": 0.6122448979591837, "f1": 0.5504587155963303, "accuracy": 0.9598455598455599}, "en_pud": {"precision": 0.7825242718446602, "recall": 0.7497674418604651, "f1": 0.7657957244655581, "accuracy": 0.9777578390630903}, "de_pud": {"precision": 0.768920282542886, "recall": 0.7333974975938402, "f1": 0.7507389162561576, "accuracy": 0.9729032862969387}, "pt_pud": {"precision": 0.8147792706333973, "recall": 0.7725204731574158, "f1": 0.7930873423633815, "accuracy": 0.9798778143290469}, "ru_pud": {"precision": 0.6817743490838959, "recall": 0.6824324324324325, "f1": 0.6821032320308732, "accuracy": 0.9677602686644278}, "sv_pud": {"precision": 0.8353783231083844, "recall": 0.793974732750243, "f1": 0.8141504733432984, "accuracy": 0.9813902285594465}, "tl_trg": {"precision": 0.7916666666666666, "recall": 0.8260869565217391, "f1": 0.8085106382978724, "accuracy": 0.989100817438692}, "tl_ugnayan": {"precision": 0.5641025641025641, "recall": 0.6666666666666666, "f1": 0.611111111111111, "accuracy": 0.9690063810391978}, "zh_gsd": {"precision": 0.8253333333333334, "recall": 0.8070404172099087, "f1": 0.8160843770599867, "accuracy": 0.9731934731934732}, "zh_gsdsimp": {"precision": 0.8169582772543742, "recall": 0.7955439056356488, "f1": 0.8061088977423638, "accuracy": 0.9729437229437229}, "hr_set": {"precision": 0.883298392732355, "recall": 0.9009265858873842, "f1": 0.8920254057868737, "accuracy": 0.9871393239901072}, "da_ddt": {"precision": 0.8075, "recall": 0.7225950782997763, "f1": 0.7626918536009446, "accuracy": 0.9823406165818617}, "en_ewt": {"precision": 0.8160804020100503, "recall": 0.7463235294117647, "f1": 0.7796447431589055, "accuracy": 0.9774076582858509}, "pt_bosque": {"precision": 0.766269477543538, "recall": 0.6880658436213992, "f1": 0.7250650477016479, "accuracy": 0.974967396029561}, "sr_set": {"precision": 0.9122182680901542, "recall": 0.9079102715466352, "f1": 0.9100591715976329, "accuracy": 0.9867787409158567}, "sk_snk": {"precision": 0.700228832951945, "recall": 0.6688524590163935, "f1": 0.6841811067635551, "accuracy": 0.9583856783919598}, "sv_talbanken": {"precision": 0.8620689655172413, "recall": 0.8928571428571429, "f1": 0.8771929824561403, "accuracy": 0.9975953280659567}}
+{"ceb_gja": {"precision": 0.5277777777777778, "recall": 0.7755102040816326, "f1": 0.628099173553719, "accuracy": 0.9667953667953668}, "en_pud": {"precision": 0.7726001863932899, "recall": 0.7711627906976745, "f1": 0.7718808193668529, "accuracy": 0.9782772950510011}, "de_pud": {"precision": 0.7163375224416517, "recall": 0.7680461982675649, "f1": 0.7412912215513239, "accuracy": 0.9727157657868829}, "pt_pud": {"precision": 0.813953488372093, "recall": 0.8598726114649682, "f1": 0.836283185840708, "accuracy": 0.9848763190498568}, "ru_pud": {"precision": 0.6829727187206021, "recall": 0.7007722007722008, "f1": 0.6917579799904717, "accuracy": 0.968741927150607}, "sv_pud": {"precision": 0.8398009950248756, "recall": 0.8202137998056366, "f1": 0.8298918387413963, "accuracy": 0.9834346823233382}, "tl_trg": {"precision": 0.7272727272727273, "recall": 0.6956521739130435, "f1": 0.711111111111111, "accuracy": 0.9863760217983651}, "tl_ugnayan": {"precision": 0.6, "recall": 0.7272727272727273, "f1": 0.6575342465753425, "accuracy": 0.9735642661804923}, "zh_gsd": {"precision": 0.8, "recall": 0.8083441981747066, "f1": 0.8041504539559015, "accuracy": 0.9736097236097236}, "zh_gsdsimp": {"precision": 0.7984293193717278, "recall": 0.799475753604194, "f1": 0.7989521938441388, "accuracy": 0.9734432234432234}, "hr_set": {"precision": 0.8806896551724138, "recall": 0.910192444761226, "f1": 0.8951980371538731, "accuracy": 0.9875927452596868}, "da_ddt": {"precision": 0.8558139534883721, "recall": 0.8232662192393736, "f1": 0.8392246294184721, "accuracy": 0.9881273071934551}, "en_ewt": {"precision": 0.7678227360308285, "recall": 0.7325367647058824, "f1": 0.7497648165569145, "accuracy": 0.9761326054906961}, "pt_bosque": {"precision": 0.8447154471544716, "recall": 0.8551440329218107, "f1": 0.8498977505112475, "accuracy": 0.9859078394435589}, "sr_set": {"precision": 0.9241706161137441, "recall": 0.9208972845336482, "f1": 0.9225310467179184, "accuracy": 0.9883547850450923}, "sk_snk": {"precision": 0.7613122171945701, "recall": 0.7355191256830601, "f1": 0.7481934408004448, "accuracy": 0.9659233668341709}, "sv_talbanken": {"precision": 0.8037383177570093, "recall": 0.8775510204081632, "f1": 0.8390243902439024, "accuracy": 0.9970064288168032}}
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:643f7b4cf904671e5d0ec0771a4bc8a989162e296d51aee05df95c762b2c4f5d
+oid sha256:625c7639248af8b0a841fb47f36ef96c61015a3e7a80a0ad1cecd74e54c50e6f
 size 939737140
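Only the Git LFS pointer changes here: same spec version and byte size, new sha256 object id for the retrained weights. After downloading the actual `model.safetensors`, the blob can be checked against the pointer's `oid` field; a sketch, with the local path as an assumption:

```python
import hashlib

# sha256 from the new LFS pointer above.
EXPECTED = "625c7639248af8b0a841fb47f36ef96c61015a3e7a80a0ad1cecd74e54c50e6f"

h = hashlib.sha256()
with open("model.safetensors", "rb") as f:  # local path is an assumption
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

assert h.hexdigest() == EXPECTED, "checksum mismatch with the LFS pointer"
print("model.safetensors matches the committed pointer")
```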
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e454b36d42c5bb4158593fff1b7a25eb0d1df5d05f30bf2638a4127eb3903afb
+oid sha256:dfd55ffdaee17acabce138d8a4aae3a492347dce4be84be815d157c9f4cd8f02
 size 5304
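`training_args.bin` is the pickled `TrainingArguments` object that the `transformers` `Trainer` saves alongside a run, so the hyperparameters behind the training table above can be inspected directly. A sketch; note that unpickling executes code, so only do this for a repo you trust, and that recent PyTorch versions need `weights_only=False` here:

```python
import torch

# Pickled transformers.TrainingArguments, not a plain tensor file,
# hence weights_only=False. Unpickle only from trusted repos.
args = torch.load("training_args.bin", weights_only=False)

print(args.learning_rate, args.num_train_epochs, args.per_device_train_batch_size)
```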