haryoaw committed
Commit a556fb9
1 Parent(s): 4ad05b2

Initial Commit

Files changed (4)
  1. README.md +58 -58
  2. eval_result_ner.json +1 -1
  3. model.safetensors +1 -1
  4. training_args.bin +1 -1
README.md CHANGED
@@ -1,14 +1,14 @@
  ---
- base_model: FacebookAI/xlm-roberta-base
  library_name: transformers
  license: mit
+ base_model: FacebookAI/xlm-roberta-base
+ tags:
+ - generated_from_trainer
  metrics:
  - precision
  - recall
  - f1
  - accuracy
- tags:
- - generated_from_trainer
  model-index:
  - name: scenario-non-kd-pre-ner-full-xlmr_data-univner_half44
  results: []
@@ -21,10 +21,10 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.1692
- - Precision: 0.7997
- - Recall: 0.8083
- - F1: 0.8040
+ - Loss: 0.1667
+ - Precision: 0.8
+ - Recall: 0.8085
+ - F1: 0.8042
  - Accuracy: 0.9794

  ## Model description
@@ -56,57 +56,57 @@ The following hyperparameters were used during training:

  | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
  |:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | 0.1456 | 0.5828 | 500 | 0.0795 | 0.7321 | 0.7445 | 0.7383 | 0.9744 |
- | 0.0686 | 1.1655 | 1000 | 0.0786 | 0.7478 | 0.7850 | 0.7660 | 0.9763 |
- | 0.0505 | 1.7483 | 1500 | 0.0765 | 0.7499 | 0.8058 | 0.7768 | 0.9775 |
- | 0.039 | 2.3310 | 2000 | 0.0850 | 0.7407 | 0.8025 | 0.7704 | 0.9759 |
- | 0.0308 | 2.9138 | 2500 | 0.0935 | 0.7340 | 0.8178 | 0.7736 | 0.9755 |
- | 0.0222 | 3.4965 | 3000 | 0.0941 | 0.7587 | 0.8097 | 0.7834 | 0.9776 |
- | 0.0202 | 4.0793 | 3500 | 0.0978 | 0.7615 | 0.8139 | 0.7868 | 0.9781 |
- | 0.0149 | 4.6620 | 4000 | 0.1010 | 0.7738 | 0.8039 | 0.7886 | 0.9780 |
- | 0.0134 | 5.2448 | 4500 | 0.1072 | 0.7917 | 0.7973 | 0.7945 | 0.9791 |
- | 0.0112 | 5.8275 | 5000 | 0.1023 | 0.7866 | 0.7948 | 0.7907 | 0.9789 |
- | 0.0091 | 6.4103 | 5500 | 0.1151 | 0.7765 | 0.8083 | 0.7921 | 0.9784 |
- | 0.0088 | 6.9930 | 6000 | 0.1144 | 0.7838 | 0.7980 | 0.7908 | 0.9785 |
- | 0.0077 | 7.5758 | 6500 | 0.1150 | 0.7748 | 0.7979 | 0.7862 | 0.9780 |
- | 0.0067 | 8.1585 | 7000 | 0.1191 | 0.7889 | 0.7960 | 0.7924 | 0.9787 |
- | 0.0059 | 8.7413 | 7500 | 0.1269 | 0.7703 | 0.8140 | 0.7916 | 0.9780 |
- | 0.0053 | 9.3240 | 8000 | 0.1267 | 0.7857 | 0.7944 | 0.7900 | 0.9784 |
- | 0.0053 | 9.9068 | 8500 | 0.1273 | 0.7957 | 0.7928 | 0.7942 | 0.9788 |
- | 0.0043 | 10.4895 | 9000 | 0.1321 | 0.7772 | 0.7990 | 0.7879 | 0.9782 |
- | 0.0039 | 11.0723 | 9500 | 0.1313 | 0.7940 | 0.7945 | 0.7943 | 0.9789 |
- | 0.0033 | 11.6550 | 10000 | 0.1361 | 0.7964 | 0.8031 | 0.7997 | 0.9788 |
- | 0.0033 | 12.2378 | 10500 | 0.1394 | 0.7828 | 0.8057 | 0.7941 | 0.9785 |
- | 0.0032 | 12.8205 | 11000 | 0.1445 | 0.7760 | 0.7928 | 0.7843 | 0.9781 |
- | 0.0029 | 13.4033 | 11500 | 0.1358 | 0.7833 | 0.8083 | 0.7956 | 0.9789 |
- | 0.003 | 13.9860 | 12000 | 0.1383 | 0.7830 | 0.8031 | 0.7929 | 0.9787 |
- | 0.0024 | 14.5688 | 12500 | 0.1403 | 0.7731 | 0.8130 | 0.7925 | 0.9779 |
- | 0.0028 | 15.1515 | 13000 | 0.1423 | 0.7998 | 0.7956 | 0.7977 | 0.9789 |
- | 0.0021 | 15.7343 | 13500 | 0.1443 | 0.7831 | 0.8062 | 0.7945 | 0.9785 |
- | 0.002 | 16.3170 | 14000 | 0.1396 | 0.7815 | 0.8088 | 0.7950 | 0.9785 |
- | 0.0018 | 16.8998 | 14500 | 0.1469 | 0.7808 | 0.8072 | 0.7938 | 0.9783 |
- | 0.0019 | 17.4825 | 15000 | 0.1450 | 0.7969 | 0.8029 | 0.7999 | 0.9790 |
- | 0.0018 | 18.0653 | 15500 | 0.1516 | 0.7906 | 0.8098 | 0.8001 | 0.9788 |
- | 0.0011 | 18.6480 | 16000 | 0.1591 | 0.7810 | 0.8093 | 0.7949 | 0.9787 |
- | 0.0014 | 19.2308 | 16500 | 0.1561 | 0.8002 | 0.8051 | 0.8026 | 0.9791 |
- | 0.0011 | 19.8135 | 17000 | 0.1568 | 0.7901 | 0.8093 | 0.7996 | 0.9788 |
- | 0.0013 | 20.3963 | 17500 | 0.1573 | 0.7980 | 0.7950 | 0.7965 | 0.9790 |
- | 0.001 | 20.9790 | 18000 | 0.1566 | 0.7987 | 0.7915 | 0.7951 | 0.9787 |
- | 0.001 | 21.5618 | 18500 | 0.1594 | 0.7926 | 0.8041 | 0.7983 | 0.9787 |
- | 0.001 | 22.1445 | 19000 | 0.1616 | 0.8068 | 0.7899 | 0.7983 | 0.9790 |
- | 0.0011 | 22.7273 | 19500 | 0.1644 | 0.7908 | 0.8098 | 0.8002 | 0.9788 |
- | 0.0007 | 23.3100 | 20000 | 0.1628 | 0.7870 | 0.8018 | 0.7943 | 0.9788 |
- | 0.0008 | 23.8928 | 20500 | 0.1615 | 0.7960 | 0.7995 | 0.7977 | 0.9787 |
- | 0.0006 | 24.4755 | 21000 | 0.1614 | 0.8002 | 0.7993 | 0.7998 | 0.9790 |
- | 0.0007 | 25.0583 | 21500 | 0.1612 | 0.8001 | 0.7979 | 0.7990 | 0.9791 |
- | 0.0005 | 25.6410 | 22000 | 0.1627 | 0.7940 | 0.8104 | 0.8021 | 0.9791 |
- | 0.0005 | 26.2238 | 22500 | 0.1664 | 0.7945 | 0.8062 | 0.8003 | 0.9793 |
- | 0.0005 | 26.8065 | 23000 | 0.1663 | 0.7871 | 0.8175 | 0.8020 | 0.9792 |
- | 0.0004 | 27.3893 | 23500 | 0.1691 | 0.8037 | 0.8091 | 0.8064 | 0.9796 |
- | 0.0005 | 27.9720 | 24000 | 0.1684 | 0.7979 | 0.8087 | 0.8032 | 0.9794 |
- | 0.0003 | 28.5548 | 24500 | 0.1686 | 0.7999 | 0.8064 | 0.8031 | 0.9793 |
- | 0.0004 | 29.1375 | 25000 | 0.1692 | 0.8005 | 0.8081 | 0.8043 | 0.9794 |
- | 0.0003 | 29.7203 | 25500 | 0.1692 | 0.7997 | 0.8083 | 0.8040 | 0.9794 |
+ | 0.1407 | 0.5828 | 500 | 0.0805 | 0.7146 | 0.7383 | 0.7262 | 0.9736 |
+ | 0.0685 | 1.1655 | 1000 | 0.0771 | 0.7453 | 0.7904 | 0.7672 | 0.9766 |
+ | 0.0504 | 1.7483 | 1500 | 0.0763 | 0.7569 | 0.7948 | 0.7754 | 0.9778 |
+ | 0.0384 | 2.3310 | 2000 | 0.0867 | 0.7371 | 0.7931 | 0.7641 | 0.9760 |
+ | 0.0306 | 2.9138 | 2500 | 0.0880 | 0.7501 | 0.8074 | 0.7777 | 0.9768 |
+ | 0.0223 | 3.4965 | 3000 | 0.0928 | 0.7585 | 0.8097 | 0.7833 | 0.9775 |
+ | 0.0202 | 4.0793 | 3500 | 0.0958 | 0.7641 | 0.7971 | 0.7803 | 0.9777 |
+ | 0.0151 | 4.6620 | 4000 | 0.0985 | 0.7690 | 0.8044 | 0.7863 | 0.9778 |
+ | 0.0134 | 5.2448 | 4500 | 0.1051 | 0.7857 | 0.7963 | 0.7910 | 0.9787 |
+ | 0.0112 | 5.8275 | 5000 | 0.1080 | 0.7677 | 0.8016 | 0.7843 | 0.9786 |
+ | 0.0096 | 6.4103 | 5500 | 0.1158 | 0.7698 | 0.8083 | 0.7886 | 0.9781 |
+ | 0.0092 | 6.9930 | 6000 | 0.1130 | 0.7857 | 0.8009 | 0.7932 | 0.9783 |
+ | 0.0074 | 7.5758 | 6500 | 0.1161 | 0.7749 | 0.8068 | 0.7906 | 0.9785 |
+ | 0.0067 | 8.1585 | 7000 | 0.1194 | 0.7887 | 0.7922 | 0.7905 | 0.9783 |
+ | 0.0058 | 8.7413 | 7500 | 0.1179 | 0.7796 | 0.8129 | 0.7959 | 0.9786 |
+ | 0.0053 | 9.3240 | 8000 | 0.1266 | 0.7824 | 0.8088 | 0.7954 | 0.9782 |
+ | 0.0049 | 9.9068 | 8500 | 0.1273 | 0.7858 | 0.7960 | 0.7909 | 0.9786 |
+ | 0.0043 | 10.4895 | 9000 | 0.1301 | 0.7965 | 0.7977 | 0.7971 | 0.9789 |
+ | 0.0041 | 11.0723 | 9500 | 0.1289 | 0.7992 | 0.7941 | 0.7966 | 0.9784 |
+ | 0.0035 | 11.6550 | 10000 | 0.1344 | 0.7904 | 0.8088 | 0.7995 | 0.9786 |
+ | 0.0033 | 12.2378 | 10500 | 0.1391 | 0.7889 | 0.8032 | 0.7960 | 0.9786 |
+ | 0.0032 | 12.8205 | 11000 | 0.1431 | 0.7642 | 0.8096 | 0.7862 | 0.9777 |
+ | 0.0029 | 13.4033 | 11500 | 0.1359 | 0.8006 | 0.7969 | 0.7987 | 0.9787 |
+ | 0.0028 | 13.9860 | 12000 | 0.1393 | 0.7874 | 0.8137 | 0.8003 | 0.9790 |
+ | 0.0023 | 14.5688 | 12500 | 0.1426 | 0.7907 | 0.8012 | 0.7959 | 0.9787 |
+ | 0.0022 | 15.1515 | 13000 | 0.1441 | 0.7945 | 0.8096 | 0.8020 | 0.9791 |
+ | 0.002 | 15.7343 | 13500 | 0.1498 | 0.7860 | 0.8041 | 0.7950 | 0.9782 |
+ | 0.0023 | 16.3170 | 14000 | 0.1442 | 0.7844 | 0.8090 | 0.7965 | 0.9787 |
+ | 0.0016 | 16.8998 | 14500 | 0.1534 | 0.7943 | 0.8080 | 0.8011 | 0.9789 |
+ | 0.0017 | 17.4825 | 15000 | 0.1483 | 0.7915 | 0.8019 | 0.7967 | 0.9789 |
+ | 0.0017 | 18.0653 | 15500 | 0.1521 | 0.8066 | 0.7932 | 0.7999 | 0.9791 |
+ | 0.0014 | 18.6480 | 16000 | 0.1517 | 0.7984 | 0.8022 | 0.8003 | 0.9792 |
+ | 0.0014 | 19.2308 | 16500 | 0.1549 | 0.7820 | 0.8136 | 0.7975 | 0.9789 |
+ | 0.0011 | 19.8135 | 17000 | 0.1546 | 0.7980 | 0.8035 | 0.8007 | 0.9792 |
+ | 0.0012 | 20.3963 | 17500 | 0.1601 | 0.7842 | 0.8062 | 0.7950 | 0.9785 |
+ | 0.0011 | 20.9790 | 18000 | 0.1596 | 0.7830 | 0.8012 | 0.7920 | 0.9785 |
+ | 0.0009 | 21.5618 | 18500 | 0.1616 | 0.7911 | 0.8116 | 0.8012 | 0.9788 |
+ | 0.0012 | 22.1445 | 19000 | 0.1617 | 0.7834 | 0.8087 | 0.7958 | 0.9782 |
+ | 0.0008 | 22.7273 | 19500 | 0.1621 | 0.7917 | 0.8104 | 0.8009 | 0.9792 |
+ | 0.0009 | 23.3100 | 20000 | 0.1637 | 0.7927 | 0.8010 | 0.7968 | 0.9786 |
+ | 0.0007 | 23.8928 | 20500 | 0.1637 | 0.7720 | 0.8126 | 0.7918 | 0.9784 |
+ | 0.0006 | 24.4755 | 21000 | 0.1628 | 0.8003 | 0.7995 | 0.7999 | 0.9791 |
+ | 0.0007 | 25.0583 | 21500 | 0.1635 | 0.7904 | 0.8094 | 0.7998 | 0.9789 |
+ | 0.0005 | 25.6410 | 22000 | 0.1651 | 0.7942 | 0.8121 | 0.8031 | 0.9793 |
+ | 0.0005 | 26.2238 | 22500 | 0.1652 | 0.7958 | 0.8114 | 0.8035 | 0.9792 |
+ | 0.0005 | 26.8065 | 23000 | 0.1653 | 0.7936 | 0.8087 | 0.8011 | 0.9792 |
+ | 0.0005 | 27.3893 | 23500 | 0.1661 | 0.8005 | 0.8097 | 0.8050 | 0.9794 |
+ | 0.0004 | 27.9720 | 24000 | 0.1661 | 0.7953 | 0.8100 | 0.8026 | 0.9792 |
+ | 0.0004 | 28.5548 | 24500 | 0.1668 | 0.7940 | 0.8108 | 0.8023 | 0.9793 |
+ | 0.0004 | 29.1375 | 25000 | 0.1666 | 0.7944 | 0.8081 | 0.8012 | 0.9791 |
+ | 0.0003 | 29.7203 | 25500 | 0.1667 | 0.8 | 0.8085 | 0.8042 | 0.9794 |


  ### Framework versions
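
The card above describes a `transformers` token-classification (NER) fine-tune, so a short inference sketch may be useful. This is a minimal sketch, assuming the checkpoint is published on the Hub; the repo id `haryoaw/scenario-non-kd-pre-ner-full-xlmr_data-univner_half44` is inferred from the committer and the model-index name, not stated in the diff.

```python
# Minimal inference sketch for the fine-tuned NER model.
# Assumption: the Hub repo id below is inferred, not stated in the commit.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="haryoaw/scenario-non-kd-pre-ner-full-xlmr_data-univner_half44",
    aggregation_strategy="simple",  # merge subword pieces into entity spans
)

print(ner("Barack Obama visited Jakarta last week."))
```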
eval_result_ner.json CHANGED
@@ -1 +1 @@
- {"ceb_gja": {"precision": 0.44155844155844154, "recall": 0.6938775510204082, "f1": 0.5396825396825397, "accuracy": 0.9536679536679536}, "en_pud": {"precision": 0.7821393523061825, "recall": 0.7413953488372093, "f1": 0.761222540592168, "accuracy": 0.9770494899886664}, "de_pud": {"precision": 0.7226415094339622, "recall": 0.737247353224254, "f1": 0.7298713673177702, "accuracy": 0.9713093619614646}, "pt_pud": {"precision": 0.8009478672985783, "recall": 0.7688808007279345, "f1": 0.7845868152274839, "accuracy": 0.9790233690776263}, "ru_pud": {"precision": 0.6575984990619137, "recall": 0.6766409266409267, "f1": 0.6669838249286395, "accuracy": 0.967036941358822}, "sv_pud": {"precision": 0.8170974155069582, "recall": 0.7988338192419825, "f1": 0.8078624078624078, "accuracy": 0.980866009645628}, "tl_trg": {"precision": 0.7307692307692307, "recall": 0.8260869565217391, "f1": 0.7755102040816326, "accuracy": 0.9850136239782016}, "tl_ugnayan": {"precision": 0.5, "recall": 0.6363636363636364, "f1": 0.56, "accuracy": 0.9662716499544212}, "zh_gsd": {"precision": 0.8070866141732284, "recall": 0.8018252933507171, "f1": 0.8044473512099413, "accuracy": 0.9731102231102231}, "zh_gsdsimp": {"precision": 0.7866666666666666, "recall": 0.7732634338138925, "f1": 0.7799074686054197, "accuracy": 0.970945720945721}, "hr_set": {"precision": 0.88, "recall": 0.8937990021382751, "f1": 0.8868458274398867, "accuracy": 0.9869332234130256}, "da_ddt": {"precision": 0.8029556650246306, "recall": 0.7293064876957495, "f1": 0.7643610785463072, "accuracy": 0.9820413049985034}, "en_ewt": {"precision": 0.7806324110671937, "recall": 0.7261029411764706, "f1": 0.7523809523809524, "accuracy": 0.9753356974937244}, "pt_bosque": {"precision": 0.7879078694817658, "recall": 0.6757201646090535, "f1": 0.7275143996455471, "accuracy": 0.9738443703811042}, "sr_set": {"precision": 0.8980070339976554, "recall": 0.9043683589138135, "f1": 0.9011764705882354, "accuracy": 0.9861658348655985}, "sk_snk": {"precision": 0.6898096304591266, "recall": 0.673224043715847, "f1": 0.6814159292035399, "accuracy": 0.9559516331658291}, "sv_talbanken": {"precision": 0.7918552036199095, "recall": 0.8928571428571429, "f1": 0.8393285371702638, "accuracy": 0.9969573538793738}}
+ {"ceb_gja": {"precision": 0.49333333333333335, "recall": 0.7551020408163265, "f1": 0.5967741935483871, "accuracy": 0.9606177606177606}, "en_pud": {"precision": 0.7613967022308439, "recall": 0.7302325581395349, "f1": 0.7454890788224122, "accuracy": 0.9756327918398187}, "de_pud": {"precision": 0.7201138519924098, "recall": 0.7305101058710298, "f1": 0.7252747252747253, "accuracy": 0.9702311190286438}, "pt_pud": {"precision": 0.7959183673469388, "recall": 0.7807097361237488, "f1": 0.7882406982085438, "accuracy": 0.9788952022899132}, "ru_pud": {"precision": 0.6613508442776735, "recall": 0.6805019305019305, "f1": 0.6707897240723121, "accuracy": 0.9663652802893309}, "sv_pud": {"precision": 0.8037661050545094, "recall": 0.7881438289601554, "f1": 0.7958783120706574, "accuracy": 0.9791885091214091}, "tl_trg": {"precision": 0.7142857142857143, "recall": 0.8695652173913043, "f1": 0.7843137254901961, "accuracy": 0.9850136239782016}, "tl_ugnayan": {"precision": 0.5, "recall": 0.696969696969697, "f1": 0.5822784810126582, "accuracy": 0.9653600729261622}, "zh_gsd": {"precision": 0.8063241106719368, "recall": 0.7979139504563233, "f1": 0.8020969855832241, "accuracy": 0.9727772227772228}, "zh_gsdsimp": {"precision": 0.8041095890410959, "recall": 0.7693315858453473, "f1": 0.7863362357669124, "accuracy": 0.9698634698634698}, "hr_set": {"precision": 0.8829268292682927, "recall": 0.9030648610121169, "f1": 0.8928823114869627, "accuracy": 0.9868920032976093}, "da_ddt": {"precision": 0.83, "recall": 0.7427293064876958, "f1": 0.7839433293978748, "accuracy": 0.9828394692207921}, "en_ewt": {"precision": 0.7535680304471931, "recall": 0.7279411764705882, "f1": 0.7405329593267882, "accuracy": 0.974299717097661}, "pt_bosque": {"precision": 0.7605893186003683, "recall": 0.679835390946502, "f1": 0.717948717948718, "accuracy": 0.9737356904796406}, "sr_set": {"precision": 0.8851113716295428, "recall": 0.8913813459268005, "f1": 0.888235294117647, "accuracy": 0.985290254793801}, "sk_snk": {"precision": 0.6951364175563464, "recall": 0.6404371584699453, "f1": 0.6666666666666666, "accuracy": 0.9550879396984925}, "sv_talbanken": {"precision": 0.8093023255813954, "recall": 0.8877551020408163, "f1": 0.8467153284671532, "accuracy": 0.9971045786916621}}
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b26f2974bfb9657e83be944b2972da40403ccb9aeaa9a39bfad9c676cefae08e
+ oid sha256:c60547ce6f3a649f42170ef5f09a3cd9a68f3cff0379f33de9096d092a7fd90d
  size 939737140
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b4d331eaea4cfc5faa7dabd2d16d5b830e3c081271a5e949c2b5a0d71bf21ea3
+ oid sha256:5ba27f7b81bce5724cadb8d4f2e6c6215bc3a04cefeed32131618a3e4cb9db37
  size 5304
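
Both binary files are stored as Git LFS pointers: a `version` line, a sha256 `oid`, and a `size` in bytes. A sketch for verifying a downloaded artifact against its pointer; the oid and size below are copied from the new `model.safetensors` pointer in this commit.

```python
# Sketch: verify a downloaded file against its Git LFS pointer (oid + size).
import hashlib

def lfs_matches(path: str, expected_oid: str, expected_size: int) -> bool:
    """Stream-hash the file and compare with the pointer's sha256 oid and byte size."""
    digest, size = hashlib.sha256(), 0
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
            size += len(chunk)
    return digest.hexdigest() == expected_oid and size == expected_size

# oid and size copied from the model.safetensors pointer above.
print(lfs_matches(
    "model.safetensors",
    "c60547ce6f3a649f42170ef5f09a3cd9a68f3cff0379f33de9096d092a7fd90d",
    939737140,
))
```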