haryoaw committed
Commit 926076f
1 Parent(s): bd4c91d

Initial Commit

Files changed (4)
  1. README.md +58 -58
  2. eval_result_ner.json +1 -1
  3. model.safetensors +1 -1
  4. training_args.bin +1 -1
README.md CHANGED
@@ -1,14 +1,14 @@
 ---
-base_model: FacebookAI/xlm-roberta-base
 library_name: transformers
 license: mit
+base_model: FacebookAI/xlm-roberta-base
+tags:
+- generated_from_trainer
 metrics:
 - precision
 - recall
 - f1
 - accuracy
-tags:
-- generated_from_trainer
 model-index:
 - name: scenario-non-kd-pre-ner-full-xlmr_data-univner_half55
   results: []
@@ -21,11 +21,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1686
+- Loss: 0.1716
 - Precision: 0.7969
-- Recall: 0.8096
-- F1: 0.8032
-- Accuracy: 0.9792
+- Recall: 0.7989
+- F1: 0.7979
+- Accuracy: 0.9786
 
 ## Model description
 
@@ -56,57 +56,57 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
-| 0.1436 | 0.5828 | 500 | 0.0827 | 0.7033 | 0.7654 | 0.7330 | 0.9732 |
-| 0.0677 | 1.1655 | 1000 | 0.0781 | 0.7548 | 0.7932 | 0.7735 | 0.9771 |
-| 0.053 | 1.7483 | 1500 | 0.0807 | 0.7314 | 0.8188 | 0.7726 | 0.9761 |
-| 0.0363 | 2.3310 | 2000 | 0.0846 | 0.7659 | 0.7957 | 0.7805 | 0.9781 |
-| 0.0315 | 2.9138 | 2500 | 0.0876 | 0.7600 | 0.7971 | 0.7781 | 0.9770 |
-| 0.0224 | 3.4965 | 3000 | 0.0919 | 0.7735 | 0.7930 | 0.7831 | 0.9779 |
-| 0.0194 | 4.0793 | 3500 | 0.0984 | 0.7824 | 0.7997 | 0.7910 | 0.9784 |
-| 0.0145 | 4.6620 | 4000 | 0.1060 | 0.7628 | 0.7967 | 0.7794 | 0.9769 |
-| 0.0131 | 5.2448 | 4500 | 0.1085 | 0.7876 | 0.7850 | 0.7863 | 0.9776 |
-| 0.0111 | 5.8275 | 5000 | 0.1093 | 0.7960 | 0.7917 | 0.7938 | 0.9780 |
-| 0.0092 | 6.4103 | 5500 | 0.1186 | 0.7741 | 0.7907 | 0.7823 | 0.9774 |
-| 0.0091 | 6.9930 | 6000 | 0.1192 | 0.7833 | 0.8015 | 0.7923 | 0.9783 |
-| 0.0064 | 7.5758 | 6500 | 0.1210 | 0.7845 | 0.7937 | 0.7891 | 0.9778 |
-| 0.0067 | 8.1585 | 7000 | 0.1272 | 0.7857 | 0.7901 | 0.7879 | 0.9781 |
-| 0.0063 | 8.7413 | 7500 | 0.1212 | 0.7838 | 0.7990 | 0.7913 | 0.9784 |
-| 0.0056 | 9.3240 | 8000 | 0.1289 | 0.7856 | 0.7971 | 0.7913 | 0.9779 |
-| 0.0056 | 9.9068 | 8500 | 0.1292 | 0.7902 | 0.7960 | 0.7931 | 0.9788 |
-| 0.0042 | 10.4895 | 9000 | 0.1331 | 0.7940 | 0.7892 | 0.7916 | 0.9784 |
-| 0.004 | 11.0723 | 9500 | 0.1403 | 0.7948 | 0.7904 | 0.7926 | 0.9785 |
-| 0.0037 | 11.6550 | 10000 | 0.1354 | 0.7863 | 0.7911 | 0.7887 | 0.9783 |
-| 0.0035 | 12.2378 | 10500 | 0.1372 | 0.7776 | 0.8093 | 0.7931 | 0.9781 |
-| 0.0032 | 12.8205 | 11000 | 0.1394 | 0.7859 | 0.7935 | 0.7897 | 0.9783 |
-| 0.0025 | 13.4033 | 11500 | 0.1389 | 0.7809 | 0.8032 | 0.7919 | 0.9782 |
-| 0.0028 | 13.9860 | 12000 | 0.1407 | 0.7812 | 0.8009 | 0.7909 | 0.9780 |
-| 0.0023 | 14.5688 | 12500 | 0.1431 | 0.7849 | 0.8018 | 0.7932 | 0.9783 |
-| 0.0022 | 15.1515 | 13000 | 0.1519 | 0.7779 | 0.8016 | 0.7896 | 0.9779 |
-| 0.0022 | 15.7343 | 13500 | 0.1466 | 0.79 | 0.7979 | 0.7939 | 0.9785 |
-| 0.002 | 16.3170 | 14000 | 0.1511 | 0.7781 | 0.8137 | 0.7955 | 0.9780 |
-| 0.0017 | 16.8998 | 14500 | 0.1502 | 0.7869 | 0.8033 | 0.7950 | 0.9782 |
-| 0.0014 | 17.4825 | 15000 | 0.1559 | 0.7862 | 0.8038 | 0.7949 | 0.9784 |
-| 0.0015 | 18.0653 | 15500 | 0.1628 | 0.7790 | 0.8084 | 0.7934 | 0.9785 |
-| 0.0018 | 18.6480 | 16000 | 0.1563 | 0.7881 | 0.8042 | 0.7961 | 0.9782 |
-| 0.0016 | 19.2308 | 16500 | 0.1568 | 0.7883 | 0.7983 | 0.7933 | 0.9780 |
-| 0.0011 | 19.8135 | 17000 | 0.1671 | 0.7858 | 0.8062 | 0.7959 | 0.9783 |
-| 0.0012 | 20.3963 | 17500 | 0.1577 | 0.7965 | 0.8045 | 0.8005 | 0.9789 |
-| 0.0011 | 20.9790 | 18000 | 0.1602 | 0.7868 | 0.8080 | 0.7973 | 0.9786 |
-| 0.001 | 21.5618 | 18500 | 0.1626 | 0.8023 | 0.7941 | 0.7982 | 0.9785 |
-| 0.0008 | 22.1445 | 19000 | 0.1641 | 0.7911 | 0.8009 | 0.7960 | 0.9782 |
-| 0.0008 | 22.7273 | 19500 | 0.1639 | 0.7944 | 0.8077 | 0.8010 | 0.9784 |
-| 0.0008 | 23.3100 | 20000 | 0.1643 | 0.7825 | 0.8080 | 0.7950 | 0.9780 |
-| 0.0007 | 23.8928 | 20500 | 0.1636 | 0.7908 | 0.8081 | 0.7993 | 0.9785 |
-| 0.0006 | 24.4755 | 21000 | 0.1675 | 0.7921 | 0.8094 | 0.8007 | 0.9790 |
-| 0.0005 | 25.0583 | 21500 | 0.1654 | 0.7942 | 0.8085 | 0.8013 | 0.9790 |
-| 0.0008 | 25.6410 | 22000 | 0.1653 | 0.7984 | 0.7989 | 0.7986 | 0.9790 |
-| 0.0005 | 26.2238 | 22500 | 0.1663 | 0.7979 | 0.8067 | 0.8023 | 0.9791 |
-| 0.0005 | 26.8065 | 23000 | 0.1652 | 0.7966 | 0.8083 | 0.8024 | 0.9789 |
-| 0.0004 | 27.3893 | 23500 | 0.1664 | 0.7940 | 0.8064 | 0.8001 | 0.9790 |
-| 0.0004 | 27.9720 | 24000 | 0.1689 | 0.7975 | 0.8041 | 0.8008 | 0.9789 |
-| 0.0003 | 28.5548 | 24500 | 0.1713 | 0.7893 | 0.8117 | 0.8003 | 0.9787 |
-| 0.0004 | 29.1375 | 25000 | 0.1693 | 0.7934 | 0.8096 | 0.8014 | 0.9791 |
-| 0.0004 | 29.7203 | 25500 | 0.1686 | 0.7969 | 0.8096 | 0.8032 | 0.9792 |
+| 0.144 | 0.5828 | 500 | 0.0813 | 0.6995 | 0.7625 | 0.7297 | 0.9732 |
+| 0.0678 | 1.1655 | 1000 | 0.0775 | 0.7546 | 0.7816 | 0.7678 | 0.9772 |
+| 0.0531 | 1.7483 | 1500 | 0.0810 | 0.7244 | 0.8137 | 0.7665 | 0.9760 |
+| 0.0368 | 2.3310 | 2000 | 0.0831 | 0.7742 | 0.7919 | 0.7830 | 0.9782 |
+| 0.0311 | 2.9138 | 2500 | 0.0871 | 0.7695 | 0.7917 | 0.7804 | 0.9773 |
+| 0.0222 | 3.4965 | 3000 | 0.0916 | 0.7856 | 0.7860 | 0.7858 | 0.9785 |
+| 0.0196 | 4.0793 | 3500 | 0.0982 | 0.7778 | 0.7990 | 0.7883 | 0.9781 |
+| 0.0145 | 4.6620 | 4000 | 0.1004 | 0.7769 | 0.7927 | 0.7847 | 0.9778 |
+| 0.0131 | 5.2448 | 4500 | 0.1058 | 0.7719 | 0.7878 | 0.7798 | 0.9769 |
+| 0.0108 | 5.8275 | 5000 | 0.1116 | 0.7830 | 0.7919 | 0.7875 | 0.9775 |
+| 0.0086 | 6.4103 | 5500 | 0.1137 | 0.7743 | 0.8018 | 0.7878 | 0.9778 |
+| 0.009 | 6.9930 | 6000 | 0.1180 | 0.7739 | 0.8078 | 0.7905 | 0.9778 |
+| 0.0066 | 7.5758 | 6500 | 0.1189 | 0.7761 | 0.8090 | 0.7922 | 0.9782 |
+| 0.007 | 8.1585 | 7000 | 0.1281 | 0.7813 | 0.7869 | 0.7841 | 0.9777 |
+| 0.0059 | 8.7413 | 7500 | 0.1222 | 0.7781 | 0.8070 | 0.7923 | 0.9785 |
+| 0.0057 | 9.3240 | 8000 | 0.1298 | 0.7694 | 0.8124 | 0.7903 | 0.9781 |
+| 0.0053 | 9.9068 | 8500 | 0.1260 | 0.7919 | 0.7951 | 0.7935 | 0.9787 |
+| 0.0043 | 10.4895 | 9000 | 0.1356 | 0.7719 | 0.8062 | 0.7887 | 0.9778 |
+| 0.0042 | 11.0723 | 9500 | 0.1309 | 0.7850 | 0.7982 | 0.7915 | 0.9786 |
+| 0.0042 | 11.6550 | 10000 | 0.1356 | 0.7789 | 0.7922 | 0.7855 | 0.9779 |
+| 0.0034 | 12.2378 | 10500 | 0.1367 | 0.7781 | 0.8013 | 0.7895 | 0.9782 |
+| 0.0032 | 12.8205 | 11000 | 0.1409 | 0.7732 | 0.8123 | 0.7923 | 0.9781 |
+| 0.0022 | 13.4033 | 11500 | 0.1498 | 0.7707 | 0.8068 | 0.7883 | 0.9778 |
+| 0.0031 | 13.9860 | 12000 | 0.1454 | 0.7704 | 0.8133 | 0.7913 | 0.9781 |
+| 0.0022 | 14.5688 | 12500 | 0.1436 | 0.7922 | 0.7934 | 0.7928 | 0.9784 |
+| 0.0024 | 15.1515 | 13000 | 0.1461 | 0.7734 | 0.8077 | 0.7902 | 0.9778 |
+| 0.0023 | 15.7343 | 13500 | 0.1465 | 0.7918 | 0.7996 | 0.7957 | 0.9786 |
+| 0.0017 | 16.3170 | 14000 | 0.1506 | 0.7838 | 0.8022 | 0.7929 | 0.9783 |
+| 0.0018 | 16.8998 | 14500 | 0.1466 | 0.7953 | 0.7973 | 0.7963 | 0.9787 |
+| 0.0018 | 17.4825 | 15000 | 0.1502 | 0.7941 | 0.8012 | 0.7976 | 0.9789 |
+| 0.0019 | 18.0653 | 15500 | 0.1515 | 0.7871 | 0.8052 | 0.7960 | 0.9786 |
+| 0.0018 | 18.6480 | 16000 | 0.1501 | 0.8062 | 0.7780 | 0.7918 | 0.9782 |
+| 0.0016 | 19.2308 | 16500 | 0.1547 | 0.7887 | 0.7963 | 0.7924 | 0.9780 |
+| 0.001 | 19.8135 | 17000 | 0.1650 | 0.7819 | 0.8070 | 0.7942 | 0.9778 |
+| 0.0009 | 20.3963 | 17500 | 0.1612 | 0.7971 | 0.7833 | 0.7901 | 0.9780 |
+| 0.0012 | 20.9790 | 18000 | 0.1569 | 0.7903 | 0.8023 | 0.7962 | 0.9785 |
+| 0.0008 | 21.5618 | 18500 | 0.1640 | 0.7787 | 0.8081 | 0.7931 | 0.9779 |
+| 0.0009 | 22.1445 | 19000 | 0.1640 | 0.7950 | 0.7924 | 0.7937 | 0.9781 |
+| 0.0008 | 22.7273 | 19500 | 0.1650 | 0.7982 | 0.8023 | 0.8003 | 0.9789 |
+| 0.0007 | 23.3100 | 20000 | 0.1635 | 0.7962 | 0.8022 | 0.7992 | 0.9787 |
+| 0.0008 | 23.8928 | 20500 | 0.1678 | 0.7852 | 0.8005 | 0.7927 | 0.9784 |
+| 0.0006 | 24.4755 | 21000 | 0.1686 | 0.7970 | 0.8025 | 0.7997 | 0.9788 |
+| 0.0006 | 25.0583 | 21500 | 0.1686 | 0.7963 | 0.7970 | 0.7967 | 0.9785 |
+| 0.0006 | 25.6410 | 22000 | 0.1706 | 0.7941 | 0.7948 | 0.7945 | 0.9784 |
+| 0.0005 | 26.2238 | 22500 | 0.1681 | 0.7935 | 0.7963 | 0.7949 | 0.9785 |
+| 0.0005 | 26.8065 | 23000 | 0.1688 | 0.8008 | 0.7938 | 0.7973 | 0.9788 |
+| 0.0004 | 27.3893 | 23500 | 0.1700 | 0.7898 | 0.7996 | 0.7947 | 0.9784 |
+| 0.0005 | 27.9720 | 24000 | 0.1708 | 0.7914 | 0.8096 | 0.8004 | 0.9786 |
+| 0.0003 | 28.5548 | 24500 | 0.1713 | 0.7965 | 0.7979 | 0.7972 | 0.9785 |
+| 0.0004 | 29.1375 | 25000 | 0.1710 | 0.7951 | 0.8010 | 0.7980 | 0.9786 |
+| 0.0005 | 29.7203 | 25500 | 0.1716 | 0.7969 | 0.7989 | 0.7979 | 0.9786 |
 
 
 ### Framework versions
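
The card above only reports metrics, so as a quick sanity check it helps to actually run the checkpoint. A minimal sketch using the `transformers` token-classification pipeline; the repo id is inferred from this commit's author and model name and is an assumption, and pinning `revision="926076f"` selects exactly the commit shown here:

```python
# Minimal sketch: load the fine-tuned XLM-R NER checkpoint and tag one sentence.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="haryoaw/scenario-non-kd-pre-ner-full-xlmr_data-univner_half55",  # assumed repo id
    revision="926076f",             # the commit shown above
    aggregation_strategy="simple",  # merge subword pieces into entity spans
)

print(ner("Barack Obama visited Jakarta in 2010."))
```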
eval_result_ner.json CHANGED
@@ -1 +1 @@
-{"ceb_gja": {"precision": 0.44047619047619047, "recall": 0.7551020408163265, "f1": 0.556390977443609, "accuracy": 0.9513513513513514}, "en_pud": {"precision": 0.7838899803536346, "recall": 0.7423255813953489, "f1": 0.7625418060200668, "accuracy": 0.9771439365319229}, "de_pud": {"precision": 0.7178265014299333, "recall": 0.7247353224254091, "f1": 0.7212643678160919, "accuracy": 0.9702779991561578}, "pt_pud": {"precision": 0.7844990548204159, "recall": 0.7552320291173794, "f1": 0.7695873898933705, "accuracy": 0.9776989789379246}, "ru_pud": {"precision": 0.667621776504298, "recall": 0.6747104247104247, "f1": 0.671147383581373, "accuracy": 0.9667786101782485}, "sv_pud": {"precision": 0.814629258517034, "recall": 0.7900874635568513, "f1": 0.8021706956092748, "accuracy": 0.9790836653386454}, "tl_trg": {"precision": 0.7586206896551724, "recall": 0.9565217391304348, "f1": 0.8461538461538461, "accuracy": 0.9877384196185286}, "tl_ugnayan": {"precision": 0.45, "recall": 0.5454545454545454, "f1": 0.4931506849315069, "accuracy": 0.9626253418413856}, "zh_gsd": {"precision": 0.803921568627451, "recall": 0.8018252933507171, "f1": 0.8028720626631853, "accuracy": 0.9721944721944722}, "zh_gsdsimp": {"precision": 0.8267929634641408, "recall": 0.8007863695937091, "f1": 0.8135818908122504, "accuracy": 0.9734432234432234}, "hr_set": {"precision": 0.8757807078417765, "recall": 0.8995010691375623, "f1": 0.8874824191279888, "accuracy": 0.9869744435284419}, "da_ddt": {"precision": 0.8220551378446115, "recall": 0.7337807606263982, "f1": 0.7754137115839244, "accuracy": 0.9830390102763643}, "en_ewt": {"precision": 0.7927565392354124, "recall": 0.7242647058823529, "f1": 0.7569644572526417, "accuracy": 0.975375542893573}, "pt_bosque": {"precision": 0.7658986175115208, "recall": 0.6839506172839506, "f1": 0.7226086956521738, "accuracy": 0.9735183306767136}, "sr_set": {"precision": 0.8946745562130177, "recall": 0.8925619834710744, "f1": 0.8936170212765957, "accuracy": 0.9852026967866211}, "sk_snk": {"precision": 0.6922183507549361, "recall": 0.6513661202185792, "f1": 0.6711711711711712, "accuracy": 0.9561086683417085}, "sv_talbanken": {"precision": 0.7688888888888888, "recall": 0.8826530612244898, "f1": 0.821852731591449, "accuracy": 0.9966629042547971}}
+{"ceb_gja": {"precision": 0.4457831325301205, "recall": 0.7551020408163265, "f1": 0.5606060606060607, "accuracy": 0.9544401544401544}, "en_pud": {"precision": 0.7670454545454546, "recall": 0.7534883720930232, "f1": 0.7602064758329422, "accuracy": 0.9765300340007556}, "de_pud": {"precision": 0.7195004803073968, "recall": 0.7208854667949952, "f1": 0.7201923076923077, "accuracy": 0.9703717594111856}, "pt_pud": {"precision": 0.7764489420423183, "recall": 0.7679708826205641, "f1": 0.7721866422689845, "accuracy": 0.9781262015636348}, "ru_pud": {"precision": 0.6713352007469654, "recall": 0.694015444015444, "f1": 0.6824869482676791, "accuracy": 0.9678636011366571}, "sv_pud": {"precision": 0.8224206349206349, "recall": 0.8056365403304179, "f1": 0.8139420716740304, "accuracy": 0.9814950723422101}, "tl_trg": {"precision": 0.7407407407407407, "recall": 0.8695652173913043, "f1": 0.7999999999999999, "accuracy": 0.9877384196185286}, "tl_ugnayan": {"precision": 0.4473684210526316, "recall": 0.5151515151515151, "f1": 0.4788732394366197, "accuracy": 0.9617137648131268}, "zh_gsd": {"precision": 0.7835443037974683, "recall": 0.8070404172099087, "f1": 0.7951188182402055, "accuracy": 0.9708624708624709}, "zh_gsdsimp": {"precision": 0.8065789473684211, "recall": 0.8034076015727392, "f1": 0.8049901510177282, "accuracy": 0.9725274725274725}, "hr_set": {"precision": 0.8631074606433949, "recall": 0.8987883107626514, "f1": 0.880586592178771, "accuracy": 0.9866859027205276}, "da_ddt": {"precision": 0.8014354066985646, "recall": 0.7494407158836689, "f1": 0.7745664739884394, "accuracy": 0.9833383218597226}, "en_ewt": {"precision": 0.7836538461538461, "recall": 0.7490808823529411, "f1": 0.7659774436090225, "accuracy": 0.9764912140893334}, "pt_bosque": {"precision": 0.7544964028776978, "recall": 0.6905349794238683, "f1": 0.721100128921358, "accuracy": 0.9736632372119982}, "sr_set": {"precision": 0.8968347010550997, "recall": 0.9031877213695395, "f1": 0.9, "accuracy": 0.9866036249014972}, "sk_snk": {"precision": 0.6903954802259887, "recall": 0.66775956284153, "f1": 0.6788888888888889, "accuracy": 0.9563442211055276}, "sv_talbanken": {"precision": 0.8119266055045872, "recall": 0.9030612244897959, "f1": 0.8550724637681159, "accuracy": 0.9970064288168032}}
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:5eacafd0977637d555ed4a1d47c52a885e464620b95f874e0d9069194d5910ce
+oid sha256:c936f9ac2b5911cecf47176e40493c2d4ade67799592310a42acc4297246738e
 size 939737140
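
model.safetensors is tracked with Git LFS, so the diff above only touches the pointer file: `oid sha256:...` is the SHA-256 digest of the real weight file and `size` its byte count (unchanged here, since both revisions are checkpoints of the same architecture). After downloading the resolved weights you can check them against the pointer; a sketch:

```python
# Sketch: verify a downloaded LFS object against the sha256 oid in its pointer.
import hashlib

EXPECTED_OID = "c936f9ac2b5911cecf47176e40493c2d4ade67799592310a42acc4297246738e"

sha = hashlib.sha256()
with open("model.safetensors", "rb") as f:            # the resolved file, not the pointer
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)

assert sha.hexdigest() == EXPECTED_OID, "checksum mismatch"
print("model.safetensors matches the LFS pointer")
```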
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:2507bccd9ff043376f3ef0718d7aba7e9464cbf1412543e1a1e71df107aaee48
+oid sha256:4a8c2363b4e974d4b66f9fb4ac56f2780ddb3ddceb45b3f987bcbcd95e4cf554
 size 5304
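
training_args.bin is the `TrainingArguments` object the HF `Trainer` serializes with `torch.save`; its hash changes whenever any training setting changes, which is why it is touched here even though the size is identical. To see what actually differs between two revisions you can deserialize it; a sketch (it is a pickled Python object, so `weights_only=False` is needed on recent PyTorch, `transformers` must be installed for unpickling, and you should only load it from a source you trust):

```python
# Sketch: inspect the serialized TrainingArguments behind training_args.bin.
import torch

args = torch.load("training_args.bin", weights_only=False)  # pickled object; trusted source only
print(args.learning_rate, args.num_train_epochs, args.per_device_train_batch_size)
```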