+
---
|
2 |
+
license: apache-2.0
|
3 |
+
base_model: google/electra-base-discriminator
|
4 |
+
tags:
|
5 |
+
- generated_from_trainer
|
6 |
+
metrics:
|
7 |
+
- precision
|
8 |
+
- recall
|
9 |
+
- f1
|
10 |
+
- accuracy
|
11 |
+
model-index:
|
12 |
+
- name: electra-base-discriminatorfinetuned-ner-cadec
|
13 |
+
results: []
|
14 |
+
---
|
15 |
+
|
16 |
+
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
|
17 |
+
should probably proofread and complete it, then remove this comment. -->
|
18 |
+
|
19 |
+
# electra-base-discriminatorfinetuned-ner-cadec
|
20 |
+
|
21 |
+
This model is a fine-tuned version of [google/electra-base-discriminator](https://huggingface.co/google/electra-base-discriminator) on an unknown dataset.
|
22 |
+
It achieves the following results on the evaluation set:
|
23 |
+
- Loss: 0.2903
|
24 |
+
- Precision: 0.6312
|
25 |
+
- Recall: 0.6966
|
26 |
+
- F1: 0.6623
|
27 |
+
- Accuracy: 0.9274
|
28 |
+
- Adr Precision: 0.6070
|
29 |
+
- Adr Recall: 0.6972
|
30 |
+
- Adr F1: 0.6490
|
31 |
+
- Disease Precision: 0.125
|
32 |
+
- Disease Recall: 0.1579
|
33 |
+
- Disease F1: 0.1395
|
34 |
+
- Drug Precision: 0.9464
|
35 |
+
- Drug Recall: 0.9636
|
36 |
+
- Drug F1: 0.9550
|
37 |
+
- Finding Precision: 0.1961
|
38 |
+
- Finding Recall: 0.2222
|
39 |
+
- Finding F1: 0.2083
|
40 |
+
- Symptom Precision: 0.4
|
41 |
+
- Symptom Recall: 0.2222
|
42 |
+
- Symptom F1: 0.2857
|
43 |
+
- B-adr Precision: 0.7540
|
44 |
+
- B-adr Recall: 0.8119
|
45 |
+
- B-adr F1: 0.7819
|
46 |
+
- B-disease Precision: 0.1667
|
47 |
+
- B-disease Recall: 0.1579
|
48 |
+
- B-disease F1: 0.1622
|
49 |
+
- B-drug Precision: 0.9760
|
50 |
+
- B-drug Recall: 0.9879
|
51 |
+
- B-drug F1: 0.9819
|
52 |
+
- B-finding Precision: 0.275
|
53 |
+
- B-finding Recall: 0.2444
|
54 |
+
- B-finding F1: 0.2588
|
55 |
+
- B-symptom Precision: 0.5
|
56 |
+
- B-symptom Recall: 0.24
|
57 |
+
- B-symptom F1: 0.3243
|
58 |
+
- I-adr Precision: 0.6175
|
59 |
+
- I-adr Recall: 0.7020
|
60 |
+
- I-adr F1: 0.6570
|
61 |
+
- I-disease Precision: 0.15
|
62 |
+
- I-disease Recall: 0.2308
|
63 |
+
- I-disease F1: 0.1818
|
64 |
+
- I-drug Precision: 0.9521
|
65 |
+
- I-drug Recall: 0.9636
|
66 |
+
- I-drug F1: 0.9578
|
67 |
+
- I-finding Precision: 0.1622
|
68 |
+
- I-finding Recall: 0.1875
|
69 |
+
- I-finding F1: 0.1739
|
70 |
+
- I-symptom Precision: 0.5
|
71 |
+
- I-symptom Recall: 0.15
|
72 |
+
- I-symptom F1: 0.2308
|
73 |
+
- Macro Avg F1: 0.4710
|
74 |
+
- Weighted Avg F1: 0.7273
|
75 |
+
|
76 |
+
## Model description
|
77 |
+
|
78 |
+
More information needed
|
79 |
+
|
80 |
+
## Intended uses & limitations
|
81 |
+
|
82 |
+
More information needed
|
83 |
+
|
84 |
+
## Training and evaluation data
|
85 |
+
|
86 |
+
More information needed
|
87 |
+
|
88 |
+
## Training procedure
|
89 |
+
|
90 |
+
### Training hyperparameters
|
91 |
+
|
92 |
+
The following hyperparameters were used during training:
|
93 |
+
- learning_rate: 2e-05
|
94 |
+
- train_batch_size: 8
|
95 |
+
- eval_batch_size: 8
|
96 |
+
- seed: 42
|
97 |
+
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
|
98 |
+
- lr_scheduler_type: linear
|
99 |
+
- num_epochs: 10
|
100 |
+
|
101 |
+
### Training results
|
102 |
+
|
103 |
+
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | B-adr Precision | B-adr Recall | B-adr F1 | B-disease Precision | B-disease Recall | B-disease F1 | B-drug Precision | B-drug Recall | B-drug F1 | B-finding Precision | B-finding Recall | B-finding F1 | B-symptom Precision | B-symptom Recall | B-symptom F1 | I-adr Precision | I-adr Recall | I-adr F1 | I-disease Precision | I-disease Recall | I-disease F1 | I-drug Precision | I-drug Recall | I-drug F1 | I-finding Precision | I-finding Recall | I-finding F1 | I-symptom Precision | I-symptom Recall | I-symptom F1 | Macro Avg F1 | Weighted Avg F1 |
|
104 |
+
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:-------------:|:----------:|:------:|:-----------------:|:--------------:|:----------:|:--------------:|:-----------:|:-------:|:-----------------:|:--------------:|:----------:|:-----------------:|:--------------:|:----------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:------------:|:---------------:|
|
105 |
+
| No log | 1.0 | 127 | 0.2748 | 0.5316 | 0.6617 | 0.5895 | 0.9167 | 0.4583 | 0.6862 | 0.5496 | 0.0 | 0.0 | 0.0 | 0.8619 | 0.9455 | 0.9017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6301 | 0.8369 | 0.7189 | 0.0 | 0.0 | 0.0 | 0.9527 | 0.9758 | 0.9641 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4811 | 0.6755 | 0.5620 | 0.0 | 0.0 | 0.0 | 0.8674 | 0.9515 | 0.9075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3152 | 0.6433 |
|
106 |
+
| No log | 2.0 | 254 | 0.2396 | 0.5670 | 0.6604 | 0.6101 | 0.9205 | 0.5198 | 0.6752 | 0.5874 | 0.0625 | 0.1053 | 0.0784 | 0.9349 | 0.9576 | 0.9461 | 0.0417 | 0.0222 | 0.0290 | 0.0 | 0.0 | 0.0 | 0.6992 | 0.8119 | 0.7513 | 0.3077 | 0.2105 | 0.25 | 0.9759 | 0.9818 | 0.9789 | 0.3333 | 0.0444 | 0.0784 | 0.0 | 0.0 | 0.0 | 0.5524 | 0.6865 | 0.6122 | 0.0690 | 0.1538 | 0.0952 | 0.9405 | 0.9576 | 0.9489 | 0.25 | 0.1875 | 0.2143 | 0.0 | 0.0 | 0.0 | 0.3929 | 0.6881 |
|
107 |
+
| No log | 3.0 | 381 | 0.2432 | 0.6237 | 0.6767 | 0.6491 | 0.9273 | 0.5965 | 0.6972 | 0.6430 | 0.1163 | 0.2632 | 0.1613 | 0.9006 | 0.9333 | 0.9167 | 0.1667 | 0.0667 | 0.0952 | 0.0 | 0.0 | 0.0 | 0.7559 | 0.8023 | 0.7784 | 0.1786 | 0.2632 | 0.2128 | 0.9702 | 0.9879 | 0.9790 | 0.2667 | 0.0889 | 0.1333 | 0.0 | 0.0 | 0.0 | 0.6216 | 0.7108 | 0.6632 | 0.1951 | 0.6154 | 0.2963 | 0.9112 | 0.9333 | 0.9222 | 0.1429 | 0.0312 | 0.0513 | 0.0 | 0.0 | 0.0 | 0.4036 | 0.7100 |
|
108 |
+
| 0.2876 | 4.0 | 508 | 0.2490 | 0.6259 | 0.6829 | 0.6531 | 0.9254 | 0.5981 | 0.6936 | 0.6423 | 0.0833 | 0.1053 | 0.0930 | 0.9286 | 0.9455 | 0.9369 | 0.2083 | 0.2222 | 0.2151 | 0.5 | 0.0370 | 0.0690 | 0.7425 | 0.8023 | 0.7712 | 0.1905 | 0.2105 | 0.2 | 0.9760 | 0.9879 | 0.9819 | 0.3226 | 0.2222 | 0.2632 | 0.5 | 0.04 | 0.0741 | 0.6112 | 0.6976 | 0.6515 | 0.1429 | 0.1538 | 0.1481 | 0.9341 | 0.9455 | 0.9398 | 0.2368 | 0.2812 | 0.2571 | 0.0 | 0.0 | 0.0 | 0.4287 | 0.7145 |
|
109 |
+
| 0.2876 | 5.0 | 635 | 0.2609 | 0.6175 | 0.6854 | 0.6497 | 0.9255 | 0.5915 | 0.6936 | 0.6385 | 0.0851 | 0.2105 | 0.1212 | 0.9412 | 0.9697 | 0.9552 | 0.1481 | 0.0889 | 0.1111 | 0.5 | 0.1111 | 0.1818 | 0.7336 | 0.8138 | 0.7716 | 0.125 | 0.2105 | 0.1569 | 0.9760 | 0.9879 | 0.9819 | 0.2174 | 0.1111 | 0.1471 | 0.5 | 0.12 | 0.1935 | 0.6109 | 0.6932 | 0.6494 | 0.1860 | 0.6154 | 0.2857 | 0.9467 | 0.9697 | 0.9581 | 0.1 | 0.0312 | 0.0476 | 0.0 | 0.0 | 0.0 | 0.4192 | 0.7105 |
|
110 |
+
| 0.2876 | 6.0 | 762 | 0.2648 | 0.6192 | 0.6941 | 0.6545 | 0.9254 | 0.5938 | 0.7028 | 0.6437 | 0.1111 | 0.1579 | 0.1304 | 0.9290 | 0.9515 | 0.9401 | 0.2083 | 0.2222 | 0.2151 | 0.3333 | 0.1111 | 0.1667 | 0.7388 | 0.8196 | 0.7771 | 0.1579 | 0.1579 | 0.1579 | 0.9702 | 0.9879 | 0.9790 | 0.3077 | 0.2667 | 0.2857 | 0.5556 | 0.2 | 0.2941 | 0.6120 | 0.6998 | 0.6529 | 0.1304 | 0.2308 | 0.1667 | 0.9345 | 0.9515 | 0.9429 | 0.1389 | 0.1562 | 0.1471 | 0.0 | 0.0 | 0.0 | 0.4403 | 0.7187 |
|
111 |
+
| 0.2876 | 7.0 | 889 | 0.2722 | 0.6435 | 0.6941 | 0.6679 | 0.9280 | 0.6141 | 0.7009 | 0.6547 | 0.1364 | 0.1579 | 0.1463 | 0.9345 | 0.9515 | 0.9429 | 0.2326 | 0.2222 | 0.2273 | 0.4444 | 0.1481 | 0.2222 | 0.7567 | 0.8177 | 0.7860 | 0.1579 | 0.1579 | 0.1579 | 0.9760 | 0.9879 | 0.9819 | 0.2973 | 0.2444 | 0.2683 | 0.5 | 0.16 | 0.2424 | 0.6206 | 0.6932 | 0.6548 | 0.1875 | 0.2308 | 0.2069 | 0.9401 | 0.9515 | 0.9458 | 0.2059 | 0.2188 | 0.2121 | 1.0 | 0.1 | 0.1818 | 0.4638 | 0.7260 |
|
112 |
+
| 0.1075 | 8.0 | 1016 | 0.2843 | 0.6282 | 0.6941 | 0.6595 | 0.9253 | 0.5956 | 0.6917 | 0.6401 | 0.1364 | 0.1579 | 0.1463 | 0.9464 | 0.9636 | 0.9550 | 0.2245 | 0.2444 | 0.2340 | 0.4615 | 0.2222 | 0.3 | 0.7407 | 0.8061 | 0.7721 | 0.1667 | 0.1579 | 0.1622 | 0.9760 | 0.9879 | 0.9819 | 0.3158 | 0.2667 | 0.2892 | 0.5 | 0.24 | 0.3243 | 0.5950 | 0.6777 | 0.6336 | 0.1579 | 0.2308 | 0.1875 | 0.9578 | 0.9636 | 0.9607 | 0.1579 | 0.1875 | 0.1714 | 0.75 | 0.15 | 0.2500 | 0.4733 | 0.7181 |
|
113 |
+
| 0.1075 | 9.0 | 1143 | 0.2876 | 0.6353 | 0.6916 | 0.6623 | 0.9266 | 0.5968 | 0.6899 | 0.64 | 0.15 | 0.1579 | 0.1538 | 0.9464 | 0.9636 | 0.9550 | 0.2439 | 0.2222 | 0.2326 | 0.4615 | 0.2222 | 0.3 | 0.7381 | 0.8061 | 0.7706 | 0.1667 | 0.1579 | 0.1622 | 0.9760 | 0.9879 | 0.9819 | 0.3143 | 0.2444 | 0.2750 | 0.5 | 0.24 | 0.3243 | 0.6023 | 0.6821 | 0.6398 | 0.1875 | 0.2308 | 0.2069 | 0.9578 | 0.9636 | 0.9607 | 0.1875 | 0.1875 | 0.1875 | 1.0 | 0.2 | 0.3333 | 0.4842 | 0.7207 |
|
114 |
+
| 0.1075 | 10.0 | 1270 | 0.2903 | 0.6312 | 0.6966 | 0.6623 | 0.9274 | 0.6070 | 0.6972 | 0.6490 | 0.125 | 0.1579 | 0.1395 | 0.9464 | 0.9636 | 0.9550 | 0.1961 | 0.2222 | 0.2083 | 0.4 | 0.2222 | 0.2857 | 0.7540 | 0.8119 | 0.7819 | 0.1667 | 0.1579 | 0.1622 | 0.9760 | 0.9879 | 0.9819 | 0.275 | 0.2444 | 0.2588 | 0.5 | 0.24 | 0.3243 | 0.6175 | 0.7020 | 0.6570 | 0.15 | 0.2308 | 0.1818 | 0.9521 | 0.9636 | 0.9578 | 0.1622 | 0.1875 | 0.1739 | 0.5 | 0.15 | 0.2308 | 0.4710 | 0.7273 |
|
115 |
+
|
116 |
+
|
117 |
+
### Framework versions
|
118 |
+
|
119 |
+
- Transformers 4.35.2
|
120 |
+
- Pytorch 2.1.0+cu118
|
121 |
+
- Datasets 2.15.0
|
122 |
+
- Tokenizers 0.15.0
|