javicorvi committed on
Commit
ebffdbd
1 Parent(s): ac8081f

javicorvi/pretoxtm-ner

README.md CHANGED
@@ -14,15 +14,15 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [dmis-lab/biobert-v1.1](https://huggingface.co/dmis-lab/biobert-v1.1) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.2356
-- Study Test: {'precision': 0.8107098381070984, 'recall': 0.9054242002781642, 'f1': 0.8554533508541393, 'number': 719}
-- Manifestation: {'precision': 0.8428571428571429, 'recall': 0.8966565349544073, 'f1': 0.8689248895434463, 'number': 329}
-- Finding: {'precision': 0.7924263674614306, 'recall': 0.7907627711686495, 'f1': 0.7915936952714535, 'number': 1429}
-- Specimen: {'precision': 0.7935064935064935, 'recall': 0.8427586206896551, 'f1': 0.817391304347826, 'number': 725}
-- Dose: {'precision': 0.8894472361809045, 'recall': 0.9315789473684211, 'f1': 0.910025706940874, 'number': 570}
-- Dose Qualification: {'precision': 0.75, 'recall': 0.7368421052631579, 'f1': 0.743362831858407, 'number': 57}
-- Sex: {'precision': 0.9282296650717703, 'recall': 0.9603960396039604, 'f1': 0.9440389294403893, 'number': 202}
-- Group: {'precision': 0.6992481203007519, 'recall': 0.8303571428571429, 'f1': 0.7591836734693878, 'number': 112}
+- Loss: 0.1892
+- Study Test: {'precision': 0.7680288461538461, 'recall': 0.8887343532684284, 'f1': 0.8239845261121858, 'number': 719}
+- Manifestation: {'precision': 0.8245125348189415, 'recall': 0.8996960486322189, 'f1': 0.8604651162790697, 'number': 329}
+- Finding: {'precision': 0.7699175824175825, 'recall': 0.7844646606018194, 'f1': 0.7771230502599653, 'number': 1429}
+- Specimen: {'precision': 0.8063660477453581, 'recall': 0.8386206896551724, 'f1': 0.8221771467207573, 'number': 725}
+- Dose: {'precision': 0.8887043189368771, 'recall': 0.9385964912280702, 'f1': 0.9129692832764505, 'number': 570}
+- Dose Qualification: {'precision': 0.7419354838709677, 'recall': 0.8070175438596491, 'f1': 0.7731092436974789, 'number': 57}
+- Sex: {'precision': 0.9405940594059405, 'recall': 0.9405940594059405, 'f1': 0.9405940594059405, 'number': 202}
+- Group: {'precision': 0.647887323943662, 'recall': 0.8214285714285714, 'f1': 0.7244094488188976, 'number': 112}
 
 ## Model description
 
@@ -53,14 +53,14 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Study Test | Manifestation | Finding | Specimen | Dose | Dose Qualification | Sex | Group |
 |:-------------:|:-----:|:----:|:---------------:|:----------:|:-------------:|:-------:|:--------:|:----:|:------------------:|:---:|:-----:|
-| No log | 1.0 | 257 | 0.2315 | {'precision': 0.7803680981595092, 'recall': 0.8845618915159944, 'f1': 0.8292046936114733, 'number': 719} | {'precision': 0.8481375358166189, 'recall': 0.8996960486322189, 'f1': 0.8731563421828908, 'number': 329} | {'precision': 0.8012618296529969, 'recall': 0.7109867039888034, 'f1': 0.7534297367445311, 'number': 1429} | {'precision': 0.7451456310679612, 'recall': 0.8468965517241379, 'f1': 0.7927695287282118, 'number': 725} | {'precision': 0.9078498293515358, 'recall': 0.9333333333333333, 'f1': 0.9204152249134948, 'number': 570} | {'precision': 0.703125, 'recall': 0.7894736842105263, 'f1': 0.743801652892562, 'number': 57} | {'precision': 0.9241706161137441, 'recall': 0.9653465346534653, 'f1': 0.9443099273607748, 'number': 202} | {'precision': 0.625, 'recall': 0.8928571428571429, 'f1': 0.7352941176470589, 'number': 112} |
-| 0.0608 | 2.0 | 514 | 0.2285 | {'precision': 0.803680981595092, 'recall': 0.9109874826147427, 'f1': 0.8539765319426338, 'number': 719} | {'precision': 0.846820809248555, 'recall': 0.8905775075987842, 'f1': 0.8681481481481481, 'number': 329} | {'precision': 0.7780040733197556, 'recall': 0.8019594121763471, 'f1': 0.7898001378359754, 'number': 1429} | {'precision': 0.8052631578947368, 'recall': 0.8441379310344828, 'f1': 0.8242424242424241, 'number': 725} | {'precision': 0.9047619047619048, 'recall': 0.9333333333333333, 'f1': 0.9188255613126078, 'number': 570} | {'precision': 0.7368421052631579, 'recall': 0.7368421052631579, 'f1': 0.7368421052631579, 'number': 57} | {'precision': 0.9289099526066351, 'recall': 0.9702970297029703, 'f1': 0.9491525423728814, 'number': 202} | {'precision': 0.6783216783216783, 'recall': 0.8660714285714286, 'f1': 0.7607843137254903, 'number': 112} |
-| 0.0608 | 3.0 | 771 | 0.2356 | {'precision': 0.8107098381070984, 'recall': 0.9054242002781642, 'f1': 0.8554533508541393, 'number': 719} | {'precision': 0.8428571428571429, 'recall': 0.8966565349544073, 'f1': 0.8689248895434463, 'number': 329} | {'precision': 0.7924263674614306, 'recall': 0.7907627711686495, 'f1': 0.7915936952714535, 'number': 1429} | {'precision': 0.7935064935064935, 'recall': 0.8427586206896551, 'f1': 0.817391304347826, 'number': 725} | {'precision': 0.8894472361809045, 'recall': 0.9315789473684211, 'f1': 0.910025706940874, 'number': 570} | {'precision': 0.75, 'recall': 0.7368421052631579, 'f1': 0.743362831858407, 'number': 57} | {'precision': 0.9282296650717703, 'recall': 0.9603960396039604, 'f1': 0.9440389294403893, 'number': 202} | {'precision': 0.6992481203007519, 'recall': 0.8303571428571429, 'f1': 0.7591836734693878, 'number': 112} |
+| No log | 1.0 | 257 | 0.2128 | {'precision': 0.7071759259259259, 'recall': 0.8497913769123783, 'f1': 0.7719519898926089, 'number': 719} | {'precision': 0.7883597883597884, 'recall': 0.9057750759878419, 'f1': 0.842998585572843, 'number': 329} | {'precision': 0.7293080054274084, 'recall': 0.7522743177046886, 'f1': 0.74061315880124, 'number': 1429} | {'precision': 0.7443609022556391, 'recall': 0.8193103448275862, 'f1': 0.7800393959290873, 'number': 725} | {'precision': 0.7741433021806854, 'recall': 0.8719298245614036, 'f1': 0.8201320132013201, 'number': 570} | {'precision': 0.7272727272727273, 'recall': 0.5614035087719298, 'f1': 0.6336633663366337, 'number': 57} | {'precision': 0.9215686274509803, 'recall': 0.9306930693069307, 'f1': 0.9261083743842364, 'number': 202} | {'precision': 0.5673758865248227, 'recall': 0.7142857142857143, 'f1': 0.6324110671936759, 'number': 112} |
+| 0.2683 | 2.0 | 514 | 0.1918 | {'precision': 0.7720144752714113, 'recall': 0.8901251738525731, 'f1': 0.82687338501292, 'number': 719} | {'precision': 0.803763440860215, 'recall': 0.9088145896656535, 'f1': 0.8530670470756062, 'number': 329} | {'precision': 0.7732474964234621, 'recall': 0.7564730580825753, 'f1': 0.7647683056243367, 'number': 1429} | {'precision': 0.7907894736842105, 'recall': 0.8289655172413793, 'f1': 0.8094276094276094, 'number': 725} | {'precision': 0.8778877887788779, 'recall': 0.9333333333333333, 'f1': 0.9047619047619047, 'number': 570} | {'precision': 0.746031746031746, 'recall': 0.8245614035087719, 'f1': 0.7833333333333334, 'number': 57} | {'precision': 0.945, 'recall': 0.9356435643564357, 'f1': 0.9402985074626865, 'number': 202} | {'precision': 0.6298701298701299, 'recall': 0.8660714285714286, 'f1': 0.7293233082706767, 'number': 112} |
+| 0.2683 | 3.0 | 771 | 0.1892 | {'precision': 0.7680288461538461, 'recall': 0.8887343532684284, 'f1': 0.8239845261121858, 'number': 719} | {'precision': 0.8245125348189415, 'recall': 0.8996960486322189, 'f1': 0.8604651162790697, 'number': 329} | {'precision': 0.7699175824175825, 'recall': 0.7844646606018194, 'f1': 0.7771230502599653, 'number': 1429} | {'precision': 0.8063660477453581, 'recall': 0.8386206896551724, 'f1': 0.8221771467207573, 'number': 725} | {'precision': 0.8887043189368771, 'recall': 0.9385964912280702, 'f1': 0.9129692832764505, 'number': 570} | {'precision': 0.7419354838709677, 'recall': 0.8070175438596491, 'f1': 0.7731092436974789, 'number': 57} | {'precision': 0.9405940594059405, 'recall': 0.9405940594059405, 'f1': 0.9405940594059405, 'number': 202} | {'precision': 0.647887323943662, 'recall': 0.8214285714285714, 'f1': 0.7244094488188976, 'number': 112} |
 
 
 ### Framework versions
 
-- Transformers 4.33.3
-- Pytorch 2.0.1+cu118
-- Datasets 2.14.5
-- Tokenizers 0.13.3
+- Transformers 4.35.2
+- Pytorch 2.1.0+cu118
+- Datasets 2.15.0
+- Tokenizers 0.15.0
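The per-entity scores in the updated README follow the usual seqeval convention: F1 is the harmonic mean of precision and recall. A minimal sketch (pure Python, using the Study Test values from the updated evaluation results) that reproduces the reported F1:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Study Test entity, taken from the updated evaluation results above.
precision = 0.7680288461538461
recall = 0.8887343532684284

f1 = f1_score(precision, recall)
print(f1)  # matches the reported 0.8239845261121858
```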
config.json CHANGED
@@ -57,7 +57,7 @@
   "pad_token_id": 0,
   "position_embedding_type": "absolute",
   "torch_dtype": "float32",
-  "transformers_version": "4.33.3",
+  "transformers_version": "4.35.2",
   "type_vocab_size": 2,
   "use_cache": true,
   "vocab_size": 28996
model.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:66fddf6e6b9566751e924cea133c1eb69bfbeb4b592fdd68c4ce7a12c45c6922
+size 430954348
runs/Dec14_12-04-36_a2e3dc378379/events.out.tfevents.1702555486.a2e3dc378379.314.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a89f5185f988d6434a4e5f262739a840e2fee821d795a6f8e81d17590dd52bdf
+size 6351
tokenizer_config.json CHANGED
@@ -1,4 +1,46 @@
 {
+  "added_tokens_decoder": {
+    "0": {
+      "content": "[PAD]",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "100": {
+      "content": "[UNK]",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "101": {
+      "content": "[CLS]",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "102": {
+      "content": "[SEP]",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "103": {
+      "content": "[MASK]",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    }
+  },
   "clean_up_tokenization_spaces": true,
   "cls_token": "[CLS]",
   "do_basic_tokenize": true,
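The `added_tokens_decoder` block added to tokenizer_config.json makes the special-token ids explicit in the config (these are the standard BERT cased vocabulary ids). A minimal sketch of that id-to-token mapping, with no `transformers` dependency:

```python
# Special-token ids declared in added_tokens_decoder above.
added_tokens_decoder = {
    0: "[PAD]",
    100: "[UNK]",
    101: "[CLS]",
    102: "[SEP]",
    103: "[MASK]",
}

def decode_special(token_id: int) -> str:
    """Return the special token for an id, or '' if it is a regular token."""
    return added_tokens_decoder.get(token_id, "")

print(decode_special(101))  # [CLS]
```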
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:0107f0f369143ef3e6969cd01fdcbfc6dc3da1db43f8dac2c4b0c57b869e6932
-size 4027
+oid sha256:64aad2cb95f982a10ed814caf6eaa6ed663766f0862118816a8410cab546e991
+size 4536