uttam333 committed
Commit 3cb948e (1 parent: 71b89a2)

End of training

README.md CHANGED
@@ -15,13 +15,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.1118
- - Noise: {'precision': 0.8832116788321168, 'recall': 0.8832116788321168, 'f1': 0.8832116788321168, 'number': 548}
- - Signal: {'precision': 0.8594890510948905, 'recall': 0.8594890510948905, 'f1': 0.8594890510948904, 'number': 548}
- - Overall Precision: 0.8714
- - Overall Recall: 0.8714
- - Overall F1: 0.8714
- - Overall Accuracy: 0.9773
+ - Loss: 0.3594
+ - Noise: {'precision': 0.5976627712854758, 'recall': 0.5700636942675159, 'f1': 0.5835370823145885, 'number': 628}
+ - Signal: {'precision': 0.5559265442404007, 'recall': 0.5302547770700637, 'f1': 0.5427872860635697, 'number': 628}
+ - Overall Precision: 0.5768
+ - Overall Recall: 0.5502
+ - Overall F1: 0.5632
+ - Overall Accuracy: 0.8777
 
 ## Model description
 
@@ -51,23 +51,23 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Noise | Signal | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:-----:|:------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.4739 | 1.0 | 18 | 0.1915 | {'precision': 0.6647398843930635, 'recall': 0.6295620437956204, 'f1': 0.6466729147141518, 'number': 548} | {'precision': 0.6782273603082851, 'recall': 0.6423357664233577, 'f1': 0.6597938144329897, 'number': 548} | 0.6715 | 0.6359 | 0.6532 | 0.9293 |
- | 0.188 | 2.0 | 36 | 0.1127 | {'precision': 0.8265107212475633, 'recall': 0.7737226277372263, 'f1': 0.7992459943449576, 'number': 548} | {'precision': 0.7953216374269005, 'recall': 0.7445255474452555, 'f1': 0.769085768143261, 'number': 548} | 0.8109 | 0.7591 | 0.7842 | 0.9579 |
- | 0.1052 | 3.0 | 54 | 0.0889 | {'precision': 0.8455743879472694, 'recall': 0.8193430656934306, 'f1': 0.8322520852641334, 'number': 548} | {'precision': 0.8248587570621468, 'recall': 0.7992700729927007, 'f1': 0.8118628359592215, 'number': 548} | 0.8352 | 0.8093 | 0.8221 | 0.9674 |
- | 0.0645 | 4.0 | 72 | 0.0766 | {'precision': 0.8775510204081632, 'recall': 0.8631386861313869, 'f1': 0.8702851885924563, 'number': 548} | {'precision': 0.8552875695732839, 'recall': 0.8412408759124088, 'f1': 0.8482060717571298, 'number': 548} | 0.8664 | 0.8522 | 0.8592 | 0.9750 |
- | 0.0427 | 5.0 | 90 | 0.0914 | {'precision': 0.8586956521739131, 'recall': 0.864963503649635, 'f1': 0.8618181818181818, 'number': 548} | {'precision': 0.8351449275362319, 'recall': 0.8412408759124088, 'f1': 0.8381818181818181, 'number': 548} | 0.8469 | 0.8531 | 0.8500 | 0.9730 |
- | 0.0283 | 6.0 | 108 | 0.0987 | {'precision': 0.8756855575868373, 'recall': 0.8740875912408759, 'f1': 0.8748858447488584, 'number': 548} | {'precision': 0.8555758683729433, 'recall': 0.8540145985401459, 'f1': 0.8547945205479452, 'number': 548} | 0.8656 | 0.8641 | 0.8648 | 0.9761 |
- | 0.0205 | 7.0 | 126 | 0.0988 | {'precision': 0.8646209386281588, 'recall': 0.8740875912408759, 'f1': 0.8693284936479129, 'number': 548} | {'precision': 0.8375451263537906, 'recall': 0.8467153284671532, 'f1': 0.8421052631578947, 'number': 548} | 0.8511 | 0.8604 | 0.8557 | 0.9742 |
- | 0.0141 | 8.0 | 144 | 0.1086 | {'precision': 0.8706739526411658, 'recall': 0.8722627737226277, 'f1': 0.8714676390154968, 'number': 548} | {'precision': 0.8542805100182149, 'recall': 0.8558394160583942, 'f1': 0.8550592525068369, 'number': 548} | 0.8625 | 0.8641 | 0.8633 | 0.9753 |
- | 0.012 | 9.0 | 162 | 0.1076 | {'precision': 0.8811700182815356, 'recall': 0.8795620437956204, 'f1': 0.8803652968036529, 'number': 548} | {'precision': 0.8592321755027422, 'recall': 0.8576642335766423, 'f1': 0.8584474885844748, 'number': 548} | 0.8702 | 0.8686 | 0.8694 | 0.9773 |
- | 0.0104 | 10.0 | 180 | 0.1089 | {'precision': 0.8788990825688073, 'recall': 0.8740875912408759, 'f1': 0.8764867337602928, 'number': 548} | {'precision': 0.8568807339449541, 'recall': 0.8521897810218978, 'f1': 0.8545288197621226, 'number': 548} | 0.8679 | 0.8631 | 0.8655 | 0.9764 |
- | 0.0101 | 11.0 | 198 | 0.1111 | {'precision': 0.8813868613138686, 'recall': 0.8813868613138686, 'f1': 0.8813868613138687, 'number': 548} | {'precision': 0.8594890510948905, 'recall': 0.8594890510948905, 'f1': 0.8594890510948904, 'number': 548} | 0.8704 | 0.8704 | 0.8704 | 0.9761 |
- | 0.008 | 12.0 | 216 | 0.1049 | {'precision': 0.886654478976234, 'recall': 0.885036496350365, 'f1': 0.8858447488584474, 'number': 548} | {'precision': 0.8665447897623401, 'recall': 0.864963503649635, 'f1': 0.8657534246575344, 'number': 548} | 0.8766 | 0.875 | 0.8758 | 0.9778 |
- | 0.0072 | 13.0 | 234 | 0.1094 | {'precision': 0.8775137111517367, 'recall': 0.8759124087591241, 'f1': 0.8767123287671232, 'number': 548} | {'precision': 0.8519195612431444, 'recall': 0.8503649635036497, 'f1': 0.8511415525114155, 'number': 548} | 0.8647 | 0.8631 | 0.8639 | 0.9759 |
- | 0.007 | 14.0 | 252 | 0.1117 | {'precision': 0.8777372262773723, 'recall': 0.8777372262773723, 'f1': 0.8777372262773723, 'number': 548} | {'precision': 0.8540145985401459, 'recall': 0.8540145985401459, 'f1': 0.8540145985401459, 'number': 548} | 0.8659 | 0.8659 | 0.8659 | 0.9764 |
- | 0.0084 | 15.0 | 270 | 0.1118 | {'precision': 0.8832116788321168, 'recall': 0.8832116788321168, 'f1': 0.8832116788321168, 'number': 548} | {'precision': 0.8594890510948905, 'recall': 0.8594890510948905, 'f1': 0.8594890510948904, 'number': 548} | 0.8714 | 0.8714 | 0.8714 | 0.9773 |
+ | Training Loss | Epoch | Step | Validation Loss | Noise | Signal | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:-----:|:------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 0.5172 | 1.0 | 18 | 0.4915 | {'precision': 0.3973288814691152, 'recall': 0.37898089171974525, 'f1': 0.3879380603096985, 'number': 628} | {'precision': 0.36894824707846413, 'recall': 0.3519108280254777, 'f1': 0.3602281988590057, 'number': 628} | 0.3831 | 0.3654 | 0.3741 | 0.7779 |
+ | 0.4057 | 2.0 | 36 | 0.4306 | {'precision': 0.42788461538461536, 'recall': 0.4251592356687898, 'f1': 0.426517571884984, 'number': 628} | {'precision': 0.3766025641025641, 'recall': 0.37420382165605093, 'f1': 0.3753993610223642, 'number': 628} | 0.4022 | 0.3997 | 0.4010 | 0.8151 |
+ | 0.3616 | 3.0 | 54 | 0.4145 | {'precision': 0.444633730834753, 'recall': 0.4156050955414013, 'f1': 0.4296296296296297, 'number': 628} | {'precision': 0.41056218057921634, 'recall': 0.3837579617834395, 'f1': 0.3967078189300412, 'number': 628} | 0.4276 | 0.3997 | 0.4132 | 0.8237 |
+ | 0.3278 | 4.0 | 72 | 0.3994 | {'precision': 0.5050167224080268, 'recall': 0.48089171974522293, 'f1': 0.4926590538336052, 'number': 628} | {'precision': 0.4698996655518395, 'recall': 0.44745222929936307, 'f1': 0.4584013050570963, 'number': 628} | 0.4875 | 0.4642 | 0.4755 | 0.8366 |
+ | 0.2966 | 5.0 | 90 | 0.3795 | {'precision': 0.5129533678756477, 'recall': 0.4729299363057325, 'f1': 0.49212924606462305, 'number': 628} | {'precision': 0.5043177892918825, 'recall': 0.46496815286624205, 'f1': 0.48384424192212094, 'number': 628} | 0.5086 | 0.4689 | 0.4880 | 0.8489 |
+ | 0.2717 | 6.0 | 108 | 0.3526 | {'precision': 0.5459272097053726, 'recall': 0.5015923566878981, 'f1': 0.5228215767634854, 'number': 628} | {'precision': 0.511265164644714, 'recall': 0.4697452229299363, 'f1': 0.48962655601659755, 'number': 628} | 0.5286 | 0.4857 | 0.5062 | 0.8581 |
+ | 0.2441 | 7.0 | 126 | 0.3400 | {'precision': 0.5338208409506399, 'recall': 0.46496815286624205, 'f1': 0.4970212765957447, 'number': 628} | {'precision': 0.49725776965265084, 'recall': 0.43312101910828027, 'f1': 0.4629787234042553, 'number': 628} | 0.5155 | 0.4490 | 0.4800 | 0.8598 |
+ | 0.224 | 8.0 | 144 | 0.3324 | {'precision': 0.563922942206655, 'recall': 0.5127388535031847, 'f1': 0.5371142618849041, 'number': 628} | {'precision': 0.5288966725043783, 'recall': 0.48089171974522293, 'f1': 0.5037531276063387, 'number': 628} | 0.5464 | 0.4968 | 0.5204 | 0.8673 |
+ | 0.2044 | 9.0 | 162 | 0.3249 | {'precision': 0.5833333333333334, 'recall': 0.535031847133758, 'f1': 0.558139534883721, 'number': 628} | {'precision': 0.5347222222222222, 'recall': 0.49044585987261147, 'f1': 0.5116279069767441, 'number': 628} | 0.5590 | 0.5127 | 0.5349 | 0.8726 |
+ | 0.1914 | 10.0 | 180 | 0.3481 | {'precision': 0.5597920277296361, 'recall': 0.5143312101910829, 'f1': 0.5360995850622408, 'number': 628} | {'precision': 0.511265164644714, 'recall': 0.4697452229299363, 'f1': 0.48962655601659755, 'number': 628} | 0.5355 | 0.4920 | 0.5129 | 0.8645 |
+ | 0.1823 | 11.0 | 198 | 0.3412 | {'precision': 0.5963756177924218, 'recall': 0.5764331210191083, 'f1': 0.5862348178137652, 'number': 628} | {'precision': 0.5667215815485996, 'recall': 0.5477707006369427, 'f1': 0.557085020242915, 'number': 628} | 0.5815 | 0.5621 | 0.5717 | 0.8810 |
+ | 0.1672 | 12.0 | 216 | 0.3496 | {'precision': 0.5791245791245792, 'recall': 0.5477707006369427, 'f1': 0.563011456628478, 'number': 628} | {'precision': 0.5420875420875421, 'recall': 0.5127388535031847, 'f1': 0.5270049099836335, 'number': 628} | 0.5606 | 0.5303 | 0.5450 | 0.8735 |
+ | 0.1627 | 13.0 | 234 | 0.3675 | {'precision': 0.5953565505804311, 'recall': 0.571656050955414, 'f1': 0.5832656376929325, 'number': 628} | {'precision': 0.5621890547263682, 'recall': 0.5398089171974523, 'f1': 0.5507717303005686, 'number': 628} | 0.5788 | 0.5557 | 0.5670 | 0.8779 |
+ | 0.1592 | 14.0 | 252 | 0.3562 | {'precision': 0.596694214876033, 'recall': 0.5748407643312102, 'f1': 0.5855636658556367, 'number': 628} | {'precision': 0.5504132231404959, 'recall': 0.5302547770700637, 'f1': 0.5401459854014599, 'number': 628} | 0.5736 | 0.5525 | 0.5629 | 0.8757 |
+ | 0.1553 | 15.0 | 270 | 0.3594 | {'precision': 0.5976627712854758, 'recall': 0.5700636942675159, 'f1': 0.5835370823145885, 'number': 628} | {'precision': 0.5559265442404007, 'recall': 0.5302547770700637, 'f1': 0.5427872860635697, 'number': 628} | 0.5768 | 0.5502 | 0.5632 | 0.8777 |
 
 
 ### Framework versions
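
As a quick sanity check on the updated evaluation block, the reported Overall F1 is the harmonic mean of the reported Overall Precision and Overall Recall. A minimal Python sketch, not part of the commit itself, with the values copied from the card above:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (overall F1 as reported in the card)."""
    return 2 * precision * recall / (precision + recall)

# Overall precision and recall from the updated evaluation block in this commit.
print(round(f1_score(0.5768, 0.5502), 4))  # 0.5632, matching "Overall F1"
```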
logs/events.out.tfevents.1704879167.c344c92e3e43.326.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:40fdfffab371f243723e7809874f9f65be189ff4443960588abbfd0b5828181a
- size 13649
+ oid sha256:8e35646fd1c35481ce4d27394623857c3efe852e705a7ec32b744df7683e1c50
+ size 14664
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:d7474482eabc9fd0e7422601988ea3571a4ab5aec8a026516268a73330553d22
+ oid sha256:ae8c18f7820cf9ac24702470c86b85b5f3492d2547575cf609b7e6d405b969cf
 size 450542824
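
Since this commit also replaces model.safetensors, the updated weights load through the standard transformers API. A minimal sketch, assuming this repository hosts a LayoutLM token-classification checkpoint as the card describes; the repo id below is a placeholder, not a confirmed model name:

```python
from transformers import AutoTokenizer, LayoutLMForTokenClassification

# Placeholder id: substitute the actual "user/repo" of this model repository.
model_id = "uttam333/<this-repo>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Loads the model.safetensors weights committed here.
model = LayoutLMForTokenClassification.from_pretrained(model_id)
```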