avurity committed
Commit
39ae538
1 Parent(s): e8301e6

End of training

Files changed (2)
  1. README.md +50 -50
  2. pytorch_model.bin +1 -1
README.md CHANGED
@@ -25,16 +25,16 @@ model-index:
  metrics:
  - name: Precision
    type: precision
-   value: 0.8791872597473915
+   value: 0.8738394320043692
  - name: Recall
    type: recall
-   value: 0.8814865794907089
+   value: 0.88093599449415
  - name: F1
    type: f1
-   value: 0.8803354182418035
+   value: 0.8773733634930428
  - name: Accuracy
    type: accuracy
-   value: 0.9270261366132221
+   value: 0.9245552383044147
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -44,11 +44,11 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the wildreceipt dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.3081
- - Precision: 0.8792
- - Recall: 0.8815
- - F1: 0.8803
- - Accuracy: 0.9270
+ - Loss: 0.3068
+ - Precision: 0.8738
+ - Recall: 0.8809
+ - F1: 0.8774
+ - Accuracy: 0.9246
 
  ## Model description
 
@@ -79,51 +79,51 @@ The following hyperparameters were used during training:
 
  | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
  |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | No log | 0.32 | 100 | 1.3430 | 0.5041 | 0.1959 | 0.2821 | 0.6414 |
- | No log | 0.63 | 200 | 0.8931 | 0.6739 | 0.5367 | 0.5975 | 0.7786 |
- | No log | 0.95 | 300 | 0.6793 | 0.7332 | 0.6410 | 0.6840 | 0.8273 |
- | No log | 1.26 | 400 | 0.5804 | 0.7659 | 0.7090 | 0.7364 | 0.8507 |
- | 1.0357 | 1.58 | 500 | 0.4876 | 0.7919 | 0.7551 | 0.7731 | 0.8723 |
- | 1.0357 | 1.89 | 600 | 0.4417 | 0.8009 | 0.7997 | 0.8003 | 0.8857 |
- | 1.0357 | 2.21 | 700 | 0.3937 | 0.8256 | 0.8200 | 0.8228 | 0.8973 |
- | 1.0357 | 2.52 | 800 | 0.3904 | 0.8143 | 0.8321 | 0.8231 | 0.8958 |
- | 1.0357 | 2.84 | 900 | 0.3638 | 0.8462 | 0.8211 | 0.8334 | 0.9010 |
- | 0.3989 | 3.15 | 1000 | 0.3586 | 0.8386 | 0.8447 | 0.8417 | 0.9055 |
- | 0.3989 | 3.47 | 1100 | 0.3227 | 0.8382 | 0.8564 | 0.8472 | 0.9104 |
- | 0.3989 | 3.79 | 1200 | 0.3120 | 0.8538 | 0.8522 | 0.8530 | 0.9119 |
- | 0.3989 | 4.1 | 1300 | 0.3283 | 0.8498 | 0.8559 | 0.8528 | 0.9117 |
- | 0.3989 | 4.42 | 1400 | 0.3084 | 0.8595 | 0.8606 | 0.8600 | 0.9165 |
- | 0.2727 | 4.73 | 1500 | 0.3026 | 0.8552 | 0.8666 | 0.8609 | 0.9159 |
- | 0.2727 | 5.05 | 1600 | 0.3052 | 0.8633 | 0.8537 | 0.8585 | 0.9165 |
- | 0.2727 | 5.36 | 1700 | 0.3052 | 0.8505 | 0.8747 | 0.8625 | 0.9165 |
- | 0.2727 | 5.68 | 1800 | 0.3040 | 0.8579 | 0.8690 | 0.8634 | 0.9164 |
- | 0.2727 | 5.99 | 1900 | 0.2926 | 0.8717 | 0.8696 | 0.8707 | 0.9205 |
- | 0.2059 | 6.31 | 2000 | 0.3004 | 0.8646 | 0.8753 | 0.8699 | 0.9207 |
- | 0.2059 | 6.62 | 2100 | 0.2973 | 0.8711 | 0.8742 | 0.8726 | 0.9215 |
- | 0.2059 | 6.94 | 2200 | 0.3010 | 0.8650 | 0.8761 | 0.8705 | 0.9214 |
- | 0.2059 | 7.26 | 2300 | 0.3028 | 0.8654 | 0.8760 | 0.8706 | 0.9214 |
- | 0.2059 | 7.57 | 2400 | 0.2956 | 0.8769 | 0.8769 | 0.8769 | 0.9260 |
- | 0.1617 | 7.89 | 2500 | 0.2871 | 0.8746 | 0.8778 | 0.8762 | 0.9266 |
- | 0.1617 | 8.2 | 2600 | 0.3092 | 0.8632 | 0.8797 | 0.8714 | 0.9226 |
- | 0.1617 | 8.52 | 2700 | 0.3042 | 0.8834 | 0.8738 | 0.8786 | 0.9265 |
- | 0.1617 | 8.83 | 2800 | 0.3092 | 0.8672 | 0.8793 | 0.8732 | 0.9224 |
- | 0.1617 | 9.15 | 2900 | 0.3014 | 0.8738 | 0.8841 | 0.8789 | 0.9256 |
- | 0.1359 | 9.46 | 3000 | 0.3038 | 0.8763 | 0.8760 | 0.8762 | 0.9249 |
- | 0.1359 | 9.78 | 3100 | 0.3087 | 0.8730 | 0.8797 | 0.8763 | 0.9241 |
- | 0.1359 | 10.09 | 3200 | 0.3021 | 0.8740 | 0.8812 | 0.8776 | 0.9251 |
- | 0.1359 | 10.41 | 3300 | 0.2975 | 0.8790 | 0.8836 | 0.8812 | 0.9268 |
- | 0.1359 | 10.73 | 3400 | 0.3121 | 0.8734 | 0.8809 | 0.8771 | 0.9254 |
- | 0.1192 | 11.04 | 3500 | 0.3111 | 0.8812 | 0.8794 | 0.8803 | 0.9260 |
- | 0.1192 | 11.36 | 3600 | 0.3101 | 0.8785 | 0.8790 | 0.8788 | 0.9261 |
- | 0.1192 | 11.67 | 3700 | 0.3082 | 0.8790 | 0.8829 | 0.8809 | 0.9275 |
- | 0.1192 | 11.99 | 3800 | 0.3081 | 0.8822 | 0.8830 | 0.8826 | 0.9276 |
- | 0.1192 | 12.3 | 3900 | 0.3100 | 0.8800 | 0.8809 | 0.8805 | 0.9269 |
- | 0.1065 | 12.62 | 4000 | 0.3081 | 0.8792 | 0.8815 | 0.8803 | 0.9270 |
+ | No log | 0.32 | 100 | 1.3498 | 0.6130 | 0.3126 | 0.4140 | 0.6742 |
+ | No log | 0.63 | 200 | 0.8939 | 0.6665 | 0.5317 | 0.5915 | 0.7815 |
+ | No log | 0.95 | 300 | 0.7159 | 0.7311 | 0.6425 | 0.6840 | 0.8161 |
+ | No log | 1.26 | 400 | 0.5901 | 0.7554 | 0.6690 | 0.7095 | 0.8405 |
+ | 1.0677 | 1.58 | 500 | 0.5263 | 0.7632 | 0.7232 | 0.7427 | 0.8578 |
+ | 1.0677 | 1.89 | 600 | 0.4759 | 0.7871 | 0.7777 | 0.7824 | 0.8774 |
+ | 1.0677 | 2.21 | 700 | 0.4299 | 0.8054 | 0.8070 | 0.8062 | 0.8890 |
+ | 1.0677 | 2.52 | 800 | 0.4165 | 0.8064 | 0.8311 | 0.8185 | 0.8937 |
+ | 1.0677 | 2.84 | 900 | 0.3845 | 0.8344 | 0.8300 | 0.8322 | 0.9005 |
+ | 0.4267 | 3.15 | 1000 | 0.3540 | 0.8433 | 0.8318 | 0.8375 | 0.9056 |
+ | 0.4267 | 3.47 | 1100 | 0.3429 | 0.8362 | 0.8540 | 0.8450 | 0.9086 |
+ | 0.4267 | 3.79 | 1200 | 0.3274 | 0.8451 | 0.8545 | 0.8498 | 0.9105 |
+ | 0.4267 | 4.1 | 1300 | 0.3433 | 0.8397 | 0.8535 | 0.8466 | 0.9092 |
+ | 0.4267 | 4.42 | 1400 | 0.3181 | 0.8514 | 0.8604 | 0.8559 | 0.9154 |
+ | 0.2869 | 4.73 | 1500 | 0.3191 | 0.8472 | 0.8637 | 0.8554 | 0.9129 |
+ | 0.2869 | 5.05 | 1600 | 0.3128 | 0.8613 | 0.8658 | 0.8635 | 0.9182 |
+ | 0.2869 | 5.36 | 1700 | 0.3121 | 0.8622 | 0.8695 | 0.8658 | 0.9182 |
+ | 0.2869 | 5.68 | 1800 | 0.3230 | 0.8473 | 0.8661 | 0.8566 | 0.9140 |
+ | 0.2869 | 5.99 | 1900 | 0.2986 | 0.8729 | 0.8633 | 0.8681 | 0.9209 |
+ | 0.2134 | 6.31 | 2000 | 0.3032 | 0.8555 | 0.8694 | 0.8624 | 0.9169 |
+ | 0.2134 | 6.62 | 2100 | 0.3056 | 0.8705 | 0.8710 | 0.8708 | 0.9220 |
+ | 0.2134 | 6.94 | 2200 | 0.3122 | 0.8630 | 0.8790 | 0.8709 | 0.9217 |
+ | 0.2134 | 7.26 | 2300 | 0.3047 | 0.8692 | 0.8778 | 0.8734 | 0.9215 |
+ | 0.2134 | 7.57 | 2400 | 0.3103 | 0.8701 | 0.8780 | 0.8741 | 0.9225 |
+ | 0.1661 | 7.89 | 2500 | 0.3080 | 0.8712 | 0.8787 | 0.8749 | 0.9226 |
+ | 0.1661 | 8.2 | 2600 | 0.3011 | 0.8653 | 0.8834 | 0.8743 | 0.9236 |
+ | 0.1661 | 8.52 | 2700 | 0.3034 | 0.8735 | 0.8798 | 0.8766 | 0.9247 |
+ | 0.1661 | 8.83 | 2800 | 0.3054 | 0.8698 | 0.8793 | 0.8745 | 0.9238 |
+ | 0.1661 | 9.15 | 2900 | 0.3105 | 0.8697 | 0.8812 | 0.8754 | 0.9237 |
+ | 0.1415 | 9.46 | 3000 | 0.3068 | 0.8738 | 0.8809 | 0.8774 | 0.9246 |
+ | 0.1415 | 9.78 | 3100 | 0.3086 | 0.8730 | 0.8793 | 0.8761 | 0.9229 |
+ | 0.1415 | 10.09 | 3200 | 0.3013 | 0.8755 | 0.8830 | 0.8792 | 0.9256 |
+ | 0.1415 | 10.41 | 3300 | 0.3107 | 0.8692 | 0.8815 | 0.8753 | 0.9241 |
+ | 0.1415 | 10.73 | 3400 | 0.3073 | 0.8759 | 0.8794 | 0.8777 | 0.9261 |
+ | 0.1239 | 11.04 | 3500 | 0.3109 | 0.8727 | 0.8819 | 0.8773 | 0.9253 |
+ | 0.1239 | 11.36 | 3600 | 0.3124 | 0.8723 | 0.8790 | 0.8756 | 0.9243 |
+ | 0.1239 | 11.67 | 3700 | 0.3171 | 0.8724 | 0.8805 | 0.8764 | 0.9241 |
+ | 0.1239 | 11.99 | 3800 | 0.3081 | 0.8739 | 0.8804 | 0.8771 | 0.9254 |
+ | 0.1239 | 12.3 | 3900 | 0.3095 | 0.8735 | 0.8798 | 0.8766 | 0.9254 |
+ | 0.1106 | 12.62 | 4000 | 0.3094 | 0.8740 | 0.8796 | 0.8768 | 0.9254 |
 
 
  ### Framework versions
 
  - Transformers 4.32.0.dev0
- - Pytorch 2.0.1+cu118
+ - Pytorch 2.0.0
  - Datasets 2.14.3
  - Tokenizers 0.13.3
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:9810ed3f441e2b746bbce28ef9ef4faea53f66211414fb8b813e71a880b5c149
+ oid sha256:ce09d6cdc6989a8d8fbde4c6052430b868ec2a7f21750e97cc9760e48cfec07d
  size 503824749
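A note on the pytorch_model.bin change above: what Git stores in the repo is a Git LFS pointer file (`version` / `oid` / `size` lines), not the weights themselves, so this commit only swaps the `oid` content hash while the byte size stays identical. A minimal sketch of reading such a pointer, using the values from this diff; the `parse_lfs_pointer` helper is hypothetical, written here for illustration:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split the 'key value' lines of a Git LFS pointer file into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    # size is a decimal byte count; expose it as an int for convenience
    fields["size"] = int(fields["size"])
    return fields

# The new pointer contents from the diff above
new_pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:ce09d6cdc6989a8d8fbde4c6052430b868ec2a7f21750e97cc9760e48cfec07d
size 503824749
"""

fields = parse_lfs_pointer(new_pointer)
print(fields["oid"].startswith("sha256:"))  # True
print(fields["size"])                       # 503824749
```

The `oid` is the SHA-256 of the actual weight file, which is what an LFS-aware client downloads and verifies; comparing it against the old pointer's hash is how the diff shows the weights changed.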