|
---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
- generated_from_trainer
datasets:
- wildreceipt
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: layoutlmv3-finetuned-wildreceipt
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: wildreceipt
      type: wildreceipt
      config: WildReceipt
      split: test
      args: WildReceipt
    metrics:
    - name: Precision
      type: precision
      value: 0.8791872597473915
    - name: Recall
      type: recall
      value: 0.8814865794907089
    - name: F1
      type: f1
      value: 0.8803354182418035
    - name: Accuracy
      type: accuracy
      value: 0.9270261366132221
---
|
|
|
|
|
|
# layoutlmv3-finetuned-wildreceipt |
|
|
|
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the wildreceipt dataset. |
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.3081 |
|
- Precision: 0.8792 |
|
- Recall: 0.8815 |
|
- F1: 0.8803 |
|
- Accuracy: 0.9270 |
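
The snippet below is a minimal inference sketch. It assumes the checkpoint is published under a Hub repo id like `layoutlmv3-finetuned-wildreceipt` (replace it with the actual id) and that Tesseract is installed, since the LayoutLMv3 processor runs OCR on the input image by default.

```python
from transformers import AutoProcessor, AutoModelForTokenClassification
from PIL import Image
import torch

# Hypothetical repo id; replace with the actual one for this checkpoint.
repo_id = "layoutlmv3-finetuned-wildreceipt"
processor = AutoProcessor.from_pretrained(repo_id)            # applies Tesseract OCR by default
model = AutoModelForTokenClassification.from_pretrained(repo_id)

image = Image.open("receipt.png").convert("RGB")
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**encoding).logits                          # (1, seq_len, num_labels)

predicted_ids = logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
labels = [model.config.id2label[i] for i in predicted_ids]
print(list(zip(tokens, labels)))
```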
|
|
|
## Model description |
|
|
|
This model is [LayoutLMv3](https://huggingface.co/microsoft/layoutlmv3-base), a multimodal Transformer for Document AI that jointly encodes text tokens, their 2D layout (bounding boxes), and image patches, fine-tuned with a token-classification head on WildReceipt, a dataset of annotated receipt images for key information extraction.
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for key information extraction from receipt images: each OCR token is classified into one of the WildReceipt entity categories. Performance may degrade on receipts whose layout, language, or image quality differs markedly from the training data, and the CC BY-NC-SA 4.0 license restricts the model to non-commercial use.
|
|
|
## Training and evaluation data |
|
|
|
The model was fine-tuned on the WildReceipt dataset (config `WildReceipt`); the results in the model index above are reported on its test split.
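
As a rough sketch of how the data might be loaded, mirroring the `wildreceipt` / `WildReceipt` identifiers in the model index (the exact loading script or Hub repo used for training is not documented here and may differ):

```python
from datasets import load_dataset

# "wildreceipt" / "WildReceipt" mirror the dataset name and config in the model index;
# substitute the actual loading script or Hub dataset id if it differs.
dataset = load_dataset("wildreceipt", "WildReceipt")
print(dataset)                       # expected splits include at least train and test
print(dataset["train"].features)     # words, bounding boxes, and entity tags per receipt
```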
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
|
- learning_rate: 1e-05 |
|
- train_batch_size: 4 |
|
- eval_batch_size: 4 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- training_steps: 4000 |
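
A minimal sketch of the corresponding Transformers `TrainingArguments`; anything not listed above (such as the evaluation cadence, inferred from the 100-step rows in the table below) is an assumption:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv3-finetuned-wildreceipt",
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=4000,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the Trainer defaults, matching the optimizer above.
    evaluation_strategy="steps",   # assumption: matches the 100-step evaluation rows below
    eval_steps=100,
)
```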
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |
|
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| |
|
| No log | 0.32 | 100 | 1.3430 | 0.5041 | 0.1959 | 0.2821 | 0.6414 | |
|
| No log | 0.63 | 200 | 0.8931 | 0.6739 | 0.5367 | 0.5975 | 0.7786 | |
|
| No log | 0.95 | 300 | 0.6793 | 0.7332 | 0.6410 | 0.6840 | 0.8273 | |
|
| No log | 1.26 | 400 | 0.5804 | 0.7659 | 0.7090 | 0.7364 | 0.8507 | |
|
| 1.0357 | 1.58 | 500 | 0.4876 | 0.7919 | 0.7551 | 0.7731 | 0.8723 | |
|
| 1.0357 | 1.89 | 600 | 0.4417 | 0.8009 | 0.7997 | 0.8003 | 0.8857 | |
|
| 1.0357 | 2.21 | 700 | 0.3937 | 0.8256 | 0.8200 | 0.8228 | 0.8973 | |
|
| 1.0357 | 2.52 | 800 | 0.3904 | 0.8143 | 0.8321 | 0.8231 | 0.8958 | |
|
| 1.0357 | 2.84 | 900 | 0.3638 | 0.8462 | 0.8211 | 0.8334 | 0.9010 | |
|
| 0.3989 | 3.15 | 1000 | 0.3586 | 0.8386 | 0.8447 | 0.8417 | 0.9055 | |
|
| 0.3989 | 3.47 | 1100 | 0.3227 | 0.8382 | 0.8564 | 0.8472 | 0.9104 | |
|
| 0.3989 | 3.79 | 1200 | 0.3120 | 0.8538 | 0.8522 | 0.8530 | 0.9119 | |
|
| 0.3989 | 4.1 | 1300 | 0.3283 | 0.8498 | 0.8559 | 0.8528 | 0.9117 | |
|
| 0.3989 | 4.42 | 1400 | 0.3084 | 0.8595 | 0.8606 | 0.8600 | 0.9165 | |
|
| 0.2727 | 4.73 | 1500 | 0.3026 | 0.8552 | 0.8666 | 0.8609 | 0.9159 | |
|
| 0.2727 | 5.05 | 1600 | 0.3052 | 0.8633 | 0.8537 | 0.8585 | 0.9165 | |
|
| 0.2727 | 5.36 | 1700 | 0.3052 | 0.8505 | 0.8747 | 0.8625 | 0.9165 | |
|
| 0.2727 | 5.68 | 1800 | 0.3040 | 0.8579 | 0.8690 | 0.8634 | 0.9164 | |
|
| 0.2727 | 5.99 | 1900 | 0.2926 | 0.8717 | 0.8696 | 0.8707 | 0.9205 | |
|
| 0.2059 | 6.31 | 2000 | 0.3004 | 0.8646 | 0.8753 | 0.8699 | 0.9207 | |
|
| 0.2059 | 6.62 | 2100 | 0.2973 | 0.8711 | 0.8742 | 0.8726 | 0.9215 | |
|
| 0.2059 | 6.94 | 2200 | 0.3010 | 0.8650 | 0.8761 | 0.8705 | 0.9214 | |
|
| 0.2059 | 7.26 | 2300 | 0.3028 | 0.8654 | 0.8760 | 0.8706 | 0.9214 | |
|
| 0.2059 | 7.57 | 2400 | 0.2956 | 0.8769 | 0.8769 | 0.8769 | 0.9260 | |
|
| 0.1617 | 7.89 | 2500 | 0.2871 | 0.8746 | 0.8778 | 0.8762 | 0.9266 | |
|
| 0.1617 | 8.2 | 2600 | 0.3092 | 0.8632 | 0.8797 | 0.8714 | 0.9226 | |
|
| 0.1617 | 8.52 | 2700 | 0.3042 | 0.8834 | 0.8738 | 0.8786 | 0.9265 | |
|
| 0.1617 | 8.83 | 2800 | 0.3092 | 0.8672 | 0.8793 | 0.8732 | 0.9224 | |
|
| 0.1617 | 9.15 | 2900 | 0.3014 | 0.8738 | 0.8841 | 0.8789 | 0.9256 | |
|
| 0.1359 | 9.46 | 3000 | 0.3038 | 0.8763 | 0.8760 | 0.8762 | 0.9249 | |
|
| 0.1359 | 9.78 | 3100 | 0.3087 | 0.8730 | 0.8797 | 0.8763 | 0.9241 | |
|
| 0.1359 | 10.09 | 3200 | 0.3021 | 0.8740 | 0.8812 | 0.8776 | 0.9251 | |
|
| 0.1359 | 10.41 | 3300 | 0.2975 | 0.8790 | 0.8836 | 0.8812 | 0.9268 | |
|
| 0.1359 | 10.73 | 3400 | 0.3121 | 0.8734 | 0.8809 | 0.8771 | 0.9254 | |
|
| 0.1192 | 11.04 | 3500 | 0.3111 | 0.8812 | 0.8794 | 0.8803 | 0.9260 | |
|
| 0.1192 | 11.36 | 3600 | 0.3101 | 0.8785 | 0.8790 | 0.8788 | 0.9261 | |
|
| 0.1192 | 11.67 | 3700 | 0.3082 | 0.8790 | 0.8829 | 0.8809 | 0.9275 | |
|
| 0.1192 | 11.99 | 3800 | 0.3081 | 0.8822 | 0.8830 | 0.8826 | 0.9276 | |
|
| 0.1192 | 12.3 | 3900 | 0.3100 | 0.8800 | 0.8809 | 0.8805 | 0.9269 | |
|
| 0.1065 | 12.62 | 4000 | 0.3081 | 0.8792 | 0.8815 | 0.8803 | 0.9270 | |
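
The precision, recall, and F1 columns are entity-level metrics as typically computed with `seqeval` in the Transformers token-classification examples; the sketch below shows a `compute_metrics` function under that assumption (the `label_list` placeholder and the dataset field name are hypothetical).

```python
import numpy as np
from datasets import load_metric

metric = load_metric("seqeval")  # Datasets 2.14 still ships load_metric; requires the seqeval package

# Ordered class names; assumed to come from the dataset's tag feature,
# e.g. dataset["train"].features["ner_tags"].feature.names (hypothetical field name).
label_list = ["..."]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Ignore special tokens, which are labelled -100 and must not count towards the metrics.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = metric.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```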
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.32.0.dev0 |
|
- Pytorch 2.0.1+cu118 |
|
- Datasets 2.14.3 |
|
- Tokenizers 0.13.3 |
|
|