---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-08-10
  results: []
---


# EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-08-10

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base), most likely on the `jordyvl/rvl_cdip_100_examples_per_class` dataset suggested by the model name (the Trainer logged the dataset as unknown).
It achieves the following results on the evaluation set; a hedged usage sketch follows the metric list:
- Loss: 10.1413
- Accuracy: 0.7325
- Exit 0 Accuracy: 0.1725
- Exit 1 Accuracy: 0.2175
- Exit 2 Accuracy: 0.6075
- Exit 3 Accuracy: 0.715
- Exit 4 Accuracy: 0.735
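
A minimal inference sketch. The repository ID below is a placeholder, and because the exit heads are custom, loading with the stock `transformers` classes may recover only the backbone and final classification head (the original training code may be required for the exits):

```python
from transformers import AutoProcessor, AutoModelForSequenceClassification
from PIL import Image

# Placeholder: replace with the actual Hub ID of this checkpoint.
repo = "EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-08-10"

# apply_ocr=True makes the processor run Tesseract on the raw image
# (requires pytesseract); pass words/boxes yourself to skip OCR.
processor = AutoProcessor.from_pretrained(repo, apply_ocr=True)
model = AutoModelForSequenceClassification.from_pretrained(repo)

image = Image.open("document_page.png").convert("RGB")
inputs = processor(image, return_tensors="pt")
predicted = model(**inputs).logits.argmax(-1).item()
print(model.config.id2label.get(predicted, predicted))
```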

## Model description

The card does not document the architecture, but the "EE" prefix and the per-exit metrics indicate an early-exit variant of LayoutLMv3 for document image classification: intermediate classification heads (Exit 0 through Exit 4) are attached at increasing depths, so inference can stop at an earlier head once its prediction is confident enough, trading accuracy for speed. The exit accuracies above evaluate each head on its own, with deeper exits approaching the accuracy of the full model.
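
The exit policy is not specified in this card. A common choice is a softmax-confidence threshold; the sketch below is illustrative only (the function name, the threshold value, and the shape of `exit_logits` are assumptions, not this checkpoint's actual code):

```python
import torch

def early_exit_classify(exit_logits, threshold=0.9):
    """Return (label, exit_index) from the first exit head whose softmax
    confidence clears the threshold, falling back to the deepest head.

    exit_logits: list of per-exit logit tensors, each of shape (num_classes,),
    ordered from the shallowest exit (Exit 0) to the deepest (Exit 4).
    """
    for depth, logits in enumerate(exit_logits):
        probs = torch.softmax(logits, dim=-1)
        confidence, label = probs.max(dim=-1)
        if confidence.item() >= threshold:
            return label.item(), depth  # stop here; deeper layers are skipped
    # No head was confident enough: use the final head's prediction.
    return exit_logits[-1].argmax(dim=-1).item(), len(exit_logits) - 1
```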

## Intended uses & limitations

The card does not state intended uses. Given the base model and metrics, the natural use is document image classification in an RVL-CDIP-style setting; behavior outside that domain, and the accuracy/latency trade-off of the early exits, are untested here.

## Training and evaluation data

Not documented in the card. The model name points to `jordyvl/rvl_cdip_100_examples_per_class`, which would be a small RVL-CDIP subset with 100 examples per class; the evaluation metrics above are all multiples of 1/400, consistent with a 400-example evaluation split, but neither detail is confirmed here.
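
If that dataset ID is correct (an assumption inferred from the model name, not confirmed by the card), it can be loaded as follows:

```python
from datasets import load_dataset

# Dataset ID inferred from the model name; verify it on the Hub before use.
dataset = load_dataset("jordyvl/rvl_cdip_100_examples_per_class")
print(dataset)  # inspect splits, features, and label names
```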

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them follows the list):
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 24
- total_train_batch_size: 48
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
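
A sketch of these settings as `transformers` `TrainingArguments`; `output_dir` is hypothetical, and the Adam betas/epsilon in the list above are simply the optimizer defaults:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="EElayoutlmv3_rvl_cdip_100",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=24,  # 2 * 24 = total train batch size of 48
    lr_scheduler_type="linear",
    num_train_epochs=60,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```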

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
| No log        | 0.96  | 16   | 16.4817         | 0.17     | 0.0825          | 0.045           | 0.105           | 0.0625          | 0.0625          |
| No log        | 1.98  | 33   | 15.9950         | 0.2675   | 0.1             | 0.1325          | 0.195           | 0.1775          | 0.2425          |
| No log        | 3.0   | 50   | 14.9811         | 0.475    | 0.1025          | 0.1475          | 0.24            | 0.29            | 0.4425          |
| No log        | 3.96  | 66   | 14.0127         | 0.5675   | 0.105           | 0.1425          | 0.27            | 0.3975          | 0.505           |
| No log        | 4.98  | 83   | 13.3047         | 0.6075   | 0.125           | 0.1425          | 0.3175          | 0.43            | 0.595           |
| No log        | 6.0   | 100  | 12.7573         | 0.6125   | 0.125           | 0.1475          | 0.325           | 0.495           | 0.615           |
| No log        | 6.96  | 116  | 12.3656         | 0.645    | 0.1175          | 0.155           | 0.33            | 0.5175          | 0.6375          |
| No log        | 7.98  | 133  | 11.9582         | 0.6625   | 0.115           | 0.16            | 0.3525          | 0.5725          | 0.67            |
| No log        | 9.0   | 150  | 11.6533         | 0.6825   | 0.1225          | 0.16            | 0.375           | 0.6             | 0.7075          |
| No log        | 9.96  | 166  | 11.5143         | 0.685    | 0.1525          | 0.1625          | 0.38            | 0.6             | 0.675           |
| No log        | 10.98 | 183  | 11.3152         | 0.6625   | 0.115           | 0.1625          | 0.41            | 0.6225          | 0.6725          |
| No log        | 12.0  | 200  | 11.0708         | 0.695    | 0.11            | 0.1625          | 0.425           | 0.6225          | 0.7075          |
| No log        | 12.96 | 216  | 11.0412         | 0.6975   | 0.1125          | 0.1575          | 0.4             | 0.645           | 0.685           |
| No log        | 13.98 | 233  | 10.8782         | 0.7125   | 0.1425          | 0.165           | 0.4275          | 0.6325          | 0.7075          |
| No log        | 15.0  | 250  | 10.7282         | 0.7075   | 0.115           | 0.165           | 0.4225          | 0.65            | 0.7175          |
| No log        | 15.96 | 266  | 10.7039         | 0.695    | 0.15            | 0.16            | 0.4375          | 0.6375          | 0.69            |
| No log        | 16.98 | 283  | 10.5455         | 0.7125   | 0.13            | 0.165           | 0.4375          | 0.6675          | 0.715           |
| No log        | 18.0  | 300  | 10.5214         | 0.7075   | 0.1275          | 0.17            | 0.45            | 0.6825          | 0.7075          |
| No log        | 18.96 | 316  | 10.4995         | 0.715    | 0.155           | 0.1725          | 0.4525          | 0.68            | 0.7125          |
| No log        | 19.98 | 333  | 10.3224         | 0.725    | 0.1475          | 0.1825          | 0.46            | 0.68            | 0.7225          |
| No log        | 21.0  | 350  | 10.4247         | 0.71     | 0.1425          | 0.1825          | 0.4625          | 0.68            | 0.71            |
| No log        | 21.96 | 366  | 10.3881         | 0.705    | 0.1375          | 0.1825          | 0.46            | 0.66            | 0.7125          |
| No log        | 22.98 | 383  | 10.3065         | 0.715    | 0.1375          | 0.1875          | 0.465           | 0.6925          | 0.7225          |
| No log        | 24.0  | 400  | 10.1955         | 0.72     | 0.145           | 0.1875          | 0.4725          | 0.695           | 0.7225          |
| No log        | 24.96 | 416  | 10.1607         | 0.72     | 0.165           | 0.19            | 0.4925          | 0.7075          | 0.7175          |
| No log        | 25.98 | 433  | 10.2416         | 0.72     | 0.14            | 0.195           | 0.48            | 0.7025          | 0.7275          |
| No log        | 27.0  | 450  | 10.1321         | 0.715    | 0.145           | 0.1875          | 0.4925          | 0.7125          | 0.72            |
| No log        | 27.96 | 466  | 10.1982         | 0.7275   | 0.145           | 0.1875          | 0.4875          | 0.7075          | 0.73            |
| No log        | 28.98 | 483  | 10.2237         | 0.72     | 0.1575          | 0.19            | 0.515           | 0.7             | 0.7225          |
| 10.0174       | 30.0  | 500  | 10.1426         | 0.7175   | 0.1675          | 0.1975          | 0.5275          | 0.7125          | 0.7225          |
| 10.0174       | 30.96 | 516  | 10.1056         | 0.7325   | 0.14            | 0.1975          | 0.515           | 0.715           | 0.7325          |
| 10.0174       | 31.98 | 533  | 10.1616         | 0.7225   | 0.1525          | 0.195           | 0.5275          | 0.7175          | 0.72            |
| 10.0174       | 33.0  | 550  | 10.1053         | 0.7325   | 0.1425          | 0.195           | 0.525           | 0.7125          | 0.7275          |
| 10.0174       | 33.96 | 566  | 10.1581         | 0.7175   | 0.165           | 0.2             | 0.5375          | 0.71            | 0.71            |
| 10.0174       | 34.98 | 583  | 10.0835         | 0.7225   | 0.15            | 0.2025          | 0.5375          | 0.715           | 0.7225          |
| 10.0174       | 36.0  | 600  | 10.1349         | 0.725    | 0.1425          | 0.2             | 0.5375          | 0.7025          | 0.725           |
| 10.0174       | 36.96 | 616  | 10.0424         | 0.7325   | 0.1625          | 0.1975          | 0.545           | 0.7225          | 0.735           |
| 10.0174       | 37.98 | 633  | 10.0692         | 0.73     | 0.155           | 0.195           | 0.5525          | 0.7225          | 0.74            |
| 10.0174       | 39.0  | 650  | 10.0838         | 0.7325   | 0.1625          | 0.1975          | 0.56            | 0.7225          | 0.7375          |
| 10.0174       | 39.96 | 666  | 10.1160         | 0.7275   | 0.1675          | 0.1975          | 0.5575          | 0.7225          | 0.725           |
| 10.0174       | 40.98 | 683  | 10.0971         | 0.735    | 0.1675          | 0.1975          | 0.5625          | 0.7175          | 0.73            |
| 10.0174       | 42.0  | 700  | 10.1207         | 0.73     | 0.165           | 0.2             | 0.5775          | 0.715           | 0.7275          |
| 10.0174       | 42.96 | 716  | 10.1448         | 0.7325   | 0.175           | 0.205           | 0.5775          | 0.7175          | 0.73            |
| 10.0174       | 43.98 | 733  | 10.0945         | 0.735    | 0.1675          | 0.21            | 0.5775          | 0.7175          | 0.735           |
| 10.0174       | 45.0  | 750  | 10.1789         | 0.73     | 0.17            | 0.2175          | 0.5775          | 0.7125          | 0.7275          |
| 10.0174       | 45.96 | 766  | 10.1274         | 0.735    | 0.175           | 0.215           | 0.5875          | 0.7075          | 0.735           |
| 10.0174       | 46.98 | 783  | 10.1656         | 0.735    | 0.155           | 0.2125          | 0.5875          | 0.7125          | 0.7375          |
| 10.0174       | 48.0  | 800  | 10.1557         | 0.7275   | 0.16            | 0.215           | 0.6025          | 0.715           | 0.7325          |
| 10.0174       | 48.96 | 816  | 10.1436         | 0.74     | 0.165           | 0.215           | 0.6025          | 0.7175          | 0.735           |
| 10.0174       | 49.98 | 833  | 10.1474         | 0.7325   | 0.1625          | 0.215           | 0.6             | 0.715           | 0.735           |
| 10.0174       | 51.0  | 850  | 10.1647         | 0.7275   | 0.1725          | 0.2175          | 0.605           | 0.7175          | 0.7325          |
| 10.0174       | 51.96 | 866  | 10.1375         | 0.73     | 0.1775          | 0.215           | 0.6025          | 0.7125          | 0.7375          |
| 10.0174       | 52.98 | 883  | 10.1458         | 0.7325   | 0.1675          | 0.2175          | 0.605           | 0.7125          | 0.7375          |
| 10.0174       | 54.0  | 900  | 10.1527         | 0.7275   | 0.175           | 0.22            | 0.6025          | 0.715           | 0.73            |
| 10.0174       | 54.96 | 916  | 10.1349         | 0.7325   | 0.175           | 0.2175          | 0.6025          | 0.72            | 0.735           |
| 10.0174       | 55.98 | 933  | 10.1376         | 0.7325   | 0.175           | 0.22            | 0.6025          | 0.72            | 0.7325          |
| 10.0174       | 57.0  | 950  | 10.1413         | 0.7325   | 0.1725          | 0.2175          | 0.6075          | 0.715           | 0.7325          |
| 10.0174       | 57.6  | 960  | 10.1413         | 0.7325   | 0.1725          | 0.2175          | 0.6075          | 0.715           | 0.735           |


### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3