---
license: mit
tags:
- generated_from_keras_callback
base_model: microsoft/layoutlm-base-uncased
model-index:
- name: layoutlm-invoice-tf-2
  results: []
---

# layoutlm-invoice-tf-2

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.3115
- Validation Loss: 0.3451
- Train Overall Precision: 0.5831
- Train Overall Recall: 0.5567
- Train Overall F1: 0.5696
- Train Overall Accuracy: 0.8995
- Epoch: 7

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 3e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16

### Training results

| Train Loss | Validation Loss | Train Overall Precision | Train Overall Recall | Train Overall F1 | Train Overall Accuracy | Epoch |
|:----------:|:---------------:|:-----------------------:|:--------------------:|:----------------:|:----------------------:|:-----:|
| 2.3655     | 2.0025          | 0.0056                  | 0.0101               | 0.0072           | 0.3973                 | 0     |
| 1.7668     | 1.5306          | 0.0185                  | 0.0327               | 0.0237           | 0.5455                 | 1     |
| 1.3422     | 1.1230          | 0.0595                  | 0.0831               | 0.0693           | 0.6698                 | 2     |
| 1.0006     | 0.9051          | 0.1393                  | 0.1990               | 0.1639           | 0.7432                 | 3     |
| 0.7743     | 0.6738          | 0.2127                  | 0.2620               | 0.2348           | 0.8158                 | 4     |
| 0.5996     | 0.5262          | 0.4035                  | 0.4106               | 0.4070           | 0.8628                 | 5     |
| 0.4179     | 0.3998          | 0.4856                  | 0.4685               | 0.4769           | 0.8892                 | 6     |
| 0.3115     | 0.3451          | 0.5831                  | 0.5567               | 0.5696           | 0.8995                 | 7     |

### Framework versions

- Transformers 4.41.0.dev0
- TensorFlow 2.16.1
- Datasets 2.19.1
- Tokenizers 0.19.1
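
The optimizer dictionary listed under "Training hyperparameters" matches the `AdamWeightDecay` class shipped with 🤗 Transformers. The original training script is not included in this card, but as a rough sketch (assuming the Transformers/TensorFlow versions listed above), the same optimizer and precision policy could be recreated like this:

```python
# Sketch only: reconstructs the logged optimizer config, not the full
# training pipeline (model, data collator and fit loop are omitted).
import tensorflow as tf
from transformers import AdamWeightDecay

# training_precision: mixed_float16
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# 'decay': 0.0 in the logged config corresponds to a constant learning
# rate, so no schedule is passed here.
optimizer = AdamWeightDecay(
    learning_rate=3e-5,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    weight_decay_rate=0.01,
)
```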
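
Given the precision/recall/F1 metrics above, the checkpoint is presumably a token-classification head on top of LayoutLM. A hedged loading sketch is shown below; the repository id is a placeholder and should be replaced with wherever these fine-tuned weights are actually hosted.

```python
# Hypothetical usage sketch; "your-namespace/layoutlm-invoice-tf-2" is a
# placeholder repo id, not a confirmed location of this checkpoint.
from transformers import AutoTokenizer, TFLayoutLMForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = TFLayoutLMForTokenClassification.from_pretrained(
    "your-namespace/layoutlm-invoice-tf-2"
)

# Note: LayoutLM also expects normalized bounding boxes (0-1000 range) for
# each token alongside the input ids; see the base model card for the full
# input format.
```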