
convnext_manuscript_iiif

This model is a fine-tuned version of facebook/convnext-base-224-22k on the davanstrien/iiif_manuscripts_label_ge_50 dataset. It achieves the following results on the evaluation set:

  • Loss: 5.5856
  • F1: 0.0037
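For illustration, here is a minimal inference sketch (not part of the original card). It assumes the checkpoint loads with the standard image-classification auto classes; the image path is a placeholder.

```python
# Minimal inference sketch (an illustration, not part of the original card).
# Assumes the checkpoint loads with the standard image-classification auto
# classes; on older Transformers versions use AutoFeatureExtractor instead.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "davanstrien/convnext_manuscript_iiif"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("manuscript_page.jpg")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(-1).item()])
```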

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
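The fine-tuning dataset is named at the top of this card; a hedged sketch for loading it follows. Split names and feature columns are not documented here, so inspect the loaded dataset before relying on any particular structure.

```python
# Hedged sketch: load the fine-tuning dataset named at the top of this card.
# Split names and feature columns are not documented here; inspect them first.
from datasets import load_dataset

dataset = load_dataset("davanstrien/iiif_manuscripts_label_ge_50")
print(dataset)  # check available splits and features
```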

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0002
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 1337
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30.0
  • mixed_precision_training: Native AMP
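For reference, a hedged sketch of a Transformers TrainingArguments configuration matching the values above. This is an illustration, not the original training script; the output directory is a placeholder, and the optimizer settings listed above are the Trainer defaults.

```python
# Hedged sketch of TrainingArguments matching the hyperparameters listed
# above; the output directory is a placeholder, and the Adam betas/epsilon
# listed above correspond to the Trainer's default optimizer settings.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="convnext_manuscript_iiif",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=1337,
    lr_scheduler_type="linear",
    num_train_epochs=30.0,
    fp16=True,  # Native AMP mixed-precision training
)
```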

Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 6.5753        | 1.0   | 2038  | 6.4121          | 0.0016 |
| 5.9865        | 2.0   | 4076  | 5.9466          | 0.0021 |
| 5.6521        | 3.0   | 6114  | 5.7645          | 0.0029 |
| 5.3123        | 4.0   | 8152  | 5.6890          | 0.0033 |
| 5.0337        | 5.0   | 10190 | 5.6692          | 0.0034 |
| 4.743         | 6.0   | 12228 | 5.5856          | 0.0037 |
| 4.4387        | 7.0   | 14266 | 5.5969          | 0.0042 |
| 4.1422        | 8.0   | 16304 | 5.6711          | 0.0043 |
| 3.8372        | 9.0   | 18342 | 5.6761          | 0.0044 |
| 3.5244        | 10.0  | 20380 | 5.8469          | 0.0042 |
| 3.2321        | 11.0  | 22418 | 5.8774          | 0.0045 |
| 2.9004        | 12.0  | 24456 | 6.1186          | 0.0047 |
| 2.5937        | 13.0  | 26494 | 6.2398          | 0.0046 |
| 2.2983        | 14.0  | 28532 | 6.3732          | 0.0049 |
| 2.0611        | 15.0  | 30570 | 6.5024          | 0.0045 |
| 1.8153        | 16.0  | 32608 | 6.6585          | 0.0047 |
| 1.6075        | 17.0  | 34646 | 6.8333          | 0.0043 |
| 1.4342        | 18.0  | 36684 | 6.9529          | 0.0044 |
| 1.2614        | 19.0  | 38722 | 7.1129          | 0.0046 |
| 1.1463        | 20.0  | 40760 | 7.1977          | 0.0039 |
| 1.0387        | 21.0  | 42798 | 7.2700          | 0.0044 |
| 0.9635        | 22.0  | 44836 | 7.3375          | 0.0040 |
| 0.8872        | 23.0  | 46874 | 7.4003          | 0.0039 |
| 0.8156        | 24.0  | 48912 | 7.4884          | 0.0039 |
| 0.7544        | 25.0  | 50950 | 7.4764          | 0.0039 |
| 0.6893        | 26.0  | 52988 | 7.5153          | 0.0042 |
| 0.6767        | 27.0  | 55026 | 7.5427          | 0.0043 |
| 0.6098        | 28.0  | 57064 | 7.5547          | 0.0042 |
| 0.5871        | 29.0  | 59102 | 7.5533          | 0.0041 |
| 0.5696        | 30.0  | 61140 | 7.5595          | 0.0041 |
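The F1 column above could be produced by a compute_metrics hook along the lines of the sketch below; the averaging mode ("macro" here) is an assumption, as the card does not document it.

```python
# Hedged sketch of a compute_metrics hook that could report the F1 scores
# above; the averaging mode ("macro") is an assumption, not documented here.
import numpy as np
from sklearn.metrics import f1_score

def compute_metrics(eval_pred):
    predictions = np.argmax(eval_pred.predictions, axis=-1)
    return {"f1": f1_score(eval_pred.label_ids, predictions, average="macro")}
```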

Framework versions

  • Transformers 4.18.0.dev0
  • Pytorch 1.10.2+cu102
  • Datasets 1.18.3
  • Tokenizers 0.11.6