---
library_name: transformers
license: apache-2.0
base_model: google/vit-large-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: vit-transformer3
    results: []
---

# vit-transformer3

This model is a fine-tuned version of [google/vit-large-patch16-224-in21k](https://huggingface.co/google/vit-large-patch16-224-in21k) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.8890
- Accuracy: 0.6833
- F1: 0.6840
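
The card does not yet document usage, so here is a minimal inference sketch. It assumes the checkpoint is published as `masafresh/vit-transformer3` and keeps the standard ViT image-classification head; the repo path and label mapping are assumptions, not confirmed by this card.

```python
# Minimal inference sketch; the repo path "masafresh/vit-transformer3" is an assumption.
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "masafresh/vit-transformer3"  # hypothetical repo path
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg").convert("RGB")  # any input image
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```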

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training; a `TrainingArguments` sketch follows the list:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
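
For reference, these settings map onto a `transformers.TrainingArguments` configuration roughly as follows. This is a sketch reconstructed from the list above, not the original training script; `output_dir` is an assumption, and dataset/model wiring is omitted.

```python
from transformers import TrainingArguments

# Sketch of the TrainingArguments implied by the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="vit-transformer3",   # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,   # effective train batch size: 8 * 4 = 32
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```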

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|
| 1.8752        | 0.9552  | 16   | 0.9886          | 0.6667   | 0.6484 |
| 0.7728        | 1.9701  | 33   | 0.6862          | 0.5667   | 0.4099 |
| 0.7065        | 2.9851  | 50   | 0.6627          | 0.6333   | 0.6132 |
| 0.6845        | 4.0     | 67   | 0.7065          | 0.55     | 0.4922 |
| 0.6513        | 4.9552  | 83   | 0.7202          | 0.4667   | 0.3905 |
| 0.6567        | 5.9701  | 100  | 0.7677          | 0.5333   | 0.4667 |
| 0.6539        | 6.9851  | 117  | 0.6269          | 0.6167   | 0.6047 |
| 0.7025        | 8.0     | 134  | 0.6838          | 0.65     | 0.6107 |
| 0.6698        | 8.9552  | 150  | 0.6313          | 0.6667   | 0.6337 |
| 0.6986        | 9.9701  | 167  | 0.6200          | 0.6667   | 0.6484 |
| 0.6811        | 10.9851 | 184  | 0.5869          | 0.6833   | 0.6840 |
| 0.6132        | 12.0    | 201  | 0.5881          | 0.6833   | 0.6687 |
| 0.7235        | 12.9552 | 217  | 0.5732          | 0.65     | 0.6274 |
| 0.5768        | 13.9701 | 234  | 0.5802          | 0.6833   | 0.6825 |
| 0.5307        | 14.9851 | 251  | 0.6610          | 0.7      | 0.7010 |
| 0.552         | 16.0    | 268  | 0.6229          | 0.7333   | 0.7296 |
| 0.5548        | 16.9552 | 284  | 0.6186          | 0.7167   | 0.7036 |
| 0.4863        | 17.9701 | 301  | 0.8409          | 0.5667   | 0.5366 |
| 0.5048        | 18.9851 | 318  | 1.0019          | 0.4833   | 0.4015 |
| 0.4919        | 20.0    | 335  | 0.6475          | 0.7333   | 0.7333 |
| 0.4788        | 20.9552 | 351  | 0.6931          | 0.6333   | 0.6282 |
| 0.5076        | 21.9701 | 368  | 0.6798          | 0.7      | 0.6983 |
| 0.5047        | 22.9851 | 385  | 0.6784          | 0.7      | 0.7    |
| 0.3477        | 24.0    | 402  | 0.8261          | 0.7      | 0.6983 |
| 0.4508        | 24.9552 | 418  | 0.6846          | 0.6833   | 0.6825 |
| 0.4948        | 25.9701 | 435  | 0.7509          | 0.6833   | 0.6804 |
| 0.3661        | 26.9851 | 452  | 0.7321          | 0.6667   | 0.6678 |
| 0.3072        | 28.0    | 469  | 0.8338          | 0.6833   | 0.6839 |
| 0.3573        | 28.9552 | 485  | 0.9031          | 0.65     | 0.6434 |
| 0.3828        | 29.9701 | 502  | 0.8582          | 0.6667   | 0.6667 |
| 0.2931        | 30.9851 | 519  | 0.7648          | 0.65     | 0.6515 |
| 0.3193        | 32.0    | 536  | 0.9218          | 0.6333   | 0.6333 |
| 0.2783        | 32.9552 | 552  | 0.8452          | 0.7      | 0.7013 |
| 0.2816        | 33.9701 | 569  | 0.8310          | 0.6833   | 0.6735 |
| 0.3018        | 34.9851 | 586  | 0.8437          | 0.7      | 0.6960 |
| 0.2256        | 36.0    | 603  | 1.0684          | 0.65     | 0.6507 |
| 0.2609        | 36.9552 | 619  | 0.9117          | 0.65     | 0.6491 |
| 0.2198        | 37.9701 | 636  | 1.1688          | 0.5833   | 0.5652 |
| 0.306         | 38.9851 | 653  | 0.9001          | 0.6167   | 0.6130 |
| 0.2243        | 40.0    | 670  | 1.2253          | 0.6333   | 0.6313 |
| 0.3482        | 40.9552 | 686  | 1.0028          | 0.65     | 0.6491 |
| 0.196         | 41.9701 | 703  | 0.8747          | 0.6667   | 0.6682 |
| 0.2261        | 42.9851 | 720  | 1.3642          | 0.65     | 0.6468 |
| 0.2802        | 44.0    | 737  | 1.3271          | 0.5833   | 0.5704 |
| 0.1965        | 44.9552 | 753  | 1.3784          | 0.6      | 0.6018 |
| 0.2198        | 45.9701 | 770  | 1.3224          | 0.6667   | 0.6682 |
| 0.1852        | 46.9851 | 787  | 1.5364          | 0.6333   | 0.6243 |
| 0.197         | 48.0    | 804  | 1.5706          | 0.6167   | 0.6174 |
| 0.1932        | 48.9552 | 820  | 1.3610          | 0.6667   | 0.6648 |
| 0.1495        | 49.9701 | 837  | 1.4687          | 0.6167   | 0.6174 |
| 0.1404        | 50.9851 | 854  | 1.3438          | 0.7      | 0.6983 |
| 0.1275        | 52.0    | 871  | 1.4674          | 0.6      | 0.5978 |
| 0.1545        | 52.9552 | 887  | 1.3120          | 0.6167   | 0.6183 |
| 0.147         | 53.9701 | 904  | 1.5816          | 0.6167   | 0.6183 |
| 0.1541        | 54.9851 | 921  | 1.5117          | 0.6667   | 0.6678 |
| 0.1283        | 56.0    | 938  | 1.5965          | 0.6667   | 0.6678 |
| 0.1715        | 56.9552 | 954  | 1.6750          | 0.65     | 0.6491 |
| 0.1513        | 57.9701 | 971  | 1.9170          | 0.5333   | 0.5164 |
| 0.2349        | 58.9851 | 988  | 1.5358          | 0.6333   | 0.6346 |
| 0.1248        | 60.0    | 1005 | 1.6686          | 0.6833   | 0.6840 |
| 0.1076        | 60.9552 | 1021 | 1.7018          | 0.6333   | 0.6346 |
| 0.1431        | 61.9701 | 1038 | 1.9088          | 0.6333   | 0.6333 |
| 0.0838        | 62.9851 | 1055 | 1.8821          | 0.6333   | 0.6346 |
| 0.0989        | 64.0    | 1072 | 1.6053          | 0.65     | 0.6491 |
| 0.1323        | 64.9552 | 1088 | 1.7114          | 0.6333   | 0.6312 |
| 0.0908        | 65.9701 | 1105 | 1.7326          | 0.65     | 0.6491 |
| 0.2056        | 66.9851 | 1122 | 1.7166          | 0.6167   | 0.6130 |
| 0.0752        | 68.0    | 1139 | 1.8009          | 0.65     | 0.6467 |
| 0.1116        | 68.9552 | 1155 | 1.6964          | 0.6667   | 0.6678 |
| 0.0821        | 69.9701 | 1172 | 1.7557          | 0.6167   | 0.6094 |
| 0.1284        | 70.9851 | 1189 | 1.8039          | 0.65     | 0.6491 |
| 0.1905        | 72.0    | 1206 | 1.7951          | 0.6167   | 0.6094 |
| 0.1031        | 72.9552 | 1222 | 1.6888          | 0.6667   | 0.6648 |
| 0.0706        | 73.9701 | 1239 | 1.8992          | 0.65     | 0.6467 |
| 0.0944        | 74.9851 | 1256 | 1.6965          | 0.6833   | 0.6840 |
| 0.1042        | 76.0    | 1273 | 1.6756          | 0.6833   | 0.6825 |
| 0.1599        | 76.9552 | 1289 | 1.4360          | 0.7333   | 0.7342 |
| 0.0896        | 77.9701 | 1306 | 1.5759          | 0.65     | 0.6467 |
| 0.0674        | 78.9851 | 1323 | 1.7071          | 0.7      | 0.7010 |
| 0.1133        | 80.0    | 1340 | 1.6499          | 0.6833   | 0.6840 |
| 0.0506        | 80.9552 | 1356 | 1.6546          | 0.6833   | 0.6825 |
| 0.1015        | 81.9701 | 1373 | 1.6468          | 0.7      | 0.7013 |
| 0.0923        | 82.9851 | 1390 | 1.8567          | 0.6667   | 0.6622 |
| 0.0752        | 84.0    | 1407 | 1.8140          | 0.7      | 0.7010 |
| 0.0768        | 84.9552 | 1423 | 1.8225          | 0.6667   | 0.6678 |
| 0.0683        | 85.9701 | 1440 | 1.8094          | 0.6833   | 0.6840 |
| 0.0454        | 86.9851 | 1457 | 1.8892          | 0.65     | 0.6491 |
| 0.054         | 88.0    | 1474 | 1.8180          | 0.7      | 0.7010 |
| 0.0449        | 88.9552 | 1490 | 1.7891          | 0.7333   | 0.7345 |
| 0.0645        | 89.9701 | 1507 | 1.8262          | 0.7      | 0.7010 |
| 0.0632        | 90.9851 | 1524 | 1.8187          | 0.7167   | 0.7179 |
| 0.0795        | 92.0    | 1541 | 1.7941          | 0.7333   | 0.7345 |
| 0.0923        | 92.9552 | 1557 | 1.8340          | 0.6833   | 0.6840 |
| 0.0486        | 93.9701 | 1574 | 1.8843          | 0.6667   | 0.6667 |
| 0.0821        | 94.9851 | 1591 | 1.8907          | 0.6667   | 0.6667 |
| 0.0384        | 95.5224 | 1600 | 1.8890          | 0.6833   | 0.6840 |

### Framework versions

- Transformers 4.45.2
- Pytorch 2.4.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.1
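
To reproduce this environment, the listed versions can be pinned directly. A shell sketch, assuming pip on a CUDA 12.1 machine (the PyTorch index URL selects the cu121 build noted above):

```bash
pip install "transformers==4.45.2" "datasets==3.1.0" "tokenizers==0.20.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu121
```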