---
license: apache-2.0
base_model: facebook/deit-tiny-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: hushem_1x_deit_tiny_sgd_00001_fold3
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.27906976744186046
---

# hushem_1x_deit_tiny_sgd_00001_fold3

This model is a fine-tuned version of [facebook/deit-tiny-patch16-224](https://huggingface.co/facebook/deit-tiny-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:

- Loss: 1.6900
- Accuracy: 0.2791
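The checkpoint can be queried with the standard `image-classification` pipeline. The sketch below is a minimal example; the Hub repository id `hkivancoral/hushem_1x_deit_tiny_sgd_00001_fold3` and the image path are assumptions, not taken from this card:

```python
# Minimal inference sketch; the repository id and image path are assumptions.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="hkivancoral/hushem_1x_deit_tiny_sgd_00001_fold3",  # assumed Hub id
)

image = Image.open("example.jpg").convert("RGB")  # any image from the target domain
print(classifier(image))  # list of {"label": ..., "score": ...} dicts, one per class
```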

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
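For reference, a `TrainingArguments` setup mirroring these values might look like the sketch below. The `output_dir` and the evaluation/save strategies are assumptions (validation appears to run once per epoch in the results table); the Adam betas and epsilon above match the Trainer defaults.

```python
# Sketch of a TrainingArguments configuration matching the listed hyperparameters.
# output_dir and the evaluation/save strategies are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_deit_tiny_sgd_00001_fold3",  # assumed
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: metrics are reported once per epoch
    save_strategy="epoch",        # assumption
)
```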

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 6 | 1.7081 | 0.2791 |
| 1.7325 | 2.0 | 12 | 1.7072 | 0.2791 |
| 1.7325 | 3.0 | 18 | 1.7063 | 0.2791 |
| 1.7152 | 4.0 | 24 | 1.7055 | 0.2791 |
| 1.6813 | 5.0 | 30 | 1.7046 | 0.2791 |
| 1.6813 | 6.0 | 36 | 1.7038 | 0.2791 |
| 1.6984 | 7.0 | 42 | 1.7030 | 0.2791 |
| 1.6984 | 8.0 | 48 | 1.7022 | 0.2791 |
| 1.7131 | 9.0 | 54 | 1.7014 | 0.2791 |
| 1.7337 | 10.0 | 60 | 1.7007 | 0.2791 |
| 1.7337 | 11.0 | 66 | 1.7000 | 0.2791 |
| 1.7143 | 12.0 | 72 | 1.6993 | 0.2791 |
| 1.7143 | 13.0 | 78 | 1.6987 | 0.2791 |
| 1.6884 | 14.0 | 84 | 1.6981 | 0.2791 |
| 1.7252 | 15.0 | 90 | 1.6975 | 0.2791 |
| 1.7252 | 16.0 | 96 | 1.6969 | 0.2791 |
| 1.7269 | 17.0 | 102 | 1.6963 | 0.2791 |
| 1.7269 | 18.0 | 108 | 1.6958 | 0.2791 |
| 1.6858 | 19.0 | 114 | 1.6953 | 0.2791 |
| 1.7013 | 20.0 | 120 | 1.6948 | 0.2791 |
| 1.7013 | 21.0 | 126 | 1.6943 | 0.2791 |
| 1.7051 | 22.0 | 132 | 1.6939 | 0.2791 |
| 1.7051 | 23.0 | 138 | 1.6935 | 0.2791 |
| 1.6834 | 24.0 | 144 | 1.6931 | 0.2791 |
| 1.6977 | 25.0 | 150 | 1.6927 | 0.2791 |
| 1.6977 | 26.0 | 156 | 1.6924 | 0.2791 |
| 1.7016 | 27.0 | 162 | 1.6920 | 0.2791 |
| 1.7016 | 28.0 | 168 | 1.6917 | 0.2791 |
| 1.7242 | 29.0 | 174 | 1.6915 | 0.2791 |
| 1.6808 | 30.0 | 180 | 1.6912 | 0.2791 |
| 1.6808 | 31.0 | 186 | 1.6910 | 0.2791 |
| 1.7032 | 32.0 | 192 | 1.6908 | 0.2791 |
| 1.7032 | 33.0 | 198 | 1.6906 | 0.2791 |
| 1.6261 | 34.0 | 204 | 1.6905 | 0.2791 |
| 1.7412 | 35.0 | 210 | 1.6903 | 0.2791 |
| 1.7412 | 36.0 | 216 | 1.6902 | 0.2791 |
| 1.6899 | 37.0 | 222 | 1.6901 | 0.2791 |
| 1.6899 | 38.0 | 228 | 1.6901 | 0.2791 |
| 1.6944 | 39.0 | 234 | 1.6900 | 0.2791 |
| 1.6965 | 40.0 | 240 | 1.6900 | 0.2791 |
| 1.6965 | 41.0 | 246 | 1.6900 | 0.2791 |
| 1.6787 | 42.0 | 252 | 1.6900 | 0.2791 |
| 1.6787 | 43.0 | 258 | 1.6900 | 0.2791 |
| 1.6617 | 44.0 | 264 | 1.6900 | 0.2791 |
| 1.7215 | 45.0 | 270 | 1.6900 | 0.2791 |
| 1.7215 | 46.0 | 276 | 1.6900 | 0.2791 |
| 1.6881 | 47.0 | 282 | 1.6900 | 0.2791 |
| 1.6881 | 48.0 | 288 | 1.6900 | 0.2791 |
| 1.6823 | 49.0 | 294 | 1.6900 | 0.2791 |
| 1.7275 | 50.0 | 300 | 1.6900 | 0.2791 |
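The Accuracy column is plain top-1 accuracy on the validation split. A `compute_metrics` function in the style of the standard Transformers image-classification example would look like the sketch below; the use of the `evaluate` library is an assumption, as the training script is not included in this card.

```python
# Sketch of a top-1 accuracy metric in the usual Trainer compute_metrics style.
# The `evaluate`-based implementation is an assumption.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # predicted class per example
    return accuracy.compute(predictions=predictions, references=labels)
```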

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0