---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224-pt22k-ft22k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: finetuned-FER2013
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.7081156391501219
---

# finetuned-FER2013
This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.8366
- Accuracy: 0.7081
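The base checkpoint's name encodes the input geometry this fine-tune inherits. A minimal sketch of that arithmetic, assuming the standard BEiT base patch16/224 configuration:

```python
# Input geometry for a BEiT base patch16 / 224x224 checkpoint: the image
# is split into non-overlapping 16x16 patches, giving a 14x14 grid of
# 196 patch tokens, plus one [CLS] token for classification.
image_size = 224
patch_size = 16
patches_per_side = image_size // patch_size   # 14
num_patches = patches_per_side ** 2           # 196
seq_len = num_patches + 1                     # 197 with [CLS]
print(seq_len)  # 197
```

Images fed to this model should therefore be resized to 224×224 before inference (the checkpoint's image processor handles this automatically).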
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
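The hyperparameters above can be restated as a plain Python dict to make the one derived value explicit: `total_train_batch_size` is not an independent knob but the product of the per-device batch size and the gradient-accumulation steps. This is a sketch; the actual `Trainer` configuration may carry additional defaults not listed in this card.

```python
# Hyperparameters as reported above (restated for clarity; the original
# Trainer config may include fields this card does not list).
hparams = {
    "learning_rate": 5e-6,
    "train_batch_size": 32,       # per device
    "eval_batch_size": 32,
    "seed": 42,
    "gradient_accumulation_steps": 4,
    "lr_scheduler_type": "linear",
    "lr_scheduler_warmup_ratio": 0.1,
    "num_epochs": 100,
}

# Effective (total) train batch size = per-device batch * accumulation steps.
total_train_batch_size = (
    hparams["train_batch_size"] * hparams["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 128
```

With warmup_ratio 0.1 over 20,200 total steps, the linear schedule warms up for roughly the first 2,020 steps (the first ten epochs) before decaying.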
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
1.8119 | 1.0 | 202 | 1.7993 | 0.3079 |
1.6155 | 2.0 | 404 | 1.5446 | 0.4302 |
1.4279 | 3.0 | 606 | 1.3084 | 0.5301 |
1.3222 | 4.0 | 808 | 1.1817 | 0.5590 |
1.2532 | 5.0 | 1010 | 1.1026 | 0.5789 |
1.2019 | 6.0 | 1212 | 1.0432 | 0.5998 |
1.2037 | 7.0 | 1414 | 1.0030 | 0.6137 |
1.1757 | 8.0 | 1616 | 0.9873 | 0.6235 |
1.1359 | 9.0 | 1818 | 0.9377 | 0.6423 |
1.1282 | 10.0 | 2020 | 0.9231 | 0.6486 |
1.1019 | 11.0 | 2222 | 0.9011 | 0.6562 |
1.0494 | 12.0 | 2424 | 0.8968 | 0.6545 |
0.9951 | 13.0 | 2626 | 0.8876 | 0.6607 |
1.0121 | 14.0 | 2828 | 0.8720 | 0.6695 |
1.0571 | 15.0 | 3030 | 0.8776 | 0.6691 |
1.0049 | 16.0 | 3232 | 0.8627 | 0.6733 |
0.988 | 17.0 | 3434 | 0.8639 | 0.6719 |
0.9955 | 18.0 | 3636 | 0.8397 | 0.6806 |
0.9381 | 19.0 | 3838 | 0.8430 | 0.6820 |
0.9911 | 20.0 | 4040 | 0.8370 | 0.6837 |
0.9305 | 21.0 | 4242 | 0.8373 | 0.6837 |
0.9653 | 22.0 | 4444 | 0.8283 | 0.6883 |
0.9134 | 23.0 | 4646 | 0.8289 | 0.6879 |
0.9098 | 24.0 | 4848 | 0.8365 | 0.6837 |
0.8761 | 25.0 | 5050 | 0.8190 | 0.6869 |
0.9067 | 26.0 | 5252 | 0.8303 | 0.6876 |
0.8765 | 27.0 | 5454 | 0.8188 | 0.6942 |
0.8486 | 28.0 | 5656 | 0.8142 | 0.6959 |
0.9357 | 29.0 | 5858 | 0.8114 | 0.6984 |
0.9037 | 30.0 | 6060 | 0.8150 | 0.6917 |
0.8758 | 31.0 | 6262 | 0.8165 | 0.6931 |
0.8688 | 32.0 | 6464 | 0.8061 | 0.6994 |
0.8736 | 33.0 | 6666 | 0.8056 | 0.6994 |
0.8785 | 34.0 | 6868 | 0.8045 | 0.6991 |
0.8292 | 35.0 | 7070 | 0.8095 | 0.6987 |
0.8407 | 36.0 | 7272 | 0.8096 | 0.6956 |
0.8609 | 37.0 | 7474 | 0.8137 | 0.6984 |
0.9055 | 38.0 | 7676 | 0.8054 | 0.7018 |
0.8355 | 39.0 | 7878 | 0.8080 | 0.6980 |
0.8391 | 40.0 | 8080 | 0.8087 | 0.6966 |
0.7987 | 41.0 | 8282 | 0.8041 | 0.6998 |
0.818 | 42.0 | 8484 | 0.8070 | 0.7039 |
0.7836 | 43.0 | 8686 | 0.8091 | 0.7025 |
0.8348 | 44.0 | 8888 | 0.8047 | 0.7025 |
0.8205 | 45.0 | 9090 | 0.8076 | 0.7025 |
0.8023 | 46.0 | 9292 | 0.8056 | 0.7053 |
0.8241 | 47.0 | 9494 | 0.8022 | 0.7039 |
0.763 | 48.0 | 9696 | 0.8079 | 0.6994 |
0.7422 | 49.0 | 9898 | 0.8062 | 0.7039 |
0.7762 | 50.0 | 10100 | 0.8090 | 0.6998 |
0.7786 | 51.0 | 10302 | 0.8122 | 0.6994 |
0.8027 | 52.0 | 10504 | 0.8129 | 0.7043 |
0.7966 | 53.0 | 10706 | 0.8094 | 0.7039 |
0.8103 | 54.0 | 10908 | 0.8107 | 0.7039 |
0.7827 | 55.0 | 11110 | 0.8126 | 0.7057 |
0.7949 | 56.0 | 11312 | 0.8104 | 0.7119 |
0.7511 | 57.0 | 11514 | 0.8122 | 0.7050 |
0.7727 | 58.0 | 11716 | 0.8123 | 0.7078 |
0.7723 | 59.0 | 11918 | 0.8194 | 0.7015 |
0.7796 | 60.0 | 12120 | 0.8193 | 0.7053 |
0.7768 | 61.0 | 12322 | 0.8159 | 0.7029 |
0.7604 | 62.0 | 12524 | 0.8081 | 0.7085 |
0.7784 | 63.0 | 12726 | 0.8169 | 0.7106 |
0.7235 | 64.0 | 12928 | 0.8131 | 0.7015 |
0.7384 | 65.0 | 13130 | 0.8149 | 0.7085 |
0.6638 | 66.0 | 13332 | 0.8192 | 0.7078 |
0.6998 | 67.0 | 13534 | 0.8243 | 0.7113 |
0.7249 | 68.0 | 13736 | 0.8200 | 0.7015 |
0.6809 | 69.0 | 13938 | 0.8140 | 0.7081 |
0.701 | 70.0 | 14140 | 0.8177 | 0.7095 |
0.7122 | 71.0 | 14342 | 0.8245 | 0.7053 |
0.7269 | 72.0 | 14544 | 0.8245 | 0.7050 |
0.6973 | 73.0 | 14746 | 0.8207 | 0.7095 |
0.7241 | 74.0 | 14948 | 0.8210 | 0.7057 |
0.7397 | 75.0 | 15150 | 0.8230 | 0.7060 |
0.6832 | 76.0 | 15352 | 0.8308 | 0.7057 |
0.7213 | 77.0 | 15554 | 0.8256 | 0.7025 |
0.7115 | 78.0 | 15756 | 0.8291 | 0.7057 |
0.688 | 79.0 | 15958 | 0.8337 | 0.7088 |
0.6997 | 80.0 | 16160 | 0.8312 | 0.7060 |
0.6924 | 81.0 | 16362 | 0.8321 | 0.7053 |
0.7382 | 82.0 | 16564 | 0.8340 | 0.7050 |
0.7513 | 83.0 | 16766 | 0.8320 | 0.7015 |
0.656 | 84.0 | 16968 | 0.8389 | 0.7053 |
0.6503 | 85.0 | 17170 | 0.8321 | 0.7085 |
0.6661 | 86.0 | 17372 | 0.8355 | 0.7092 |
0.7026 | 87.0 | 17574 | 0.8339 | 0.7088 |
0.76 | 88.0 | 17776 | 0.8361 | 0.7092 |
0.696 | 89.0 | 17978 | 0.8343 | 0.7106 |
0.6713 | 90.0 | 18180 | 0.8337 | 0.7106 |
0.6621 | 91.0 | 18382 | 0.8349 | 0.7057 |
0.7042 | 92.0 | 18584 | 0.8360 | 0.7085 |
0.7087 | 93.0 | 18786 | 0.8353 | 0.7085 |
0.64 | 94.0 | 18988 | 0.8371 | 0.7088 |
0.659 | 95.0 | 19190 | 0.8376 | 0.7071 |
0.6246 | 96.0 | 19392 | 0.8376 | 0.7088 |
0.6797 | 97.0 | 19594 | 0.8368 | 0.7092 |
0.6652 | 98.0 | 19796 | 0.8376 | 0.7092 |
0.629 | 99.0 | 19998 | 0.8370 | 0.7088 |
0.6762 | 100.0 | 20200 | 0.8366 | 0.7081 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0