---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-base-GTZAN
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7566137566137566
---

# vit-base-GTZAN

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set (matching the checkpoint at step 550 in the training results table below):

- Loss: 0.8328
- Accuracy: 0.7566
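
The snippet below is a minimal inference sketch. The repo id `cbertrand/vit-base-GTZAN` is an assumption inferred from the model name, and the input file name is a placeholder; the model expects the same kind of spectrogram images it was trained on.

```python
from transformers import pipeline

# Assumed Hub repo id; substitute the actual path of this checkpoint.
classifier = pipeline("image-classification", model="cbertrand/vit-base-GTZAN")

# Placeholder input: a spectrogram image file of the kind used for training.
predictions = classifier("spectrogram.png")
print(predictions)  # list of {'label': ..., 'score': ...} dicts, best first
```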

## Model description

The base model is a Vision Transformer (ViT) pre-trained on ImageNet-21k, fine-tuned here as an image classifier. Given the model name, the training images are presumably spectrograms derived from the GTZAN music-genre dataset, so the classifier assigns a genre label to a spectrogram image; the exact preprocessing pipeline is not documented.

## Intended uses & limitations

More information needed

## Training and evaluation data

Per the metadata, the data was loaded with the 🤗 Datasets `imagefolder` builder (default config), and the reported metric is computed on the configured evaluation split. How the images were produced and how the train/validation split was made are not documented; a loading sketch follows.
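
For reference, this is a sketch of how an `imagefolder` dataset is typically loaded; the `data` directory and its one-subfolder-per-genre layout are assumptions, not details taken from this card.

```python
from datasets import load_dataset

# Assumed layout: data/<genre>/<image>, e.g. data/blues/blues_00001.png.
dataset = load_dataset("imagefolder", data_dir="data")

# The builder infers class labels from the subdirectory names.
print(dataset["train"].features["label"].names)
```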

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
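
As a sketch, the hyperparameters above map onto `TrainingArguments` as follows; `output_dir` and anything not listed on this card are assumptions or library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-GTZAN",     # assumed; not stated on the card
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=16,
    lr_scheduler_type="linear",
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)
```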

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.3756 | 0.09 | 10 | 2.2861 | 0.2116 |
| 2.3051 | 0.19 | 20 | 2.1907 | 0.3439 |
| 2.1219 | 0.28 | 30 | 2.0214 | 0.3175 |
| 2.0542 | 0.37 | 40 | 1.9059 | 0.4074 |
| 1.8132 | 0.47 | 50 | 1.8472 | 0.3862 |
| 1.8854 | 0.56 | 60 | 1.6832 | 0.4603 |
| 1.6981 | 0.65 | 70 | 1.6008 | 0.4974 |
| 1.5251 | 0.75 | 80 | 1.4685 | 0.5026 |
| 1.4463 | 0.84 | 90 | 1.3713 | 0.6138 |
| 1.4335 | 0.93 | 100 | 1.4270 | 0.4974 |
| 1.1147 | 1.03 | 110 | 1.2793 | 0.5926 |
| 1.3568 | 1.12 | 120 | 1.3360 | 0.5661 |
| 1.3077 | 1.21 | 130 | 1.4520 | 0.5079 |
| 1.2801 | 1.31 | 140 | 1.2765 | 0.5661 |
| 1.2894 | 1.4 | 150 | 1.1949 | 0.6138 |
| 1.2657 | 1.5 | 160 | 1.1937 | 0.6349 |
| 0.8784 | 1.59 | 170 | 1.2190 | 0.6032 |
| 1.1575 | 1.68 | 180 | 1.2268 | 0.6138 |
| 0.9848 | 1.78 | 190 | 1.0572 | 0.6561 |
| 0.9409 | 1.87 | 200 | 1.1609 | 0.6349 |
| 0.9448 | 1.96 | 210 | 1.2327 | 0.6085 |
| 1.0819 | 2.06 | 220 | 1.1699 | 0.5820 |
| 0.7485 | 2.15 | 230 | 1.1041 | 0.6508 |
| 0.8934 | 2.24 | 240 | 1.1672 | 0.5873 |
| 0.8609 | 2.34 | 250 | 1.1900 | 0.6190 |
| 0.7935 | 2.43 | 260 | 1.0623 | 0.6402 |
| 0.8013 | 2.52 | 270 | 0.9873 | 0.6878 |
| 0.6669 | 2.62 | 280 | 1.0078 | 0.6561 |
| 0.7847 | 2.71 | 290 | 1.1484 | 0.6085 |
| 0.7222 | 2.8 | 300 | 1.1295 | 0.6243 |
| 0.7844 | 2.9 | 310 | 0.9414 | 0.7249 |
| 0.8057 | 2.99 | 320 | 1.0504 | 0.6667 |
| 0.4843 | 3.08 | 330 | 0.9874 | 0.6508 |
| 0.6766 | 3.18 | 340 | 1.1496 | 0.6508 |
| 0.4818 | 3.27 | 350 | 1.0968 | 0.6878 |
| 0.5351 | 3.36 | 360 | 1.1394 | 0.6296 |
| 0.5035 | 3.46 | 370 | 0.9815 | 0.7090 |
| 0.4032 | 3.55 | 380 | 1.0882 | 0.6402 |
| 0.639 | 3.64 | 390 | 1.2611 | 0.6085 |
| 0.5156 | 3.74 | 400 | 1.0376 | 0.6561 |
| 0.4884 | 3.83 | 410 | 0.9506 | 0.6984 |
| 0.5875 | 3.93 | 420 | 0.8479 | 0.7513 |
| 0.6982 | 4.02 | 430 | 1.0895 | 0.6825 |
| 0.3966 | 4.11 | 440 | 0.9709 | 0.6984 |
| 0.377 | 4.21 | 450 | 0.9754 | 0.6772 |
| 0.3417 | 4.3 | 460 | 1.1687 | 0.6508 |
| 0.336 | 4.39 | 470 | 0.9826 | 0.6984 |
| 0.5201 | 4.49 | 480 | 1.1770 | 0.6614 |
| 0.1737 | 4.58 | 490 | 1.0491 | 0.6878 |
| 0.2545 | 4.67 | 500 | 1.1352 | 0.6984 |
| 0.3752 | 4.77 | 510 | 1.0300 | 0.6931 |
| 0.3667 | 4.86 | 520 | 1.0355 | 0.6825 |
| 0.2797 | 4.95 | 530 | 0.9882 | 0.6984 |
| 0.1646 | 5.05 | 540 | 1.0728 | 0.6984 |
| 0.2199 | 5.14 | 550 | 0.8328 | 0.7566 |
| 0.2191 | 5.23 | 560 | 0.9280 | 0.7460 |
| 0.12 | 5.33 | 570 | 1.0978 | 0.7037 |
| 0.2608 | 5.42 | 580 | 1.1158 | 0.6878 |
| 0.2 | 5.51 | 590 | 1.0873 | 0.7354 |
| 0.1899 | 5.61 | 600 | 1.0560 | 0.7143 |
| 0.1113 | 5.7 | 610 | 1.1144 | 0.7037 |
| 0.2279 | 5.79 | 620 | 1.2535 | 0.6667 |
| 0.1563 | 5.89 | 630 | 1.0803 | 0.7354 |
| 0.2182 | 5.98 | 640 | 1.3904 | 0.6349 |
| 0.1781 | 6.07 | 650 | 1.3461 | 0.6720 |
| 0.1395 | 6.17 | 660 | 1.2769 | 0.6825 |
| 0.2308 | 6.26 | 670 | 1.2213 | 0.6931 |
| 0.1899 | 6.36 | 680 | 1.0948 | 0.7143 |
| 0.1702 | 6.45 | 690 | 1.2383 | 0.6931 |
| 0.1055 | 6.54 | 700 | 1.4010 | 0.6349 |
| 0.1151 | 6.64 | 710 | 1.2607 | 0.6720 |
| 0.2415 | 6.73 | 720 | 1.0520 | 0.7302 |
| 0.117 | 6.82 | 730 | 1.0548 | 0.7354 |
| 0.184 | 6.92 | 740 | 1.1872 | 0.6984 |
| 0.1997 | 7.01 | 750 | 1.1128 | 0.7249 |
| 0.0645 | 7.1 | 760 | 1.1514 | 0.6984 |
| 0.1025 | 7.2 | 770 | 1.2252 | 0.7037 |
| 0.0407 | 7.29 | 780 | 1.0571 | 0.7513 |
| 0.1752 | 7.38 | 790 | 1.0812 | 0.7354 |
| 0.1143 | 7.48 | 800 | 1.2182 | 0.7143 |
| 0.1542 | 7.57 | 810 | 1.1789 | 0.7143 |
| 0.0859 | 7.66 | 820 | 1.1392 | 0.7196 |
| 0.119 | 7.76 | 830 | 1.1568 | 0.7354 |
| 0.0913 | 7.85 | 840 | 1.1097 | 0.6984 |
| 0.085 | 7.94 | 850 | 1.1189 | 0.7460 |
| 0.0201 | 8.04 | 860 | 1.1283 | 0.7143 |
| 0.0509 | 8.13 | 870 | 1.1005 | 0.7407 |
| 0.0326 | 8.22 | 880 | 1.0490 | 0.7302 |
| 0.0728 | 8.32 | 890 | 1.2511 | 0.7196 |
| 0.0486 | 8.41 | 900 | 1.1833 | 0.7143 |
| 0.0645 | 8.5 | 910 | 0.9881 | 0.7725 |
| 0.0194 | 8.6 | 920 | 1.0412 | 0.7566 |
| 0.0215 | 8.69 | 930 | 1.2485 | 0.7196 |
| 0.0853 | 8.79 | 940 | 1.0864 | 0.7672 |
| 0.0412 | 8.88 | 950 | 1.1796 | 0.7249 |
| 0.0645 | 8.97 | 960 | 1.3152 | 0.6878 |
| 0.0654 | 9.07 | 970 | 1.2789 | 0.6931 |
| 0.0352 | 9.16 | 980 | 1.1928 | 0.7196 |
| 0.0137 | 9.25 | 990 | 1.1643 | 0.7354 |
| 0.0227 | 9.35 | 1000 | 1.2256 | 0.7143 |
| 0.0391 | 9.44 | 1010 | 1.2089 | 0.7196 |
| 0.0163 | 9.53 | 1020 | 1.3880 | 0.6931 |
| 0.0225 | 9.63 | 1030 | 1.3944 | 0.6931 |
| 0.0348 | 9.72 | 1040 | 1.3257 | 0.7143 |
| 0.0354 | 9.81 | 1050 | 1.1538 | 0.7460 |
| 0.0412 | 9.91 | 1060 | 1.2372 | 0.7249 |
| 0.055 | 10.0 | 1070 | 1.2266 | 0.7090 |
| 0.0115 | 10.09 | 1080 | 1.2353 | 0.7249 |
| 0.011 | 10.19 | 1090 | 1.2655 | 0.7249 |
| 0.0105 | 10.28 | 1100 | 1.2831 | 0.7354 |
| 0.0248 | 10.37 | 1110 | 1.3138 | 0.7143 |
| 0.0287 | 10.47 | 1120 | 1.2472 | 0.7196 |
| 0.017 | 10.56 | 1130 | 1.1517 | 0.7619 |
| 0.0326 | 10.65 | 1140 | 1.1729 | 0.7513 |
| 0.0298 | 10.75 | 1150 | 1.1991 | 0.7460 |
| 0.0087 | 10.84 | 1160 | 1.1965 | 0.7196 |
| 0.0104 | 10.93 | 1170 | 1.2006 | 0.7302 |
| 0.0176 | 11.03 | 1180 | 1.2819 | 0.7196 |
| 0.0088 | 11.12 | 1190 | 1.2860 | 0.7249 |
| 0.0218 | 11.21 | 1200 | 1.1996 | 0.7407 |
| 0.011 | 11.31 | 1210 | 1.1905 | 0.7407 |
| 0.0195 | 11.4 | 1220 | 1.1777 | 0.7460 |
| 0.012 | 11.5 | 1230 | 1.1417 | 0.7566 |
| 0.0075 | 11.59 | 1240 | 1.1429 | 0.7619 |
| 0.0131 | 11.68 | 1250 | 1.1381 | 0.7672 |
| 0.0078 | 11.78 | 1260 | 1.1562 | 0.7566 |
| 0.0071 | 11.87 | 1270 | 1.1708 | 0.7619 |
| 0.04 | 11.96 | 1280 | 1.1965 | 0.7513 |
| 0.0066 | 12.06 | 1290 | 1.2295 | 0.7354 |
| 0.0179 | 12.15 | 1300 | 1.2337 | 0.7354 |
| 0.0072 | 12.24 | 1310 | 1.2376 | 0.7407 |
| 0.0189 | 12.34 | 1320 | 1.2402 | 0.7354 |
| 0.0067 | 12.43 | 1330 | 1.2426 | 0.7407 |
| 0.014 | 12.52 | 1340 | 1.2199 | 0.7460 |
| 0.0065 | 12.62 | 1350 | 1.2070 | 0.7513 |
| 0.0119 | 12.71 | 1360 | 1.2172 | 0.7513 |
| 0.0065 | 12.8 | 1370 | 1.2299 | 0.7460 |
| 0.0139 | 12.9 | 1380 | 1.2095 | 0.7513 |
| 0.0195 | 12.99 | 1390 | 1.1914 | 0.7513 |
| 0.0102 | 13.08 | 1400 | 1.1972 | 0.7513 |
| 0.0162 | 13.18 | 1410 | 1.2006 | 0.7566 |
| 0.0057 | 13.27 | 1420 | 1.2135 | 0.7566 |
| 0.0099 | 13.36 | 1430 | 1.2060 | 0.7566 |
| 0.0092 | 13.46 | 1440 | 1.2094 | 0.7513 |
| 0.0059 | 13.55 | 1450 | 1.2153 | 0.7460 |
| 0.0132 | 13.64 | 1460 | 1.2271 | 0.7513 |
| 0.0224 | 13.74 | 1470 | 1.2394 | 0.7460 |
| 0.0116 | 13.83 | 1480 | 1.2354 | 0.7460 |
| 0.0096 | 13.93 | 1490 | 1.2316 | 0.7460 |
| 0.0055 | 14.02 | 1500 | 1.2332 | 0.7460 |
| 0.009 | 14.11 | 1510 | 1.2355 | 0.7460 |
| 0.0058 | 14.21 | 1520 | 1.2447 | 0.7460 |
| 0.01 | 14.3 | 1530 | 1.2437 | 0.7460 |
| 0.0055 | 14.39 | 1540 | 1.2422 | 0.7460 |
| 0.0187 | 14.49 | 1550 | 1.2215 | 0.7513 |
| 0.0103 | 14.58 | 1560 | 1.2178 | 0.7513 |
| 0.0053 | 14.67 | 1570 | 1.2217 | 0.7460 |
| 0.01 | 14.77 | 1580 | 1.2267 | 0.7460 |
| 0.0238 | 14.86 | 1590 | 1.2279 | 0.7460 |
| 0.0091 | 14.95 | 1600 | 1.2242 | 0.7460 |
| 0.0053 | 15.05 | 1610 | 1.2232 | 0.7513 |
| 0.0101 | 15.14 | 1620 | 1.2257 | 0.7460 |
| 0.0189 | 15.23 | 1630 | 1.2277 | 0.7460 |
| 0.0056 | 15.33 | 1640 | 1.2336 | 0.7460 |
| 0.0052 | 15.42 | 1650 | 1.2353 | 0.7460 |
| 0.0054 | 15.51 | 1660 | 1.2359 | 0.7460 |
| 0.0054 | 15.61 | 1670 | 1.2362 | 0.7460 |
| 0.0102 | 15.7 | 1680 | 1.2348 | 0.7513 |
| 0.0193 | 15.79 | 1690 | 1.2326 | 0.7513 |
| 0.0104 | 15.89 | 1700 | 1.2315 | 0.7513 |
| 0.0095 | 15.98 | 1710 | 1.2312 | 0.7513 |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.2.0.dev20230912+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3