---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: KcELECTRA-small-v2022-finetuned-in-vehicle
  results: []
---

# KcELECTRA-small-v2022-finetuned-in-vehicle

This model is a fine-tuned version of [beomi/KcELECTRA-small-v2022](https://huggingface.co/beomi/KcELECTRA-small-v2022) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5014
- Accuracy: 0.92
- F1: 0.9010

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 2.6201        | 1.0   | 38   | 2.5909          | 0.18     | 0.0549 |
| 2.5788        | 2.0   | 76   | 2.5466          | 0.18     | 0.0549 |
| 2.5397        | 3.0   | 114  | 2.4976          | 0.18     | 0.0549 |
| 2.4886        | 4.0   | 152  | 2.4178          | 0.3833   | 0.2516 |
| 2.4062        | 5.0   | 190  | 2.3038          | 0.4267   | 0.2575 |
| 2.3015        | 6.0   | 228  | 2.1798          | 0.4333   | 0.2746 |
| 2.1868        | 7.0   | 266  | 2.0589          | 0.52     | 0.4121 |
| 2.0713        | 8.0   | 304  | 1.9436          | 0.6133   | 0.5349 |
| 1.9763        | 9.0   | 342  | 1.8359          | 0.66     | 0.6048 |
| 1.8715        | 10.0  | 380  | 1.7361          | 0.72     | 0.6863 |
| 1.7755        | 11.0  | 418  | 1.6402          | 0.7233   | 0.6891 |
| 1.6873        | 12.0  | 456  | 1.5496          | 0.81     | 0.7774 |
| 1.5828        | 13.0  | 494  | 1.4681          | 0.8433   | 0.8089 |
| 1.5222        | 14.0  | 532  | 1.3870          | 0.84     | 0.8038 |
| 1.4397        | 15.0  | 570  | 1.3148          | 0.88     | 0.8554 |
| 1.3673        | 16.0  | 608  | 1.2461          | 0.89     | 0.8705 |
| 1.3047        | 17.0  | 646  | 1.1801          | 0.91     | 0.8903 |
| 1.2232        | 18.0  | 684  | 1.1209          | 0.9033   | 0.8844 |
| 1.1661        | 19.0  | 722  | 1.0618          | 0.9      | 0.8817 |
| 1.1104        | 20.0  | 760  | 1.0207          | 0.89     | 0.8660 |
| 1.0572        | 21.0  | 798  | 0.9679          | 0.8933   | 0.8725 |
| 1.0191        | 22.0  | 836  | 0.9243          | 0.8933   | 0.8722 |
| 0.9548        | 23.0  | 874  | 0.8850          | 0.8967   | 0.8757 |
| 0.9364        | 24.0  | 912  | 0.8429          | 0.9      | 0.8790 |
| 0.871         | 25.0  | 950  | 0.8094          | 0.8933   | 0.8724 |
| 0.8629        | 26.0  | 988  | 0.7773          | 0.8967   | 0.8746 |
| 0.7992        | 27.0  | 1026 | 0.7540          | 0.8933   | 0.8735 |
| 0.7948        | 28.0  | 1064 | 0.7234          | 0.8933   | 0.8704 |
| 0.7455        | 29.0  | 1102 | 0.6967          | 0.8967   | 0.8749 |
| 0.7236        | 30.0  | 1140 | 0.6760          | 0.91     | 0.8881 |
| 0.6905        | 31.0  | 1178 | 0.6519          | 0.9033   | 0.8832 |
| 0.6857        | 32.0  | 1216 | 0.6396          | 0.9133   | 0.8944 |
| 0.6526        | 33.0  | 1254 | 0.6155          | 0.9167   | 0.8963 |
| 0.6294        | 34.0  | 1292 | 0.6025          | 0.9033   | 0.8835 |
| 0.6179        | 35.0  | 1330 | 0.5909          | 0.9167   | 0.8970 |
| 0.6022        | 36.0  | 1368 | 0.5757          | 0.9133   | 0.8934 |
| 0.5753        | 37.0  | 1406 | 0.5610          | 0.92     | 0.8999 |
| 0.561         | 38.0  | 1444 | 0.5536          | 0.9167   | 0.8970 |
| 0.553         | 39.0  | 1482 | 0.5417          | 0.92     | 0.8998 |
| 0.5395        | 40.0  | 1520 | 0.5367          | 0.92     | 0.9018 |
| 0.5402        | 41.0  | 1558 | 0.5276          | 0.92     | 0.9018 |
| 0.5266        | 42.0  | 1596 | 0.5238          | 0.92     | 0.9010 |
| 0.5178        | 43.0  | 1634 | 0.5182          | 0.92     | 0.9018 |
| 0.52          | 44.0  | 1672 | 0.5129          | 0.92     | 0.9010 |
| 0.495         | 45.0  | 1710 | 0.5069          | 0.9167   | 0.8981 |
| 0.5124        | 46.0  | 1748 | 0.5054          | 0.9167   | 0.8981 |
| 0.5034        | 47.0  | 1786 | 0.5038          | 0.92     | 0.9018 |
| 0.5108        | 48.0  | 1824 | 0.5020          | 0.92     | 0.9018 |
| 0.483         | 49.0  | 1862 | 0.5016          | 0.92     | 0.9010 |
| 0.4974        | 50.0  | 1900 | 0.5014          | 0.92     | 0.9010 |
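The hyperparameters listed under "Training hyperparameters" correspond to a standard Transformers `Trainer` run. The sketch below shows one way they could be expressed with `TrainingArguments`; it is a minimal, hypothetical reconstruction, and the number of labels, the dataset objects, and the `compute_metrics` function are placeholders, since the training data and label set are not documented in this card.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Placeholder: the actual number of in-vehicle classes is not documented here.
NUM_LABELS = 2

tokenizer = AutoTokenizer.from_pretrained("beomi/KcELECTRA-small-v2022")
model = AutoModelForSequenceClassification.from_pretrained(
    "beomi/KcELECTRA-small-v2022", num_labels=NUM_LABELS
)

# Mirrors the hyperparameters listed above; the Adam settings
# (betas=(0.9, 0.999), epsilon=1e-08) and the linear schedule match
# the Trainer defaults.
training_args = TrainingArguments(
    output_dir="KcELECTRA-small-v2022-finetuned-in-vehicle",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=50,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",
)

# train_dataset, eval_dataset, and compute_metrics (accuracy/F1) are not
# documented for this model, so they are left as placeholders here.
# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_dataset,
#     eval_dataset=eval_dataset,
#     tokenizer=tokenizer,
#     compute_metrics=compute_metrics,
# )
# trainer.train()
```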
### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
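For reference, a minimal inference sketch with the `pipeline` API is shown below. The repository id is a placeholder (the namespace this checkpoint is published under is not given here), and the example utterance and returned label names are purely illustrative.

```python
from transformers import pipeline

# Placeholder repository id: substitute the namespace this checkpoint is
# actually published under.
classifier = pipeline(
    "text-classification",
    model="<namespace>/KcELECTRA-small-v2022-finetuned-in-vehicle",
)

# Illustrative Korean in-vehicle utterance ("please turn on the air conditioner").
result = classifier("에어컨 좀 켜 줘")
print(result)  # list of dicts with 'label' and 'score' keys; label names
               # depend on the id2label mapping in the model config
```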