
PhoBertLexical-finetuned_70KURL_not_host

This model is a fine-tuned version of vinai/phobert-base-v2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4681
  • Accuracy: 0.9143
  • F1: 0.9147
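
The card ships without a usage example; the sketch below shows one plausible way to run inference, assuming the checkpoint is published as gechim/PhoBertLexical-finetuned_70KURL_not_host (the repository named at the bottom of this card) and loads with a standard sequence-classification head. The input URL and the label interpretation are illustrative only, since the training data and label names are not documented here; if the "Lexical" part of the name implies a custom head, the original training code may be needed instead.

```python
# Minimal inference sketch (assumptions: standard classification head,
# repository id taken from this card's model tree; labels undocumented).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "gechim/PhoBertLexical-finetuned_70KURL_not_host"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

url = "http://example.com/account/verify"  # hypothetical input URL
inputs = tokenizer(url, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = int(logits.argmax(dim=-1))
# Check id2label in the checkpoint's config.json before interpreting this.
print(pred, model.config.id2label.get(pred, str(pred)))
```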

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
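
A sketch of how these values map onto transformers.TrainingArguments; the output path and evaluation schedule are assumptions, and note that the Trainer's default optimizer in this Transformers version is AdamW, which the card reports as "Adam":

```python
# Hyperparameter sketch only; dataset, model, and metrics wiring omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="phobert-url-finetune",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,      # betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="steps",  # the results table evaluates every 200 steps
    eval_steps=200,
)
```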

Training results

In the table below, "No log" means the training loss had not yet been logged at that evaluation step (the Trainer logs it every 500 steps by default).

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
| No log        | 0.2326  | 200   | 0.2934          | 0.8767   | 0.8768 |
| No log        | 0.4651  | 400   | 0.2560          | 0.8910   | 0.8910 |
| No log        | 0.6977  | 600   | 0.2634          | 0.8938   | 0.8953 |
| No log        | 0.9302  | 800   | 0.2293          | 0.9010   | 0.9018 |
| 0.3037        | 1.1628  | 1000  | 0.2435          | 0.8990   | 0.9010 |
| 0.3037        | 1.3953  | 1200  | 0.2281          | 0.9073   | 0.9078 |
| 0.3037        | 1.6279  | 1400  | 0.2286          | 0.9038   | 0.9056 |
| 0.3037        | 1.8605  | 1600  | 0.2566          | 0.8971   | 0.8998 |
| 0.2196        | 2.0930  | 1800  | 0.2380          | 0.9102   | 0.9093 |
| 0.2196        | 2.3256  | 2000  | 0.2344          | 0.9044   | 0.9062 |
| 0.2196        | 2.5581  | 2200  | 0.2245          | 0.9108   | 0.9121 |
| 0.2196        | 2.7907  | 2400  | 0.2144          | 0.9108   | 0.9113 |
| 0.1815        | 3.0233  | 2600  | 0.2328          | 0.9123   | 0.9125 |
| 0.1815        | 3.2558  | 2800  | 0.2300          | 0.9073   | 0.9076 |
| 0.1815        | 3.4884  | 3000  | 0.2308          | 0.9128   | 0.9131 |
| 0.1815        | 3.7209  | 3200  | 0.2210          | 0.9104   | 0.9118 |
| 0.1815        | 3.9535  | 3400  | 0.2362          | 0.9120   | 0.9134 |
| 0.1522        | 4.1860  | 3600  | 0.2372          | 0.9156   | 0.9165 |
| 0.1522        | 4.4186  | 3800  | 0.2270          | 0.9158   | 0.9159 |
| 0.1522        | 4.6512  | 4000  | 0.2409          | 0.9148   | 0.9147 |
| 0.1522        | 4.8837  | 4200  | 0.2461          | 0.9138   | 0.9147 |
| 0.1274        | 5.1163  | 4400  | 0.2397          | 0.9143   | 0.9150 |
| 0.1274        | 5.3488  | 4600  | 0.2467          | 0.9175   | 0.9178 |
| 0.1274        | 5.5814  | 4800  | 0.2394          | 0.9144   | 0.9144 |
| 0.1274        | 5.8140  | 5000  | 0.2712          | 0.9153   | 0.9163 |
| 0.1062        | 6.0465  | 5200  | 0.2551          | 0.9131   | 0.9136 |
| 0.1062        | 6.2791  | 5400  | 0.2861          | 0.9147   | 0.9152 |
| 0.1062        | 6.5116  | 5600  | 0.2834          | 0.9094   | 0.9109 |
| 0.1062        | 6.7442  | 5800  | 0.2624          | 0.9161   | 0.9167 |
| 0.1062        | 6.9767  | 6000  | 0.2752          | 0.9142   | 0.9139 |
| 0.0911        | 7.2093  | 6200  | 0.2836          | 0.9145   | 0.9152 |
| 0.0911        | 7.4419  | 6400  | 0.3296          | 0.9107   | 0.9116 |
| 0.0911        | 7.6744  | 6600  | 0.2723          | 0.9126   | 0.9132 |
| 0.0911        | 7.9070  | 6800  | 0.3090          | 0.9108   | 0.9119 |
| 0.0764        | 8.1395  | 7000  | 0.3215          | 0.9120   | 0.9128 |
| 0.0764        | 8.3721  | 7200  | 0.2989          | 0.9150   | 0.9156 |
| 0.0764        | 8.6047  | 7400  | 0.3112          | 0.9132   | 0.9137 |
| 0.0764        | 8.8372  | 7600  | 0.3274          | 0.9158   | 0.9154 |
| 0.0668        | 9.0698  | 7800  | 0.3578          | 0.9150   | 0.9156 |
| 0.0668        | 9.3023  | 8000  | 0.3280          | 0.9181   | 0.9180 |
| 0.0668        | 9.5349  | 8200  | 0.3267          | 0.9137   | 0.9140 |
| 0.0668        | 9.7674  | 8400  | 0.3287          | 0.9152   | 0.9159 |
| 0.0584        | 10.0    | 8600  | 0.3252          | 0.9156   | 0.9158 |
| 0.0584        | 10.2326 | 8800  | 0.3461          | 0.9113   | 0.9118 |
| 0.0584        | 10.4651 | 9000  | 0.3692          | 0.9150   | 0.9150 |
| 0.0584        | 10.6977 | 9200  | 0.3772          | 0.9134   | 0.9138 |
| 0.0584        | 10.9302 | 9400  | 0.3703          | 0.9117   | 0.9126 |
| 0.0521        | 11.1628 | 9600  | 0.3797          | 0.9168   | 0.9168 |
| 0.0521        | 11.3953 | 9800  | 0.3760          | 0.9130   | 0.9135 |
| 0.0521        | 11.6279 | 10000 | 0.3772          | 0.9148   | 0.9150 |
| 0.0521        | 11.8605 | 10200 | 0.3837          | 0.9146   | 0.9152 |
| 0.0444        | 12.0930 | 10400 | 0.3952          | 0.9158   | 0.9156 |
| 0.0444        | 12.3256 | 10600 | 0.4115          | 0.9132   | 0.9141 |
| 0.0444        | 12.5581 | 10800 | 0.4155          | 0.9108   | 0.9121 |
| 0.0444        | 12.7907 | 11000 | 0.4069          | 0.9139   | 0.9138 |
| 0.0413        | 13.0233 | 11200 | 0.4188          | 0.9130   | 0.9140 |
| 0.0413        | 13.2558 | 11400 | 0.4288          | 0.9113   | 0.9109 |
| 0.0413        | 13.4884 | 11600 | 0.4140          | 0.9149   | 0.9147 |
| 0.0413        | 13.7209 | 11800 | 0.4266          | 0.9165   | 0.9165 |
| 0.0413        | 13.9535 | 12000 | 0.4250          | 0.9108   | 0.9116 |
| 0.0347        | 14.1860 | 12200 | 0.4581          | 0.9117   | 0.9124 |
| 0.0347        | 14.4186 | 12400 | 0.4218          | 0.9133   | 0.9129 |
| 0.0347        | 14.6512 | 12600 | 0.4399          | 0.9118   | 0.9127 |
| 0.0347        | 14.8837 | 12800 | 0.4258          | 0.9131   | 0.9129 |
| 0.0324        | 15.1163 | 13000 | 0.4513          | 0.9142   | 0.9145 |
| 0.0324        | 15.3488 | 13200 | 0.4466          | 0.9132   | 0.9135 |
| 0.0324        | 15.5814 | 13400 | 0.4249          | 0.9150   | 0.9155 |
| 0.0324        | 15.8140 | 13600 | 0.4434          | 0.9128   | 0.9136 |
| 0.0307        | 16.0465 | 13800 | 0.4520          | 0.9133   | 0.9133 |
| 0.0307        | 16.2791 | 14000 | 0.4532          | 0.9142   | 0.9142 |
| 0.0307        | 16.5116 | 14200 | 0.4465          | 0.9140   | 0.9144 |
| 0.0307        | 16.7442 | 14400 | 0.4560          | 0.9152   | 0.9154 |
| 0.0307        | 16.9767 | 14600 | 0.4475          | 0.9141   | 0.9146 |
| 0.0278        | 17.2093 | 14800 | 0.4641          | 0.9148   | 0.9151 |
| 0.0278        | 17.4419 | 15000 | 0.4693          | 0.9125   | 0.9132 |
| 0.0278        | 17.6744 | 15200 | 0.4597          | 0.9131   | 0.9135 |
| 0.0278        | 17.9070 | 15400 | 0.4585          | 0.9143   | 0.9144 |
| 0.0259        | 18.1395 | 15600 | 0.4686          | 0.9131   | 0.9138 |
| 0.0259        | 18.3721 | 15800 | 0.4713          | 0.9144   | 0.9150 |
| 0.0259        | 18.6047 | 16000 | 0.4620          | 0.9134   | 0.9136 |
| 0.0259        | 18.8372 | 16200 | 0.4610          | 0.9142   | 0.9145 |
| 0.0232        | 19.0698 | 16400 | 0.4613          | 0.9152   | 0.9156 |
| 0.0232        | 19.3023 | 16600 | 0.4637          | 0.9151   | 0.9155 |
| 0.0232        | 19.5349 | 16800 | 0.4686          | 0.9143   | 0.9147 |
| 0.0232        | 19.7674 | 17000 | 0.4691          | 0.9146   | 0.9150 |
| 0.0214        | 20.0    | 17200 | 0.4681          | 0.9143   | 0.9147 |
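
The card does not state how Accuracy and F1 were computed. A common compute_metrics hook that would produce these two columns is sketched below; the F1 averaging mode is an assumption, as it is not documented.

```python
# Plausible compute_metrics hook for the table above (averaging mode assumed).
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),  # assumption
    }
```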

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.1.2
  • Datasets 2.19.1
  • Tokenizers 0.19.1
