
Visualize in Weights & Biases

PhoBert_Lexical_Dataset51KBoDuoiWithNewLexical_15epoch

This model is a fine-tuned version of vinai/phobert-base-v2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7848
  • Accuracy: 0.8402
  • F1: 0.8394
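
The card reports accuracy and F1 but does not state which F1 averaging was used. As a minimal sketch of how these metrics are typically computed for multi-class classification (assuming support-weighted F1; the toy labels below are illustrative, not from the evaluation set):

```python
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred):
    """Per-class F1, averaged with each class weighted by its support."""
    support = Counter(y_true)
    total = 0.0
    for c, n in support.items():
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(p == c and t != c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += n * f1
    return total / len(y_true)

y_true = [0, 0, 1, 1, 2]
y_pred = [0, 1, 1, 1, 2]
print(accuracy(y_true, y_pred))               # 0.8
print(round(weighted_f1(y_true, y_pred), 4))  # 0.7867
```

This mirrors `sklearn.metrics.f1_score(..., average="weighted")`; if the training script used macro or micro averaging instead, the weighting step would differ.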

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
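
The `lr_scheduler_type: linear` setting means the learning rate decays linearly from `learning_rate` to 0 over training. A minimal sketch of that schedule (no warmup steps are listed, so none are assumed; the total-step count below is only a rough inference from the results table, roughly 818 optimizer steps per epoch over 15 epochs):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr down to 0 (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

total_steps = 12270  # illustrative estimate: ~818 steps/epoch * 15 epochs
print(linear_lr(0, total_steps))            # 2e-05 at the first step
print(linear_lr(total_steps, total_steps))  # 0.0 at the final step
```

This corresponds to what `transformers` sets up via `get_linear_schedule_with_warmup` with zero warmup steps, applied on top of the Adam optimizer listed above.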

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
| No log | 0.2445 | 200 | 0.5985 | 0.7254 | 0.7177 |
| No log | 0.4890 | 400 | 0.5662 | 0.7448 | 0.7339 |
| No log | 0.7335 | 600 | 0.5751 | 0.7506 | 0.7453 |
| No log | 0.9780 | 800 | 0.5838 | 0.7475 | 0.7442 |
| 0.3505 | 1.2225 | 1000 | 0.5789 | 0.7646 | 0.7597 |
| 0.3505 | 1.4670 | 1200 | 0.5104 | 0.7825 | 0.7796 |
| 0.3505 | 1.7115 | 1400 | 0.5660 | 0.7764 | 0.7708 |
| 0.3505 | 1.9560 | 1600 | 0.5603 | 0.7826 | 0.7764 |
| 0.2546 | 2.2005 | 1800 | 0.5996 | 0.7753 | 0.7732 |
| 0.2546 | 2.4450 | 2000 | 0.5646 | 0.7895 | 0.7829 |
| 0.2546 | 2.6895 | 2200 | 0.5774 | 0.7837 | 0.7791 |
| 0.2546 | 2.9340 | 2400 | 0.5415 | 0.7909 | 0.7873 |
| 0.2084 | 3.1785 | 2600 | 0.5848 | 0.7894 | 0.7866 |
| 0.2084 | 3.4230 | 2800 | 0.5447 | 0.8004 | 0.7960 |
| 0.2084 | 3.6675 | 3000 | 0.6179 | 0.7749 | 0.7729 |
| 0.2084 | 3.9120 | 3200 | 0.5892 | 0.7961 | 0.7945 |
| 0.1784 | 4.1565 | 3400 | 0.6669 | 0.7839 | 0.7830 |
| 0.1784 | 4.4010 | 3600 | 0.5824 | 0.7925 | 0.7913 |
| 0.1784 | 4.6455 | 3800 | 0.6170 | 0.8012 | 0.7974 |
| 0.1784 | 4.8900 | 4000 | 0.6482 | 0.7891 | 0.7891 |
| 0.1516 | 5.1345 | 4200 | 0.6269 | 0.8023 | 0.8006 |
| 0.1516 | 5.3790 | 4400 | 0.6487 | 0.8008 | 0.7990 |
| 0.1516 | 5.6235 | 4600 | 0.5999 | 0.8124 | 0.8094 |
| 0.1516 | 5.8680 | 4800 | 0.6239 | 0.8108 | 0.8091 |
| 0.1286 | 6.1125 | 5000 | 0.6551 | 0.8110 | 0.8094 |
| 0.1286 | 6.3570 | 5200 | 0.6106 | 0.8194 | 0.8155 |
| 0.1286 | 6.6015 | 5400 | 0.6491 | 0.8108 | 0.8110 |
| 0.1286 | 6.8460 | 5600 | 0.5692 | 0.8270 | 0.8247 |
| 0.11 | 7.0905 | 5800 | 0.6259 | 0.8263 | 0.8232 |
| 0.11 | 7.3350 | 6000 | 0.6865 | 0.8164 | 0.8154 |
| 0.11 | 7.5795 | 6200 | 0.7079 | 0.8170 | 0.8166 |
| 0.11 | 7.8240 | 6400 | 0.6968 | 0.8158 | 0.8152 |
| 0.0933 | 8.0685 | 6600 | 0.6568 | 0.8285 | 0.8265 |
| 0.0933 | 8.3130 | 6800 | 0.6513 | 0.8309 | 0.8297 |
| 0.0933 | 8.5575 | 7000 | 0.6665 | 0.8331 | 0.8317 |
| 0.0933 | 8.8020 | 7200 | 0.6384 | 0.8259 | 0.8244 |
| 0.0816 | 9.0465 | 7400 | 0.7175 | 0.8271 | 0.8264 |
| 0.0816 | 9.2910 | 7600 | 0.7558 | 0.8187 | 0.8185 |
| 0.0816 | 9.5355 | 7800 | 0.6997 | 0.8281 | 0.8269 |
| 0.0816 | 9.7800 | 8000 | 0.7126 | 0.8298 | 0.8290 |
| 0.0717 | 10.0244 | 8200 | 0.6864 | 0.8388 | 0.8377 |
| 0.0717 | 10.2689 | 8400 | 0.7268 | 0.8319 | 0.8311 |
| 0.0717 | 10.5134 | 8600 | 0.6949 | 0.8398 | 0.8388 |
| 0.0717 | 10.7579 | 8800 | 0.7236 | 0.8393 | 0.8386 |
| 0.063 | 11.0024 | 9000 | 0.6981 | 0.8396 | 0.8383 |
| 0.063 | 11.2469 | 9200 | 0.8012 | 0.8271 | 0.8273 |
| 0.063 | 11.4914 | 9400 | 0.7489 | 0.8283 | 0.8278 |
| 0.063 | 11.7359 | 9600 | 0.7293 | 0.8358 | 0.8349 |
| 0.063 | 11.9804 | 9800 | 0.7780 | 0.8336 | 0.8330 |
| 0.0554 | 12.2249 | 10000 | 0.7075 | 0.8457 | 0.8445 |
| 0.0554 | 12.4694 | 10200 | 0.7652 | 0.8370 | 0.8361 |
| 0.0554 | 12.7139 | 10400 | 0.7281 | 0.8453 | 0.8439 |
| 0.0554 | 12.9584 | 10600 | 0.7350 | 0.8471 | 0.8458 |
| 0.0508 | 13.2029 | 10800 | 0.7775 | 0.8396 | 0.8390 |
| 0.0508 | 13.4474 | 11000 | 0.7786 | 0.8385 | 0.8376 |
| 0.0508 | 13.6919 | 11200 | 0.7719 | 0.8400 | 0.8392 |
| 0.0508 | 13.9364 | 11400 | 0.7847 | 0.8356 | 0.8349 |
| 0.0431 | 14.1809 | 11600 | 0.7546 | 0.8446 | 0.8435 |
| 0.0431 | 14.4254 | 11800 | 0.7576 | 0.8482 | 0.8469 |
| 0.0431 | 14.6699 | 12000 | 0.7831 | 0.8391 | 0.8383 |
| 0.0431 | 14.9144 | 12200 | 0.7848 | 0.8402 | 0.8394 |

Framework versions

  • Transformers 4.43.1
  • Pytorch 2.1.2
  • Datasets 2.20.0
  • Tokenizers 0.19.1
Model size

  • 135M params (Safetensors, F32)