---
base_model: vinai/phobert-base-v2
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: PhoBert_Lexical_lc
    results: []
---

# PhoBert_Lexical_lc

This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on an unknown dataset. It achieves the following results on the evaluation set (these figures match the step-6400 checkpoint in the training log below, which has the highest accuracy and F1):

- Loss: 0.6002
- Accuracy: 0.8324
- F1: 0.8697
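
The card ships without a usage example, so here is a minimal inference sketch. The checkpoint identifier and the label mapping are assumptions (the dataset and task labels are not documented), and the input is shown pre-segmented because PhoBERT models generally expect word-segmented Vietnamese text (e.g. produced by VnCoreNLP):

```python
# Minimal inference sketch. Assumptions: "PhoBert_Lexical_lc" is a local
# checkpoint directory or Hub repo id, the input text is already
# word-segmented, and the label mapping is whatever the config carries.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "PhoBert_Lexical_lc"  # placeholder: path or Hub repo id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# PhoBERT expects word-segmented input; underscores join multi-syllable words.
text = "Tôi là sinh_viên trường đại_học ."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, pred))  # label names are unknown
```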

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them follows the list):

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
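
The original training script is not included in the card; the snippet below is a hedged reconstruction of the configuration above. `output_dir` and the 200-step evaluation interval are assumptions (the interval is inferred from the log below, which evaluates every 200 steps):

```python
# Hedged reconstruction of the training configuration listed above.
# output_dir and eval_steps are inferred, not documented.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="PhoBert_Lexical_lc",  # assumption: checkpoint directory name
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    eval_strategy="steps",  # the log below evaluates every 200 steps
    eval_steps=200,
)
```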

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
| No log        | 0.1927  | 200   | 0.7068          | 0.6753   | 0.7632 |
| No log        | 0.3854  | 400   | 0.7343          | 0.6772   | 0.7646 |
| No log        | 0.5780  | 600   | 0.5764          | 0.7641   | 0.8248 |
| No log        | 0.7707  | 800   | 0.7871          | 0.6153   | 0.7178 |
| No log        | 0.9634  | 1000  | 0.5685          | 0.7548   | 0.8186 |
| 0.358         | 1.1561  | 1200  | 0.6231          | 0.7569   | 0.8203 |
| 0.358         | 1.3487  | 1400  | 0.5796          | 0.7737   | 0.8314 |
| 0.358         | 1.5414  | 1600  | 0.5651          | 0.7758   | 0.8327 |
| 0.358         | 1.7341  | 1800  | 0.6171          | 0.7502   | 0.8157 |
| 0.358         | 1.9268  | 2000  | 0.5711          | 0.7645   | 0.8254 |
| 0.2472        | 2.1195  | 2200  | 0.6046          | 0.7615   | 0.8235 |
| 0.2472        | 2.3121  | 2400  | 0.8503          | 0.6871   | 0.7718 |
| 0.2472        | 2.5048  | 2600  | 0.7907          | 0.7136   | 0.7908 |
| 0.2472        | 2.6975  | 2800  | 0.6425          | 0.7575   | 0.8209 |
| 0.2472        | 2.8902  | 3000  | 0.5584          | 0.8067   | 0.8530 |
| 0.2065        | 3.0829  | 3200  | 0.6602          | 0.7627   | 0.8244 |
| 0.2065        | 3.2755  | 3400  | 0.7031          | 0.7570   | 0.8206 |
| 0.2065        | 3.4682  | 3600  | 0.6166          | 0.7832   | 0.8382 |
| 0.2065        | 3.6609  | 3800  | 0.7400          | 0.7279   | 0.8008 |
| 0.2065        | 3.8536  | 4000  | 0.5337          | 0.8066   | 0.8531 |
| 0.1757        | 4.0462  | 4200  | 0.7663          | 0.7600   | 0.8227 |
| 0.1757        | 4.2389  | 4400  | 0.6286          | 0.7849   | 0.8392 |
| 0.1757        | 4.4316  | 4600  | 0.6379          | 0.8031   | 0.8511 |
| 0.1757        | 4.6243  | 4800  | 0.6865          | 0.7751   | 0.8328 |
| 0.1757        | 4.8170  | 5000  | 0.5512          | 0.8216   | 0.8629 |
| 0.1511        | 5.0096  | 5200  | 0.6118          | 0.8058   | 0.8529 |
| 0.1511        | 5.2023  | 5400  | 0.8038          | 0.7545   | 0.8191 |
| 0.1511        | 5.3950  | 5600  | 0.6799          | 0.8170   | 0.8600 |
| 0.1511        | 5.5877  | 5800  | 0.8013          | 0.7679   | 0.8282 |
| 0.1511        | 5.7803  | 6000  | 0.7806          | 0.7809   | 0.8365 |
| 0.1511        | 5.9730  | 6200  | 0.7302          | 0.7738   | 0.8320 |
| 0.129         | 6.1657  | 6400  | 0.6002          | 0.8324   | 0.8697 |
| 0.129         | 6.3584  | 6600  | 0.7237          | 0.8069   | 0.8534 |
| 0.129         | 6.5511  | 6800  | 0.7118          | 0.8072   | 0.8536 |
| 0.129         | 6.7437  | 7000  | 0.7674          | 0.7933   | 0.8447 |
| 0.129         | 6.9364  | 7200  | 0.7735          | 0.7737   | 0.8319 |
| 0.1133        | 7.1291  | 7400  | 0.6940          | 0.8152   | 0.8588 |
| 0.1133        | 7.3218  | 7600  | 0.8333          | 0.7880   | 0.8413 |
| 0.1133        | 7.5145  | 7800  | 0.7050          | 0.8016   | 0.8502 |
| 0.1133        | 7.7071  | 8000  | 0.8503          | 0.7763   | 0.8336 |
| 0.1133        | 7.8998  | 8200  | 0.8677          | 0.7734   | 0.8318 |
| 0.0964        | 8.0925  | 8400  | 0.7368          | 0.7994   | 0.8488 |
| 0.0964        | 8.2852  | 8600  | 0.7291          | 0.8161   | 0.8594 |
| 0.0964        | 8.4778  | 8800  | 0.8928          | 0.7948   | 0.8457 |
| 0.0964        | 8.6705  | 9000  | 0.9070          | 0.7799   | 0.8360 |
| 0.0964        | 8.8632  | 9200  | 0.8584          | 0.7961   | 0.8465 |
| 0.085         | 9.0559  | 9400  | 0.8249          | 0.8081   | 0.8543 |
| 0.085         | 9.2486  | 9600  | 0.8202          | 0.7929   | 0.8446 |
| 0.085         | 9.4412  | 9800  | 0.9296          | 0.7757   | 0.8332 |
| 0.085         | 9.6339  | 10000 | 0.9153          | 0.7931   | 0.8447 |
| 0.085         | 9.8266  | 10200 | 0.9087          | 0.7868   | 0.8405 |
| 0.0749        | 10.0193 | 10400 | 0.8043          | 0.8054   | 0.8526 |
| 0.0749        | 10.2119 | 10600 | 0.9692          | 0.7916   | 0.8436 |
| 0.0749        | 10.4046 | 10800 | 0.8181          | 0.8190   | 0.8614 |
| 0.0749        | 10.5973 | 11000 | 0.8767          | 0.8010   | 0.8498 |
| 0.0749        | 10.7900 | 11200 | 0.9470          | 0.7944   | 0.8455 |
| 0.0749        | 10.9827 | 11400 | 0.9699          | 0.7796   | 0.8358 |
| 0.0668        | 11.1753 | 11600 | 0.9448          | 0.7862   | 0.8402 |
| 0.0668        | 11.3680 | 11800 | 0.9925          | 0.7982   | 0.8480 |
| 0.0668        | 11.5607 | 12000 | 1.0677          | 0.7826   | 0.8378 |
| 0.0668        | 11.7534 | 12200 | 0.8985          | 0.7994   | 0.8487 |
| 0.0668        | 11.9461 | 12400 | 0.9710          | 0.7969   | 0.8471 |
| 0.0601        | 12.1387 | 12600 | 1.0032          | 0.7924   | 0.8442 |
| 0.0601        | 12.3314 | 12800 | 1.0084          | 0.7911   | 0.8432 |
| 0.0601        | 12.5241 | 13000 | 1.1361          | 0.7666   | 0.8272 |
| 0.0601        | 12.7168 | 13200 | 0.9933          | 0.7935   | 0.8449 |
| 0.0601        | 12.9094 | 13400 | 1.0405          | 0.7888   | 0.8419 |
| 0.0528        | 13.1021 | 13600 | 1.0769          | 0.7822   | 0.8375 |
| 0.0528        | 13.2948 | 13800 | 1.0596          | 0.7906   | 0.8431 |
| 0.0528        | 13.4875 | 14000 | 1.0612          | 0.7848   | 0.8393 |
| 0.0528        | 13.6802 | 14200 | 1.0330          | 0.7909   | 0.8434 |
| 0.0528        | 13.8728 | 14400 | 1.0386          | 0.7967   | 0.8471 |
| 0.0477        | 14.0655 | 14600 | 0.9948          | 0.7956   | 0.8464 |
| 0.0477        | 14.2582 | 14800 | 1.0767          | 0.7897   | 0.8425 |
| 0.0477        | 14.4509 | 15000 | 1.0176          | 0.7938   | 0.8451 |
| 0.0477        | 14.6435 | 15200 | 1.0246          | 0.7945   | 0.8456 |
| 0.0477        | 14.8362 | 15400 | 1.0230          | 0.7969   | 0.8472 |

### Framework versions

- Transformers 4.44.0
- PyTorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1
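
To reproduce the training environment, pin the versions above. A quick sanity check, assuming the packages are already installed, might look like this:

```python
# Compare installed package versions against the versions this model
# was trained with (listed above).
import datasets
import tokenizers
import torch
import transformers

trained_with = {
    "transformers": "4.44.0",
    "torch": "2.1.2",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, expected in trained_with.items():
    marker = "OK" if installed[name] == expected else "MISMATCH"
    print(f"{name}: installed {installed[name]}, trained with {expected} [{marker}]")
```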