---
library_name: transformers
license: mit
base_model: haryoaw/scenario-TCR-NER_data-univner_half
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: scenario-non-kd-po-ner-full-xlmr_data-univner_half66
    results: []
---

# scenario-non-kd-po-ner-full-xlmr_data-univner_half66

This model is a fine-tuned version of haryoaw/scenario-TCR-NER_data-univner_half on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.1331
- Precision: 0.8406
- Recall: 0.8491
- F1: 0.8448
- Accuracy: 0.9836
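
The snippet below is a minimal inference sketch using the `transformers` token-classification pipeline. The repository id is assumed to match the model-index name above, and the example sentence and its entities are illustrative only.

```python
from transformers import pipeline

# Repository id assumed from the model-index name above.
ner = pipeline(
    "token-classification",
    model="haryoaw/scenario-non-kd-po-ner-full-xlmr_data-univner_half66",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("Barack Obama visited Jakarta in 2010."))
```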

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 66
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
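
As a rough guide only, these values map onto `transformers.TrainingArguments` roughly as sketched below; `output_dir` and any arguments not listed above are assumptions, not values taken from the original training script.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-non-kd-po-ner-full-xlmr_data-univner_half66",  # assumption
    learning_rate=3e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=66,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```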

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0102        | 0.5828  | 500   | 0.0979          | 0.8297    | 0.8550 | 0.8422 | 0.9832   |
| 0.0109        | 1.1655  | 1000  | 0.0889          | 0.8346    | 0.8466 | 0.8406 | 0.9835   |
| 0.0084        | 1.7483  | 1500  | 0.0932          | 0.8491    | 0.8462 | 0.8477 | 0.9839   |
| 0.0075        | 2.3310  | 2000  | 0.0919          | 0.8434    | 0.8437 | 0.8436 | 0.9835   |
| 0.0072        | 2.9138  | 2500  | 0.1043          | 0.8278    | 0.8380 | 0.8329 | 0.9826   |
| 0.0058        | 3.4965  | 3000  | 0.1020          | 0.8370    | 0.8468 | 0.8419 | 0.9832   |
| 0.0059        | 4.0793  | 3500  | 0.1030          | 0.8467    | 0.8458 | 0.8463 | 0.9839   |
| 0.005         | 4.6620  | 4000  | 0.1182          | 0.8492    | 0.8326 | 0.8408 | 0.9830   |
| 0.005         | 5.2448  | 4500  | 0.1141          | 0.8235    | 0.8510 | 0.8370 | 0.9821   |
| 0.0044        | 5.8275  | 5000  | 0.1184          | 0.8273    | 0.8572 | 0.8420 | 0.9828   |
| 0.0045        | 6.4103  | 5500  | 0.1213          | 0.8417    | 0.8494 | 0.8455 | 0.9833   |
| 0.0048        | 6.9930  | 6000  | 0.1126          | 0.8413    | 0.8413 | 0.8413 | 0.9835   |
| 0.0043        | 7.5758  | 6500  | 0.1240          | 0.8363    | 0.8472 | 0.8417 | 0.9830   |
| 0.0036        | 8.1585  | 7000  | 0.1185          | 0.8470    | 0.8450 | 0.8460 | 0.9838   |
| 0.0032        | 8.7413  | 7500  | 0.1249          | 0.8338    | 0.8391 | 0.8365 | 0.9828   |
| 0.0022        | 9.3240  | 8000  | 0.1260          | 0.8351    | 0.8499 | 0.8425 | 0.9835   |
| 0.003         | 9.9068  | 8500  | 0.1208          | 0.8273    | 0.8420 | 0.8346 | 0.9827   |
| 0.0024        | 10.4895 | 9000  | 0.1216          | 0.8451    | 0.8463 | 0.8457 | 0.9836   |
| 0.0027        | 11.0723 | 9500  | 0.1214          | 0.8410    | 0.8390 | 0.8400 | 0.9832   |
| 0.0019        | 11.6550 | 10000 | 0.1234          | 0.8419    | 0.8458 | 0.8438 | 0.9833   |
| 0.0023        | 12.2378 | 10500 | 0.1289          | 0.8339    | 0.8502 | 0.8420 | 0.9830   |
| 0.0019        | 12.8205 | 11000 | 0.1286          | 0.8400    | 0.8401 | 0.8401 | 0.9832   |
| 0.002         | 13.4033 | 11500 | 0.1331          | 0.8406    | 0.8491 | 0.8448 | 0.9836   |
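
The precision, recall, F1, and accuracy columns above are entity-level scores of the kind typically computed with `seqeval` in `generated_from_trainer` cards. The sketch below shows how such scores are obtained from tag sequences; the label sequences are made up, and this is not the evaluation code used for this model.

```python
import evaluate

# Illustrative only: tiny made-up BIO label sequences, not model output.
seqeval = evaluate.load("seqeval")

predictions = [["B-PER", "I-PER", "O", "B-LOC", "O"]]
references = [["B-PER", "I-PER", "O", "B-LOC", "O"]]

scores = seqeval.compute(predictions=predictions, references=references)
print(scores["overall_precision"], scores["overall_recall"],
      scores["overall_f1"], scores["overall_accuracy"])
```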

### Framework versions

- Transformers 4.44.2
- PyTorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1