---
library_name: transformers
license: mit
base_model: haryoaw/scenario-TCR-NER_data-univner_half
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: scenario-kd-scr-ner-full_data-univner_full55
  results: []
---

# scenario-kd-scr-ner-full_data-univner_full55

This model is a fine-tuned version of [haryoaw/scenario-TCR-NER_data-univner_half](https://huggingface.co/haryoaw/scenario-TCR-NER_data-univner_half) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 1.6332
- Precision: 0.4469
- Recall: 0.3758
- F1: 0.4083
- Accuracy: 0.9390
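
Since the card itself does not include usage instructions, here is a minimal inference sketch, assuming the checkpoint exposes a standard token-classification head. The repository id is taken from the model-index name above, and the example sentence is purely illustrative, not part of the original card:

```python
# Minimal inference sketch; the repo id matches the model-index name above
# and the example input is an illustrative assumption.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "haryoaw/scenario-kd-scr-ner-full_data-univner_full55"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group sub-word predictions into whole entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(ner("Barack Obama visited Jakarta in 2010."))
```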

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them to `TrainingArguments` follows the list):

- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 55
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
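
For readers reproducing the run, a hedged sketch of how these values map onto `transformers.TrainingArguments`; the `output_dir` is a placeholder, and the surrounding `Trainer` wiring (model, datasets, data collator) is omitted since the card does not specify it:

```python
# Sketch of the reported hyperparameters as TrainingArguments; output_dir is
# a placeholder and the Trainer wiring around it is not part of the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-kd-scr-ner-full_data-univner_full55",  # placeholder
    learning_rate=3e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=55,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```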

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 2.9172        | 0.5828  | 500   | 2.8507          | 0.2157    | 0.0452 | 0.0747 | 0.9231   |
| 2.2157        | 1.1655  | 1000  | 2.5207          | 0.2360    | 0.1134 | 0.1532 | 0.9230   |
| 1.9564        | 1.7483  | 1500  | 2.5233          | 0.1706    | 0.1749 | 0.1727 | 0.9128   |
| 1.7543        | 2.3310  | 2000  | 2.4414          | 0.2175    | 0.2244 | 0.2209 | 0.9157   |
| 1.6286        | 2.9138  | 2500  | 2.2528          | 0.2500    | 0.2336 | 0.2415 | 0.9223   |
| 1.4766        | 3.4965  | 3000  | 2.0896          | 0.2944    | 0.2241 | 0.2545 | 0.9279   |
| 1.398         | 4.0793  | 3500  | 2.0471          | 0.3335    | 0.2441 | 0.2819 | 0.9303   |
| 1.2907        | 4.6620  | 4000  | 1.9739          | 0.2985    | 0.2568 | 0.2761 | 0.9294   |
| 1.2065        | 5.2448  | 4500  | 1.8564          | 0.3685    | 0.2424 | 0.2924 | 0.9344   |
| 1.1392        | 5.8275  | 5000  | 2.1380          | 0.2515    | 0.3037 | 0.2751 | 0.9172   |
| 1.0459        | 6.4103  | 5500  | 1.9090          | 0.3426    | 0.2819 | 0.3093 | 0.9320   |
| 0.9973        | 6.9930  | 6000  | 1.8167          | 0.3556    | 0.3015 | 0.3263 | 0.9350   |
| 0.9106        | 7.5758  | 6500  | 1.8701          | 0.3736    | 0.2884 | 0.3255 | 0.9326   |
| 0.8843        | 8.1585  | 7000  | 1.8193          | 0.3618    | 0.3219 | 0.3407 | 0.9345   |
| 0.8329        | 8.7413  | 7500  | 1.8722          | 0.3634    | 0.3378 | 0.3501 | 0.9305   |
| 0.784         | 9.3240  | 8000  | 1.7434          | 0.4139    | 0.3140 | 0.3571 | 0.9381   |
| 0.7606        | 9.9068  | 8500  | 1.7787          | 0.4143    | 0.3147 | 0.3577 | 0.9363   |
| 0.7111        | 10.4895 | 9000  | 1.8461          | 0.3518    | 0.3292 | 0.3401 | 0.9315   |
| 0.6894        | 11.0723 | 9500  | 1.7537          | 0.3635    | 0.3327 | 0.3474 | 0.9351   |
| 0.6543        | 11.6550 | 10000 | 1.7565          | 0.3779    | 0.3506 | 0.3637 | 0.9347   |
| 0.6429        | 12.2378 | 10500 | 1.8134          | 0.3769    | 0.3496 | 0.3627 | 0.9323   |
| 0.6084        | 12.8205 | 11000 | 1.8020          | 0.3757    | 0.3740 | 0.3748 | 0.9320   |
| 0.5799        | 13.4033 | 11500 | 1.7080          | 0.4119    | 0.3447 | 0.3753 | 0.9374   |
| 0.5742        | 13.9860 | 12000 | 1.7454          | 0.3963    | 0.3668 | 0.3809 | 0.9356   |
| 0.5467        | 14.5688 | 12500 | 1.8019          | 0.3832    | 0.3748 | 0.3790 | 0.9322   |
| 0.5327        | 15.1515 | 13000 | 1.8784          | 0.3599    | 0.3774 | 0.3685 | 0.9275   |
| 0.5207        | 15.7343 | 13500 | 1.7905          | 0.3977    | 0.3760 | 0.3865 | 0.9336   |
| 0.5047        | 16.3170 | 14000 | 1.6909          | 0.4336    | 0.3606 | 0.3937 | 0.9377   |
| 0.4911        | 16.8998 | 14500 | 1.7464          | 0.3951    | 0.3780 | 0.3864 | 0.9342   |
| 0.4802        | 17.4825 | 15000 | 1.7247          | 0.4230    | 0.3738 | 0.3969 | 0.9365   |
| 0.4729        | 18.0653 | 15500 | 1.6929          | 0.4307    | 0.3639 | 0.3945 | 0.9379   |
| 0.4607        | 18.6480 | 16000 | 1.6395          | 0.4493    | 0.3503 | 0.3937 | 0.9404   |
| 0.449         | 19.2308 | 16500 | 1.7051          | 0.4149    | 0.3766 | 0.3948 | 0.9362   |
| 0.4402        | 19.8135 | 17000 | 1.7664          | 0.4024    | 0.3779 | 0.3898 | 0.9318   |
| 0.4337        | 20.3963 | 17500 | 1.6884          | 0.4475    | 0.3689 | 0.4044 | 0.9386   |
| 0.4272        | 20.9790 | 18000 | 1.6995          | 0.4209    | 0.3841 | 0.4017 | 0.9360   |
| 0.4162        | 21.5618 | 18500 | 1.6522          | 0.4428    | 0.3668 | 0.4012 | 0.9387   |
| 0.4114        | 22.1445 | 19000 | 1.6957          | 0.4082    | 0.3797 | 0.3935 | 0.9356   |
| 0.4087        | 22.7273 | 19500 | 1.6728          | 0.4323    | 0.3656 | 0.3962 | 0.9377   |
| 0.4008        | 23.3100 | 20000 | 1.6749          | 0.4287    | 0.3598 | 0.3913 | 0.9368   |
| 0.394         | 23.8928 | 20500 | 1.6745          | 0.4266    | 0.3640 | 0.3928 | 0.9373   |
| 0.3887        | 24.4755 | 21000 | 1.6553          | 0.4358    | 0.3666 | 0.3982 | 0.9386   |
| 0.3876        | 25.0583 | 21500 | 1.6904          | 0.4190    | 0.3841 | 0.4008 | 0.9363   |
| 0.3819        | 25.6410 | 22000 | 1.6581          | 0.4360    | 0.3761 | 0.4039 | 0.9372   |
| 0.3776        | 26.2238 | 22500 | 1.6192          | 0.4595    | 0.3620 | 0.4050 | 0.9401   |
| 0.3767        | 26.8065 | 23000 | 1.6383          | 0.4453    | 0.3796 | 0.4098 | 0.9386   |
| 0.3738        | 27.3893 | 23500 | 1.6327          | 0.4517    | 0.3745 | 0.4095 | 0.9396   |
| 0.3671        | 27.9720 | 24000 | 1.6605          | 0.4399    | 0.3763 | 0.4056 | 0.9378   |
| 0.3694        | 28.5548 | 24500 | 1.6160          | 0.4554    | 0.3744 | 0.4110 | 0.9402   |
| 0.3659        | 29.1375 | 25000 | 1.6376          | 0.4419    | 0.3734 | 0.4048 | 0.9383   |
| 0.3637        | 29.7203 | 25500 | 1.6332          | 0.4469    | 0.3758 | 0.4083 | 0.9390   |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1