---
base_model: haryoaw/scenario-TCR-NER_data-univner_en
library_name: transformers
license: mit
metrics:
  - precision
  - recall
  - f1
  - accuracy
tags:
  - generated_from_trainer
model-index:
  - name: scenario-kd-scr-ner-full_data-univner_full55
    results: []
---

scenario-kd-scr-ner-full_data-univner_full55

This model is a fine-tuned version of haryoaw/scenario-TCR-NER_data-univner_en on an unspecified dataset. It achieves the following results on the evaluation set (a short usage sketch follows the results list):

  • Loss: nan
  • Precision: 0.0
  • Recall: 0.0
  • F1: 0.0
  • Accuracy: 0.9405
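
The snippet below is a minimal sketch, not part of the original card, of how a checkpoint like this is typically loaded for token classification with Hugging Face Transformers. The repository id is assumed from this card's model name, and given the zero precision/recall/F1 reported above, predictions from this checkpoint may not be meaningful.

```python
# Hypothetical usage sketch: load the checkpoint with the Transformers pipeline.
# The repo id below is assumed from the card's model name; adjust if it lives elsewhere.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="haryoaw/scenario-kd-scr-ner-full_data-univner_full55",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("Barack Obama visited Berlin in 2013."))
```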

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map to TrainingArguments follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 55
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
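
For readers trying to reproduce the setup, the block below is a hedged sketch of how these values would typically be expressed as Hugging Face TrainingArguments. Only the fields listed above come from the card; the output_dir and anything else are assumptions, not the original training script.

```python
# Sketch only: the card-reported hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-kd-scr-ner-full_data-univner_full55",  # assumed, not from the card
    learning_rate=3e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=55,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    # Trainer's default AdamW already uses these betas/epsilon; they are set explicitly
    # here to mirror the optimizer line above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```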

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Precision | Recall | F1  | Accuracy |
|---------------|---------|-------|-----------------|-----------|--------|-----|----------|
| 5.3686        | 1.2755  | 500   | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 2.5510  | 1000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 3.8265  | 1500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 5.1020  | 2000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 6.3776  | 2500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 7.6531  | 3000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 8.9286  | 3500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 10.2041 | 4000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 11.4796 | 4500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 12.7551 | 5000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 14.0306 | 5500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 15.3061 | 6000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 16.5816 | 6500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 17.8571 | 7000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 19.1327 | 7500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 20.4082 | 8000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 21.6837 | 8500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 22.9592 | 9000  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 24.2347 | 9500  | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 25.5102 | 10000 | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |
| 0.0           | 26.7857 | 10500 | nan             | 0.0       | 0.0    | 0.0 | 0.9405   |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.1.1+cu121
  • Datasets 2.14.5
  • Tokenizers 0.19.1
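
The snippet below is an optional, hedged sanity check, not from the original card, that locally installed packages match the versions reported above; the package names and version strings are taken from this list, and the check itself is illustrative.

```python
# Illustrative environment check against the versions reported in this card.
from importlib.metadata import PackageNotFoundError, version

expected = {
    "transformers": "4.44.2",
    "torch": "2.1.1+cu121",
    "datasets": "2.14.5",
    "tokenizers": "0.19.1",
}

for pkg, want in expected.items():
    try:
        have = version(pkg)
    except PackageNotFoundError:
        have = "not installed"
    status = "OK" if have == want else "differs"
    print(f"{pkg}: installed {have}, card reports {want} ({status})")
```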