---
base_model: haryoaw/scenario-TCR-NER_data-univner_half
library_name: transformers
license: mit
metrics:
- precision
- recall
- f1
- accuracy
tags:
- generated_from_trainer
model-index:
- name: scenario-non-kd-po-ner-full-mdeberta_data-univner_half55
  results: []
---

# scenario-non-kd-po-ner-full-mdeberta_data-univner_half55

This model is a fine-tuned version of [haryoaw/scenario-TCR-NER_data-univner_half](https://huggingface.co/haryoaw/scenario-TCR-NER_data-univner_half) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1857
- Precision: 0.7753
- Recall: 0.7967
- F1: 0.7859
- Accuracy: 0.9781
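
As a quick sanity check, the reported F1 is the harmonic mean of the reported precision and recall:

```python
# Evaluation precision/recall reported above; verify F1 is their harmonic mean.
precision, recall = 0.7753, 0.7967
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.7859, matching the reported F1
```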

## Model description

Based on the model name and base checkpoint, this appears to be an mDeBERTa-based token-classification (NER) model fine-tuned from `haryoaw/scenario-TCR-NER_data-univner_half` on a UniversalNER-style half split, trained with random seed 55. The original card did not document this further.

## Intended uses & limitations

More information needed
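
The intended use is token-level named-entity recognition. A minimal inference sketch with the `transformers` pipeline API follows; the Hub repo id is an assumption inferred from the model name and the base model's namespace, and the example sentence is illustrative only:

```python
from transformers import pipeline

# Assumed Hub repo id (namespace inferred from the base model; not stated in the card).
model_id = "haryoaw/scenario-non-kd-po-ner-full-mdeberta_data-univner_half55"

# Token-classification pipeline; "simple" aggregation merges subword pieces into spans.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

entities = ner("Barack Obama visited Jakarta in 2010.")
for ent in entities:
    print(ent["entity_group"], ent["word"], round(ent["score"], 3))
```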

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 55
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
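
The `linear` scheduler above decays the learning rate from its initial value to zero over the course of training. A minimal sketch, assuming no warmup (none is listed) and using the last logged step in the results table (25,500) as an approximation of the total schedule length:

```python
def linear_lr(step: int, total_steps: int = 25500, base_lr: float = 3e-5) -> float:
    """Linearly decay base_lr to 0 over total_steps (no warmup is listed above)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))      # 3e-05 at the first step
print(linear_lr(12750))  # 1.5e-05 halfway through training
print(linear_lr(25500))  # 0.0 at the end of the schedule
```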

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.2478        | 0.5828  | 500   | 0.1339          | 0.4725    | 0.6165 | 0.5350 | 0.9540   |
| 0.1025        | 1.1655  | 1000  | 0.1049          | 0.6273    | 0.7213 | 0.6710 | 0.9664   |
| 0.071         | 1.7483  | 1500  | 0.0967          | 0.6465    | 0.7769 | 0.7058 | 0.9698   |
| 0.048         | 2.3310  | 2000  | 0.0946          | 0.7123    | 0.7788 | 0.7441 | 0.9744   |
| 0.0392        | 2.9138  | 2500  | 0.0989          | 0.7245    | 0.7705 | 0.7467 | 0.9745   |
| 0.0266        | 3.4965  | 3000  | 0.1055          | 0.7467    | 0.7653 | 0.7559 | 0.9759   |
| 0.0247        | 4.0793  | 3500  | 0.1095          | 0.7402    | 0.7798 | 0.7595 | 0.9758   |
| 0.0171        | 4.6620  | 4000  | 0.1124          | 0.7518    | 0.7842 | 0.7677 | 0.9765   |
| 0.0156        | 5.2448  | 4500  | 0.1127          | 0.7432    | 0.7945 | 0.7680 | 0.9767   |
| 0.013         | 5.8275  | 5000  | 0.1167          | 0.7529    | 0.7787 | 0.7656 | 0.9760   |
| 0.01          | 6.4103  | 5500  | 0.1262          | 0.7467    | 0.7909 | 0.7682 | 0.9763   |
| 0.01          | 6.9930  | 6000  | 0.1357          | 0.7492    | 0.7807 | 0.7646 | 0.9757   |
| 0.0076        | 7.5758  | 6500  | 0.1310          | 0.7500    | 0.7940 | 0.7714 | 0.9767   |
| 0.007         | 8.1585  | 7000  | 0.1412          | 0.7446    | 0.7882 | 0.7658 | 0.9761   |
| 0.006         | 8.7413  | 7500  | 0.1484          | 0.7539    | 0.7821 | 0.7677 | 0.9768   |
| 0.0056        | 9.3240  | 8000  | 0.1475          | 0.7769    | 0.7693 | 0.7731 | 0.9775   |
| 0.0053        | 9.9068  | 8500  | 0.1399          | 0.7633    | 0.7794 | 0.7713 | 0.9773   |
| 0.0048        | 10.4895 | 9000  | 0.1433          | 0.7526    | 0.7938 | 0.7726 | 0.9767   |
| 0.004         | 11.0723 | 9500  | 0.1476          | 0.7731    | 0.7842 | 0.7786 | 0.9774   |
| 0.0034        | 11.6550 | 10000 | 0.1473          | 0.7671    | 0.7917 | 0.7792 | 0.9774   |
| 0.0036        | 12.2378 | 10500 | 0.1570          | 0.7512    | 0.7915 | 0.7708 | 0.9765   |
| 0.003         | 12.8205 | 11000 | 0.1534          | 0.7689    | 0.7807 | 0.7748 | 0.9772   |
| 0.0025        | 13.4033 | 11500 | 0.1633          | 0.7748    | 0.7833 | 0.7790 | 0.9772   |
| 0.0029        | 13.9860 | 12000 | 0.1590          | 0.7561    | 0.7938 | 0.7745 | 0.9768   |
| 0.0027        | 14.5688 | 12500 | 0.1614          | 0.7598    | 0.8035 | 0.7810 | 0.9775   |
| 0.0023        | 15.1515 | 13000 | 0.1629          | 0.7731    | 0.7947 | 0.7837 | 0.9780   |
| 0.0022        | 15.7343 | 13500 | 0.1582          | 0.7776    | 0.7908 | 0.7841 | 0.9779   |
| 0.0019        | 16.3170 | 14000 | 0.1699          | 0.7605    | 0.8020 | 0.7807 | 0.9776   |
| 0.002         | 16.8998 | 14500 | 0.1602          | 0.7670    | 0.7898 | 0.7782 | 0.9773   |
| 0.0015        | 17.4825 | 15000 | 0.1716          | 0.7690    | 0.7956 | 0.7821 | 0.9779   |
| 0.0017        | 18.0653 | 15500 | 0.1721          | 0.7653    | 0.7917 | 0.7782 | 0.9772   |
| 0.0013        | 18.6480 | 16000 | 0.1743          | 0.7724    | 0.7902 | 0.7812 | 0.9778   |
| 0.0014        | 19.2308 | 16500 | 0.1756          | 0.7625    | 0.8029 | 0.7822 | 0.9775   |
| 0.0013        | 19.8135 | 17000 | 0.1800          | 0.7598    | 0.8003 | 0.7795 | 0.9768   |
| 0.0009        | 20.3963 | 17500 | 0.1737          | 0.7718    | 0.7957 | 0.7835 | 0.9779   |
| 0.0012        | 20.9790 | 18000 | 0.1709          | 0.7628    | 0.7993 | 0.7806 | 0.9775   |
| 0.0009        | 21.5618 | 18500 | 0.1824          | 0.7721    | 0.8003 | 0.7860 | 0.9778   |
| 0.0008        | 22.1445 | 19000 | 0.1859          | 0.7578    | 0.8005 | 0.7786 | 0.9772   |
| 0.0008        | 22.7273 | 19500 | 0.1793          | 0.7774    | 0.7914 | 0.7843 | 0.9781   |
| 0.0009        | 23.3100 | 20000 | 0.1797          | 0.7713    | 0.7908 | 0.7809 | 0.9780   |
| 0.0007        | 23.8928 | 20500 | 0.1811          | 0.7616    | 0.7989 | 0.7798 | 0.9777   |
| 0.0006        | 24.4755 | 21000 | 0.1831          | 0.7811    | 0.7889 | 0.7850 | 0.9782   |
| 0.0006        | 25.0583 | 21500 | 0.1874          | 0.7644    | 0.8006 | 0.7821 | 0.9776   |
| 0.0006        | 25.6410 | 22000 | 0.1840          | 0.7709    | 0.7937 | 0.7821 | 0.9777   |
| 0.0005        | 26.2238 | 22500 | 0.1867          | 0.7719    | 0.7989 | 0.7852 | 0.9781   |
| 0.0005        | 26.8065 | 23000 | 0.1826          | 0.7781    | 0.7878 | 0.7829 | 0.9782   |
| 0.0004        | 27.3893 | 23500 | 0.1865          | 0.7719    | 0.8018 | 0.7866 | 0.9782   |
| 0.0005        | 27.9720 | 24000 | 0.1871          | 0.7733    | 0.7987 | 0.7858 | 0.9781   |
| 0.0004        | 28.5548 | 24500 | 0.1856          | 0.7770    | 0.7973 | 0.7870 | 0.9784   |
| 0.0004        | 29.1375 | 25000 | 0.1854          | 0.7728    | 0.7967 | 0.7846 | 0.9781   |
| 0.0003        | 29.7203 | 25500 | 0.1857          | 0.7753    | 0.7967 | 0.7859 | 0.9781   |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1