---
base_model: FacebookAI/xlm-roberta-base
library_name: transformers
license: mit
metrics:
- precision
- recall
- f1
- accuracy
tags:
- generated_from_trainer
model-index:
- name: scenario-non-kd-pre-ner-full-xlmr_data-univner_half44
  results: []
---

# scenario-non-kd-pre-ner-full-xlmr_data-univner_half44

This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) for named-entity recognition; the training dataset is not recorded in the card metadata.
It achieves the following results on the evaluation set:
- Loss: 0.1667
- Precision: 0.8000
- Recall: 0.8085
- F1: 0.8042
- Accuracy: 0.9794
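
For quick experimentation, here is a minimal inference sketch using the `transformers` token-classification pipeline. The Hub namespace is not stated in this card, so the repo id below is a placeholder; a local checkpoint path works the same way.

```python
from transformers import pipeline

# Placeholder repo id: the Hub namespace is not given in this card.
# A local path to the fine-tuned checkpoint can be used instead.
ner = pipeline(
    "token-classification",
    model="<namespace>/scenario-non-kd-pre-ner-full-xlmr_data-univner_half44",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

print(ner("Barack Obama visited Jakarta in 2010."))
```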

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 44
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
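
These settings correspond to the following `TrainingArguments`. This is a sketch of the reported configuration only; `output_dir` and any logging/saving options are assumptions not recorded in the card.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters. output_dir is an assumption;
# evaluation/save cadence and other unlisted options are not recorded here.
training_args = TrainingArguments(
    output_dir="scenario-non-kd-pre-ner-full-xlmr_data-univner_half44",
    learning_rate=3e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=44,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```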

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.1407        | 0.5828  | 500   | 0.0805          | 0.7146    | 0.7383 | 0.7262 | 0.9736   |
| 0.0685        | 1.1655  | 1000  | 0.0771          | 0.7453    | 0.7904 | 0.7672 | 0.9766   |
| 0.0504        | 1.7483  | 1500  | 0.0763          | 0.7569    | 0.7948 | 0.7754 | 0.9778   |
| 0.0384        | 2.3310  | 2000  | 0.0867          | 0.7371    | 0.7931 | 0.7641 | 0.9760   |
| 0.0306        | 2.9138  | 2500  | 0.0880          | 0.7501    | 0.8074 | 0.7777 | 0.9768   |
| 0.0223        | 3.4965  | 3000  | 0.0928          | 0.7585    | 0.8097 | 0.7833 | 0.9775   |
| 0.0202        | 4.0793  | 3500  | 0.0958          | 0.7641    | 0.7971 | 0.7803 | 0.9777   |
| 0.0151        | 4.6620  | 4000  | 0.0985          | 0.7690    | 0.8044 | 0.7863 | 0.9778   |
| 0.0134        | 5.2448  | 4500  | 0.1051          | 0.7857    | 0.7963 | 0.7910 | 0.9787   |
| 0.0112        | 5.8275  | 5000  | 0.1080          | 0.7677    | 0.8016 | 0.7843 | 0.9786   |
| 0.0096        | 6.4103  | 5500  | 0.1158          | 0.7698    | 0.8083 | 0.7886 | 0.9781   |
| 0.0092        | 6.9930  | 6000  | 0.1130          | 0.7857    | 0.8009 | 0.7932 | 0.9783   |
| 0.0074        | 7.5758  | 6500  | 0.1161          | 0.7749    | 0.8068 | 0.7906 | 0.9785   |
| 0.0067        | 8.1585  | 7000  | 0.1194          | 0.7887    | 0.7922 | 0.7905 | 0.9783   |
| 0.0058        | 8.7413  | 7500  | 0.1179          | 0.7796    | 0.8129 | 0.7959 | 0.9786   |
| 0.0053        | 9.3240  | 8000  | 0.1266          | 0.7824    | 0.8088 | 0.7954 | 0.9782   |
| 0.0049        | 9.9068  | 8500  | 0.1273          | 0.7858    | 0.7960 | 0.7909 | 0.9786   |
| 0.0043        | 10.4895 | 9000  | 0.1301          | 0.7965    | 0.7977 | 0.7971 | 0.9789   |
| 0.0041        | 11.0723 | 9500  | 0.1289          | 0.7992    | 0.7941 | 0.7966 | 0.9784   |
| 0.0035        | 11.6550 | 10000 | 0.1344          | 0.7904    | 0.8088 | 0.7995 | 0.9786   |
| 0.0033        | 12.2378 | 10500 | 0.1391          | 0.7889    | 0.8032 | 0.7960 | 0.9786   |
| 0.0032        | 12.8205 | 11000 | 0.1431          | 0.7642    | 0.8096 | 0.7862 | 0.9777   |
| 0.0029        | 13.4033 | 11500 | 0.1359          | 0.8006    | 0.7969 | 0.7987 | 0.9787   |
| 0.0028        | 13.9860 | 12000 | 0.1393          | 0.7874    | 0.8137 | 0.8003 | 0.9790   |
| 0.0023        | 14.5688 | 12500 | 0.1426          | 0.7907    | 0.8012 | 0.7959 | 0.9787   |
| 0.0022        | 15.1515 | 13000 | 0.1441          | 0.7945    | 0.8096 | 0.8020 | 0.9791   |
| 0.0020        | 15.7343 | 13500 | 0.1498          | 0.7860    | 0.8041 | 0.7950 | 0.9782   |
| 0.0023        | 16.3170 | 14000 | 0.1442          | 0.7844    | 0.8090 | 0.7965 | 0.9787   |
| 0.0016        | 16.8998 | 14500 | 0.1534          | 0.7943    | 0.8080 | 0.8011 | 0.9789   |
| 0.0017        | 17.4825 | 15000 | 0.1483          | 0.7915    | 0.8019 | 0.7967 | 0.9789   |
| 0.0017        | 18.0653 | 15500 | 0.1521          | 0.8066    | 0.7932 | 0.7999 | 0.9791   |
| 0.0014        | 18.6480 | 16000 | 0.1517          | 0.7984    | 0.8022 | 0.8003 | 0.9792   |
| 0.0014        | 19.2308 | 16500 | 0.1549          | 0.7820    | 0.8136 | 0.7975 | 0.9789   |
| 0.0011        | 19.8135 | 17000 | 0.1546          | 0.7980    | 0.8035 | 0.8007 | 0.9792   |
| 0.0012        | 20.3963 | 17500 | 0.1601          | 0.7842    | 0.8062 | 0.7950 | 0.9785   |
| 0.0011        | 20.9790 | 18000 | 0.1596          | 0.7830    | 0.8012 | 0.7920 | 0.9785   |
| 0.0009        | 21.5618 | 18500 | 0.1616          | 0.7911    | 0.8116 | 0.8012 | 0.9788   |
| 0.0012        | 22.1445 | 19000 | 0.1617          | 0.7834    | 0.8087 | 0.7958 | 0.9782   |
| 0.0008        | 22.7273 | 19500 | 0.1621          | 0.7917    | 0.8104 | 0.8009 | 0.9792   |
| 0.0009        | 23.3100 | 20000 | 0.1637          | 0.7927    | 0.8010 | 0.7968 | 0.9786   |
| 0.0007        | 23.8928 | 20500 | 0.1637          | 0.7720    | 0.8126 | 0.7918 | 0.9784   |
| 0.0006        | 24.4755 | 21000 | 0.1628          | 0.8003    | 0.7995 | 0.7999 | 0.9791   |
| 0.0007        | 25.0583 | 21500 | 0.1635          | 0.7904    | 0.8094 | 0.7998 | 0.9789   |
| 0.0005        | 25.6410 | 22000 | 0.1651          | 0.7942    | 0.8121 | 0.8031 | 0.9793   |
| 0.0005        | 26.2238 | 22500 | 0.1652          | 0.7958    | 0.8114 | 0.8035 | 0.9792   |
| 0.0005        | 26.8065 | 23000 | 0.1653          | 0.7936    | 0.8087 | 0.8011 | 0.9792   |
| 0.0005        | 27.3893 | 23500 | 0.1661          | 0.8005    | 0.8097 | 0.8050 | 0.9794   |
| 0.0004        | 27.9720 | 24000 | 0.1661          | 0.7953    | 0.8100 | 0.8026 | 0.9792   |
| 0.0004        | 28.5548 | 24500 | 0.1668          | 0.7940    | 0.8108 | 0.8023 | 0.9793   |
| 0.0004        | 29.1375 | 25000 | 0.1666          | 0.7944    | 0.8081 | 0.8012 | 0.9791   |
| 0.0003        | 29.7203 | 25500 | 0.1667          | 0.8000    | 0.8085 | 0.8042 | 0.9794   |
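
The precision, recall, and F1 columns above are entity-level scores of the kind `seqeval` produces for token classification. Below is a sketch of a typical `compute_metrics` hook that yields these four numbers; the label list is an assumption, since the tag set is not listed in this card.

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

# The tag set is not listed in this card; this IOB list is an assumption.
label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Skip ignored positions (label id -100 marks special/sub-word tokens).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```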


### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1
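
To reproduce this environment, the versions above can be pinned directly (e.g. `pip install transformers==4.44.2 datasets==2.14.5 tokenizers==0.19.1`, plus a PyTorch 2.1.1 wheel built for CUDA 12.1).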