---
base_model: FacebookAI/xlm-roberta-base
library_name: transformers
license: mit
metrics:
- precision
- recall
- f1
- accuracy
tags:
- generated_from_trainer
model-index:
- name: scenario-non-kd-scr-ner-full-xlmr_data-univner_full44
results: []
---
# scenario-non-kd-scr-ner-full-xlmr_data-univner_full44
This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) for named entity recognition (token classification). The training dataset is not recorded in the auto-generated card; the `univner_full` suffix in the model name suggests the data used.
It achieves the following results on the evaluation set:
- Loss: 0.3770
- Precision: 0.5737
- Recall: 0.5773
- F1: 0.5755
- Accuracy: 0.9597
## Model description
More information needed
## Intended uses & limitations
More information needed
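
The card does not document how to run the model. Below is a minimal inference sketch, assuming the standard `transformers` token-classification pipeline; the Hub repository id is inferred from the model name in this card and may need adjusting.

```python
# Minimal inference sketch -- the repository id below is assumed from the model
# name in this card and may not match the actual Hub path.
from transformers import pipeline

model_id = "haryoaw/scenario-non-kd-scr-ner-full-xlmr_data-univner_full44"  # assumed repo id

# Token-classification pipeline; "simple" aggregation merges word-piece tokens
# back into whole-word entity spans.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

print(ner("Barack Obama visited Jakarta in 2010."))
```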
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged reproduction sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 44
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
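
A sketch of `TrainingArguments` mirroring the listed values. Only the values above come from the card; the output directory and the evaluation/save cadence are assumptions, with the 500-step evaluation interval inferred from the results table below.

```python
# Sketch of TrainingArguments reproducing the listed hyperparameters.
# Settings not listed in the card (output_dir, eval/save cadence) are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="scenario-non-kd-scr-ner-full-xlmr_data-univner_full44",  # assumed
    learning_rate=3e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=44,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    evaluation_strategy="steps",  # assumed
    eval_steps=500,               # assumed from the 500-step cadence in the results table
    save_steps=500,               # assumed
)
```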
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.3376 | 0.2910 | 500 | 0.2801 | 0.4349 | 0.1412 | 0.2132 | 0.9307 |
| 0.2635 | 0.5821 | 1000 | 0.2441 | 0.3323 | 0.2249 | 0.2683 | 0.9343 |
| 0.2296 | 0.8731 | 1500 | 0.2240 | 0.3293 | 0.2759 | 0.3002 | 0.9364 |
| 0.2009 | 1.1641 | 2000 | 0.2151 | 0.3590 | 0.3569 | 0.3580 | 0.9372 |
| 0.1874 | 1.4552 | 2500 | 0.2115 | 0.3755 | 0.3354 | 0.3543 | 0.9398 |
| 0.1754 | 1.7462 | 3000 | 0.2126 | 0.3774 | 0.3173 | 0.3448 | 0.9420 |
| 0.1643 | 2.0373 | 3500 | 0.2050 | 0.3776 | 0.3711 | 0.3743 | 0.9412 |
| 0.1368 | 2.3283 | 4000 | 0.2107 | 0.4135 | 0.3636 | 0.3869 | 0.9435 |
| 0.1361 | 2.6193 | 4500 | 0.1976 | 0.4033 | 0.4125 | 0.4078 | 0.9449 |
| 0.1255 | 2.9104 | 5000 | 0.1890 | 0.4470 | 0.4359 | 0.4413 | 0.9470 |
| 0.105 | 3.2014 | 5500 | 0.1942 | 0.4691 | 0.4675 | 0.4683 | 0.9494 |
| 0.0946 | 3.4924 | 6000 | 0.1912 | 0.4519 | 0.4515 | 0.4517 | 0.9491 |
| 0.0875 | 3.7835 | 6500 | 0.1861 | 0.4874 | 0.4696 | 0.4784 | 0.9516 |
| 0.079 | 4.0745 | 7000 | 0.1950 | 0.4950 | 0.5017 | 0.4983 | 0.9535 |
| 0.0641 | 4.3655 | 7500 | 0.1965 | 0.5102 | 0.5058 | 0.5080 | 0.9545 |
| 0.0627 | 4.6566 | 8000 | 0.1870 | 0.4881 | 0.5327 | 0.5094 | 0.9535 |
| 0.06 | 4.9476 | 8500 | 0.1986 | 0.5151 | 0.5056 | 0.5103 | 0.9550 |
| 0.0467 | 5.2386 | 9000 | 0.2012 | 0.5317 | 0.5217 | 0.5267 | 0.9552 |
| 0.0443 | 5.5297 | 9500 | 0.2116 | 0.5339 | 0.5125 | 0.5230 | 0.9552 |
| 0.0446 | 5.8207 | 10000 | 0.2037 | 0.5268 | 0.5232 | 0.5250 | 0.9556 |
| 0.0401 | 6.1118 | 10500 | 0.2191 | 0.5186 | 0.5510 | 0.5343 | 0.9563 |
| 0.0313 | 6.4028 | 11000 | 0.2224 | 0.5503 | 0.5269 | 0.5384 | 0.9569 |
| 0.0318 | 6.6938 | 11500 | 0.2233 | 0.5035 | 0.5485 | 0.5251 | 0.9546 |
| 0.0299 | 6.9849 | 12000 | 0.2302 | 0.5646 | 0.5056 | 0.5335 | 0.9566 |
| 0.0237 | 7.2759 | 12500 | 0.2427 | 0.5342 | 0.5403 | 0.5372 | 0.9564 |
| 0.0235 | 7.5669 | 13000 | 0.2487 | 0.5049 | 0.5425 | 0.5230 | 0.9557 |
| 0.0226 | 7.8580 | 13500 | 0.2501 | 0.5431 | 0.5325 | 0.5378 | 0.9569 |
| 0.0193 | 8.1490 | 14000 | 0.2425 | 0.5252 | 0.5604 | 0.5422 | 0.9567 |
| 0.0169 | 8.4400 | 14500 | 0.2520 | 0.5446 | 0.5423 | 0.5435 | 0.9568 |
| 0.0174 | 8.7311 | 15000 | 0.2516 | 0.5351 | 0.5725 | 0.5532 | 0.9570 |
| 0.0167 | 9.0221 | 15500 | 0.2772 | 0.5618 | 0.5335 | 0.5473 | 0.9581 |
| 0.0128 | 9.3132 | 16000 | 0.2577 | 0.5349 | 0.5754 | 0.5544 | 0.9575 |
| 0.013 | 9.6042 | 16500 | 0.2834 | 0.5483 | 0.5392 | 0.5437 | 0.9579 |
| 0.0129 | 9.8952 | 17000 | 0.2734 | 0.5573 | 0.5659 | 0.5615 | 0.9583 |
| 0.0108 | 10.1863 | 17500 | 0.2819 | 0.5346 | 0.5804 | 0.5566 | 0.9575 |
| 0.0096 | 10.4773 | 18000 | 0.3010 | 0.5129 | 0.5861 | 0.5471 | 0.9558 |
| 0.011 | 10.7683 | 18500 | 0.2832 | 0.5315 | 0.5859 | 0.5574 | 0.9576 |
| 0.009 | 11.0594 | 19000 | 0.3044 | 0.5479 | 0.5636 | 0.5556 | 0.9581 |
| 0.008 | 11.3504 | 19500 | 0.2994 | 0.5418 | 0.5800 | 0.5602 | 0.9578 |
| 0.0082 | 11.6414 | 20000 | 0.2879 | 0.5529 | 0.5640 | 0.5584 | 0.9579 |
| 0.0086 | 11.9325 | 20500 | 0.3122 | 0.5410 | 0.5467 | 0.5438 | 0.9578 |
| 0.0067 | 12.2235 | 21000 | 0.3093 | 0.5531 | 0.5676 | 0.5603 | 0.9586 |
| 0.0067 | 12.5146 | 21500 | 0.3113 | 0.5446 | 0.5644 | 0.5543 | 0.9580 |
| 0.0063 | 12.8056 | 22000 | 0.3014 | 0.5501 | 0.5813 | 0.5653 | 0.9580 |
| 0.0061 | 13.0966 | 22500 | 0.3200 | 0.5451 | 0.5610 | 0.5529 | 0.9582 |
| 0.0052 | 13.3877 | 23000 | 0.3071 | 0.5495 | 0.5659 | 0.5576 | 0.9582 |
| 0.0052 | 13.6787 | 23500 | 0.3079 | 0.5647 | 0.5640 | 0.5644 | 0.9586 |
| 0.0052 | 13.9697 | 24000 | 0.3142 | 0.5406 | 0.5750 | 0.5572 | 0.9583 |
| 0.004 | 14.2608 | 24500 | 0.3146 | 0.5610 | 0.5719 | 0.5664 | 0.9588 |
| 0.0039 | 14.5518 | 25000 | 0.3268 | 0.5504 | 0.5712 | 0.5606 | 0.9587 |
| 0.0045 | 14.8428 | 25500 | 0.3133 | 0.5569 | 0.5713 | 0.5640 | 0.9588 |
| 0.0043 | 15.1339 | 26000 | 0.3308 | 0.5599 | 0.5575 | 0.5587 | 0.9587 |
| 0.0031 | 15.4249 | 26500 | 0.3380 | 0.5493 | 0.5638 | 0.5565 | 0.9580 |
| 0.0035 | 15.7159 | 27000 | 0.3410 | 0.5559 | 0.5462 | 0.5510 | 0.9579 |
| 0.0033 | 16.0070 | 27500 | 0.3326 | 0.5550 | 0.5709 | 0.5628 | 0.9585 |
| 0.0028 | 16.2980 | 28000 | 0.3400 | 0.5580 | 0.5751 | 0.5664 | 0.9590 |
| 0.003 | 16.5891 | 28500 | 0.3418 | 0.5601 | 0.5624 | 0.5612 | 0.9586 |
| 0.003 | 16.8801 | 29000 | 0.3340 | 0.5394 | 0.5874 | 0.5624 | 0.9577 |
| 0.0029 | 17.1711 | 29500 | 0.3431 | 0.5511 | 0.5915 | 0.5706 | 0.9589 |
| 0.0025 | 17.4622 | 30000 | 0.3326 | 0.5545 | 0.5832 | 0.5685 | 0.9583 |
| 0.0023 | 17.7532 | 30500 | 0.3492 | 0.5364 | 0.5892 | 0.5616 | 0.9581 |
| 0.0025 | 18.0442 | 31000 | 0.3481 | 0.5655 | 0.5695 | 0.5675 | 0.9586 |
| 0.002 | 18.3353 | 31500 | 0.3450 | 0.5542 | 0.5768 | 0.5653 | 0.9585 |
| 0.0023 | 18.6263 | 32000 | 0.3443 | 0.5611 | 0.5700 | 0.5655 | 0.9589 |
| 0.0021 | 18.9173 | 32500 | 0.3484 | 0.5567 | 0.5871 | 0.5715 | 0.9588 |
| 0.002 | 19.2084 | 33000 | 0.3528 | 0.5675 | 0.5611 | 0.5643 | 0.9584 |
| 0.0018 | 19.4994 | 33500 | 0.3496 | 0.5511 | 0.5765 | 0.5635 | 0.9583 |
| 0.0019 | 19.7905 | 34000 | 0.3608 | 0.5666 | 0.5692 | 0.5679 | 0.9586 |
| 0.0015 | 20.0815 | 34500 | 0.3517 | 0.5674 | 0.5819 | 0.5745 | 0.9591 |
| 0.0015 | 20.3725 | 35000 | 0.3551 | 0.5603 | 0.5764 | 0.5682 | 0.9585 |
| 0.0015 | 20.6636 | 35500 | 0.3604 | 0.5724 | 0.5563 | 0.5642 | 0.9592 |
| 0.0015 | 20.9546 | 36000 | 0.3649 | 0.5709 | 0.5742 | 0.5726 | 0.9594 |
| 0.0013 | 21.2456 | 36500 | 0.3597 | 0.5482 | 0.5924 | 0.5694 | 0.9590 |
| 0.0012 | 21.5367 | 37000 | 0.3557 | 0.5710 | 0.5715 | 0.5712 | 0.9594 |
| 0.0013 | 21.8277 | 37500 | 0.3591 | 0.5774 | 0.5605 | 0.5688 | 0.9597 |
| 0.0013 | 22.1187 | 38000 | 0.3563 | 0.5679 | 0.5813 | 0.5745 | 0.9594 |
| 0.0007 | 22.4098 | 38500 | 0.3520 | 0.5590 | 0.5838 | 0.5711 | 0.9589 |
| 0.001 | 22.7008 | 39000 | 0.3606 | 0.5703 | 0.5711 | 0.5707 | 0.9593 |
| 0.001 | 22.9919 | 39500 | 0.3600 | 0.5618 | 0.5920 | 0.5765 | 0.9593 |
| 0.001 | 23.2829 | 40000 | 0.3595 | 0.5635 | 0.5709 | 0.5672 | 0.9586 |
| 0.0008 | 23.5739 | 40500 | 0.3658 | 0.5735 | 0.5689 | 0.5712 | 0.9592 |
| 0.001 | 23.8650 | 41000 | 0.3589 | 0.5677 | 0.5793 | 0.5734 | 0.9594 |
| 0.0007 | 24.1560 | 41500 | 0.3704 | 0.5819 | 0.5708 | 0.5763 | 0.9600 |
| 0.0008 | 24.4470 | 42000 | 0.3656 | 0.5779 | 0.5702 | 0.5740 | 0.9595 |
| 0.0007 | 24.7381 | 42500 | 0.3683 | 0.5647 | 0.5791 | 0.5718 | 0.9594 |
| 0.0008 | 25.0291 | 43000 | 0.3781 | 0.5766 | 0.5638 | 0.5701 | 0.9596 |
| 0.0007 | 25.3201 | 43500 | 0.3782 | 0.5738 | 0.5602 | 0.5669 | 0.9593 |
| 0.0006 | 25.6112 | 44000 | 0.3658 | 0.5736 | 0.5750 | 0.5743 | 0.9593 |
| 0.0006 | 25.9022 | 44500 | 0.3688 | 0.5788 | 0.5610 | 0.5698 | 0.9595 |
| 0.0005 | 26.1932 | 45000 | 0.3696 | 0.5769 | 0.5842 | 0.5805 | 0.9599 |
| 0.0004 | 26.4843 | 45500 | 0.3710 | 0.5781 | 0.5797 | 0.5789 | 0.9599 |
| 0.0004 | 26.7753 | 46000 | 0.3742 | 0.5772 | 0.5670 | 0.5721 | 0.9595 |
| 0.0006 | 27.0664 | 46500 | 0.3725 | 0.5672 | 0.5869 | 0.5769 | 0.9595 |
| 0.0003 | 27.3574 | 47000 | 0.3750 | 0.5613 | 0.5874 | 0.5740 | 0.9595 |
| 0.0004 | 27.6484 | 47500 | 0.3728 | 0.5633 | 0.5887 | 0.5757 | 0.9596 |
| 0.0005 | 27.9395 | 48000 | 0.3728 | 0.5645 | 0.5793 | 0.5718 | 0.9596 |
| 0.0003 | 28.2305 | 48500 | 0.3766 | 0.5719 | 0.5793 | 0.5756 | 0.9598 |
| 0.0003 | 28.5215 | 49000 | 0.3821 | 0.5697 | 0.5793 | 0.5744 | 0.9598 |
| 0.0004 | 28.8126 | 49500 | 0.3774 | 0.5726 | 0.5735 | 0.5731 | 0.9596 |
| 0.0004 | 29.1036 | 50000 | 0.3798 | 0.5779 | 0.5734 | 0.5756 | 0.9598 |
| 0.0004 | 29.3946 | 50500 | 0.3764 | 0.5721 | 0.5786 | 0.5753 | 0.9596 |
| 0.0003 | 29.6857 | 51000 | 0.3763 | 0.5731 | 0.5800 | 0.5766 | 0.9597 |
| 0.0002 | 29.9767 | 51500 | 0.3770 | 0.5737 | 0.5773 | 0.5755 | 0.9597 |
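
The card does not include the metric code. Entity-level precision/recall/F1 plus token accuracy, as in the columns above, are what `seqeval` reports for token classification; the sketch below is illustrative only (the toy labels and the use of the `evaluate` wrapper are assumptions, not the author's actual `compute_metrics`).

```python
# Illustrative computation of span-level metrics with seqeval via the `evaluate`
# library; treat this as an assumption about how the reported numbers were produced.
import evaluate

seqeval = evaluate.load("seqeval")

# Toy IOB2-tagged predictions and references (one inner list per sentence).
predictions = [["B-PER", "I-PER", "O", "B-LOC"]]
references  = [["B-PER", "I-PER", "O", "B-ORG"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```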
### Framework versions
- Transformers 4.44.2
- PyTorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1
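
A quick check that a local environment matches the versions above (a minimal sketch; the import names are the standard package names):

```python
# Print installed versions to compare against the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card lists 4.44.2
print("PyTorch:", torch.__version__)              # card lists 2.1.1+cu121
print("Datasets:", datasets.__version__)          # card lists 2.14.5
print("Tokenizers:", tokenizers.__version__)      # card lists 0.19.1
```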