---
license: mit
base_model: roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
- recall
- f1
model-index:
- name: lora-roberta-large_6
results: []
library_name: peft
---
# lora-roberta-large_6
This model is a [PEFT](https://github.com/huggingface/peft) LoRA adapter fine-tuned from [roberta-large](https://huggingface.co/roberta-large) for seven-class emotion classification; the training dataset is not documented.
It achieves the following results on the evaluation set (overall scores are macro-averaged unless noted; a reproduction sketch for the metrics follows the list):
- Loss: 0.7708
- Accuracy: 0.7226
- Precision (macro): 0.6770
- Recall (macro): 0.6754
- F1 (macro): 0.6758
- Balanced accuracy: 0.6754
- Micro F1: 0.7226
- Precision / Recall / F1 (joy): 0.8170 / 0.8717 / 0.8435
- Precision / Recall / F1 (anger): 0.6503 / 0.6330 / 0.6415
- Precision / Recall / F1 (disgust): 0.5151 / 0.4920 / 0.5033
- Precision / Recall / F1 (fear): 0.7595 / 0.7766 / 0.7680
- Precision / Recall / F1 (neutral): 0.7003 / 0.6541 / 0.6764
- Precision / Recall / F1 (sadness): 0.7354 / 0.7598 / 0.7474
- Precision / Recall / F1 (surprise): 0.5613 / 0.5406 / 0.5508
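
These scores can be reproduced from per-example predictions with scikit-learn. The sketch below is an assumption about how the trainer computed them (the evaluation script is not published); note that, as expected for single-label multiclass data, the reported balanced accuracy equals macro recall (0.6754) and micro F1 equals accuracy (0.7226).

```python
# Hedged sketch: how the reported metrics could be computed with scikit-learn.
# The actual evaluation code and label order are not published; LABELS is an
# assumption based on the per-class metrics reported above.
from sklearn.metrics import (
    accuracy_score,
    balanced_accuracy_score,
    f1_score,
    precision_recall_fscore_support,
)

LABELS = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]

def compute_metrics(y_true, y_pred):
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "b_acc": balanced_accuracy_score(y_true, y_pred),       # == macro recall
        "micro_f1": f1_score(y_true, y_pred, average="micro"),  # == accuracy here
    }
    # Macro-averaged overall scores ("Precision/Recall/F1 (macro)" above).
    p, r, f, _ = precision_recall_fscore_support(y_true, y_pred, average="macro")
    metrics.update({"precision": p, "recall": r, "f1": f})
    # Per-class scores, one precision/recall/F1 triple per emotion.
    p, r, f, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=list(range(len(LABELS))), average=None
    )
    for i, name in enumerate(LABELS):
        metrics[f"prec_{name}"] = p[i]
        metrics[f"recall_{name}"] = r[i]
        metrics[f"f1_{name}"] = f[i]
    return metrics
```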
## Model description
This checkpoint is a LoRA (low-rank adaptation) adapter, trained with the [PEFT](https://github.com/huggingface/peft) library, that adapts `roberta-large` to a seven-class emotion classification task (joy, anger, disgust, fear, neutral, sadness, surprise). Only the adapter weights are stored; the `roberta-large` base model is required at load time. The exact LoRA configuration (rank, alpha, dropout, target modules) is not recorded in this card; a representative setup is sketched below.
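
A minimal sketch of such an adapter setup, assuming typical LoRA settings for RoBERTa; the rank, alpha, dropout, and target modules shown are placeholders, not the values used for this checkpoint:

```python
# Representative LoRA setup for roberta-large sequence classification.
# All LoRA hyperparameters below are assumptions; the checkpoint's actual
# configuration is not documented in this card.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSequenceClassification

base = AutoModelForSequenceClassification.from_pretrained(
    "roberta-large", num_labels=7  # joy/anger/disgust/fear/neutral/sadness/surprise
)
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,                                # assumed rank
    lora_alpha=16,                      # assumed scaling factor
    lora_dropout=0.1,                   # assumed dropout
    target_modules=["query", "value"],  # assumed attention projections
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter (and head) train
```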
## Intended uses & limitations
The model is intended for single-label emotion classification over short text, assigning one of seven labels (joy, anger, disgust, fear, neutral, sadness, surprise). Since the base model is English `roberta-large`, good performance should not be expected on other languages. Per-class quality varies substantially: joy (F1 0.8435) and fear (F1 0.7680) are recognized well, while disgust (F1 0.5033) and surprise (F1 0.5508) are considerably weaker, so downstream use should account for this imbalance. A hedged inference sketch follows.
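
A minimal inference sketch, assuming the adapter is published at a repo id like `anniew666/lora-roberta-large_6` (hypothetical) and that the label order matches the metric listing above:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

ADAPTER = "anniew666/lora-roberta-large_6"  # hypothetical repo id
LABELS = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]  # assumed order

tokenizer = AutoTokenizer.from_pretrained("roberta-large")
base = AutoModelForSequenceClassification.from_pretrained("roberta-large", num_labels=7)
# Loads the LoRA weights (and the trained classification head) on top of the base model.
model = PeftModel.from_pretrained(base, ADAPTER)
model.eval()

inputs = tokenizer("I can't believe this actually worked!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(LABELS[logits.argmax(dim=-1).item()])
```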
## Training and evaluation data
The specific training and evaluation datasets are not documented. The per-class metrics above imply a single-label emotion corpus with the label set joy, anger, disgust, fear, neutral, sadness, surprise. From the training log below, one epoch corresponds to 676 steps at batch size 128, suggesting a training split on the order of 86k examples (assuming a single device).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 10.0
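
The listed values map onto `transformers.TrainingArguments` roughly as follows; this is a reconstruction under assumptions (single device, everything not listed left at library defaults), not the original training script:

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above. Anything not in that
# list (eval/save cadence, weight decay, etc.) is assumed or left at defaults.
training_args = TrainingArguments(
    output_dir="lora-roberta-large_6",
    learning_rate=1e-3,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=10.0,
    # The listed Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer
    # defaults in this Transformers version, so they need no explicit setting.
)
```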
### Training results

Abbreviations in the table follow the evaluation list above: Prec = macro-averaged precision, B Acc = balanced accuracy, Micro F1 = micro-averaged F1, and the per-emotion columns give class-wise precision, recall, and F1. Metrics were logged every half epoch (338 steps).
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Prec | Recall | F1 | B Acc | Micro F1 | Prec Joy | Recall Joy | F1 Joy | Prec Anger | Recall Anger | F1 Anger | Prec Disgust | Recall Disgust | F1 Disgust | Prec Fear | Recall Fear | F1 Fear | Prec Neutral | Recall Neutral | F1 Neutral | Prec Sadness | Recall Sadness | F1 Sadness | Prec Surprise | Recall Surprise | F1 Surprise |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|:--------:|:--------:|:----------:|:------:|:----------:|:------------:|:--------:|:------------:|:--------------:|:----------:|:---------:|:-----------:|:-------:|:------------:|:--------------:|:----------:|:------------:|:--------------:|:----------:|:-------------:|:---------------:|:-----------:|
| 1.0422        | 0.5   | 338  | 1.0252          | 0.6383   | 0.6048 | 0.5558 | 0.5679 | 0.5558 | 0.6383   | 0.7408   | 0.8105     | 0.7741 | 0.6150     | 0.4301       | 0.5062   | 0.5039       | 0.4089         | 0.4515     | 0.6016    | 0.6311      | 0.6160  | 0.5776       | 0.6623         | 0.6171     | 0.6128       | 0.6821         | 0.6456     | 0.5819        | 0.2654          | 0.3646      |
| 0.9507 | 1.0 | 676 | 0.9057 | 0.6720 | 0.6285 | 0.6125 | 0.6124 | 0.6125 | 0.6720 | 0.8439 | 0.7699 | 0.8052 | 0.6647 | 0.4487 | 0.5357 | 0.3456 | 0.5112 | 0.4124 | 0.6759 | 0.6967 | 0.6862 | 0.5849 | 0.7567 | 0.6598 | 0.7325 | 0.6496 | 0.6886 | 0.5519 | 0.4550 | 0.4988 |
| 0.857 | 1.5 | 1014 | 0.8438 | 0.6919 | 0.6551 | 0.6178 | 0.6317 | 0.6178 | 0.6919 | 0.7876 | 0.8508 | 0.8180 | 0.6563 | 0.5295 | 0.5861 | 0.5631 | 0.3706 | 0.4470 | 0.6896 | 0.7193 | 0.7041 | 0.6378 | 0.6808 | 0.6586 | 0.7049 | 0.7207 | 0.7127 | 0.5464 | 0.4529 | 0.4953 |
| 0.854 | 2.0 | 1352 | 0.8394 | 0.6889 | 0.6466 | 0.6265 | 0.6224 | 0.6265 | 0.6889 | 0.7776 | 0.8634 | 0.8183 | 0.5476 | 0.6742 | 0.6043 | 0.6183 | 0.2588 | 0.3649 | 0.6074 | 0.7766 | 0.6817 | 0.7014 | 0.5789 | 0.6343 | 0.7265 | 0.7213 | 0.7239 | 0.5475 | 0.5125 | 0.5294 |
| 0.816         | 2.5   | 1690 | 0.8139          | 0.7003   | 0.6535 | 0.6491 | 0.6477 | 0.6491 | 0.7003   | 0.7870   | 0.8572     | 0.8206 | 0.6328     | 0.5392       | 0.5823   | 0.4318       | 0.5463         | 0.4824     | 0.7305    | 0.7500      | 0.7401  | 0.6650       | 0.6688         | 0.6669     | 0.7220       | 0.7411         | 0.7314     | 0.6057        | 0.4410          | 0.5103      |
| 0.7876 | 3.0 | 2028 | 0.7993 | 0.7046 | 0.6800 | 0.6331 | 0.6435 | 0.6331 | 0.7046 | 0.8217 | 0.8347 | 0.8281 | 0.6518 | 0.5901 | 0.6194 | 0.6825 | 0.2748 | 0.3918 | 0.7182 | 0.7520 | 0.7347 | 0.6613 | 0.6825 | 0.6717 | 0.7181 | 0.7285 | 0.7233 | 0.5063 | 0.5688 | 0.5357 |
| 0.7702 | 3.51 | 2366 | 0.7760 | 0.7089 | 0.6819 | 0.6232 | 0.6415 | 0.6232 | 0.7089 | 0.8217 | 0.8508 | 0.8360 | 0.6257 | 0.6015 | 0.6134 | 0.5932 | 0.3355 | 0.4286 | 0.7407 | 0.7377 | 0.7392 | 0.6402 | 0.7273 | 0.6810 | 0.7131 | 0.7453 | 0.7289 | 0.6388 | 0.3640 | 0.4638 |
| 0.7555 | 4.01 | 2704 | 0.7814 | 0.7131 | 0.6807 | 0.6286 | 0.6449 | 0.6286 | 0.7131 | 0.8254 | 0.8469 | 0.8360 | 0.6161 | 0.6330 | 0.6244 | 0.5854 | 0.3067 | 0.4025 | 0.7382 | 0.7398 | 0.7390 | 0.6541 | 0.7225 | 0.6866 | 0.7254 | 0.7459 | 0.7355 | 0.6202 | 0.4052 | 0.4902 |
| 0.741 | 4.51 | 3042 | 0.7749 | 0.7109 | 0.6598 | 0.6501 | 0.6528 | 0.6501 | 0.7109 | 0.7999 | 0.8630 | 0.8303 | 0.6406 | 0.6095 | 0.6247 | 0.4922 | 0.4026 | 0.4429 | 0.6655 | 0.7828 | 0.7194 | 0.6733 | 0.6852 | 0.6792 | 0.7714 | 0.6948 | 0.7311 | 0.5754 | 0.5125 | 0.5421 |
| 0.7095 | 5.01 | 3380 | 0.7921 | 0.7098 | 0.6848 | 0.6314 | 0.6435 | 0.6314 | 0.7098 | 0.8150 | 0.8521 | 0.8331 | 0.6051 | 0.6492 | 0.6264 | 0.6585 | 0.2588 | 0.3716 | 0.7590 | 0.7357 | 0.7471 | 0.6878 | 0.6514 | 0.6691 | 0.6712 | 0.8025 | 0.7310 | 0.5970 | 0.4702 | 0.5261 |
| 0.682 | 5.51 | 3718 | 0.7824 | 0.7135 | 0.6527 | 0.6808 | 0.6634 | 0.6808 | 0.7135 | 0.8204 | 0.8595 | 0.8395 | 0.6605 | 0.5740 | 0.6142 | 0.4507 | 0.5847 | 0.5090 | 0.6404 | 0.8176 | 0.7183 | 0.6955 | 0.6541 | 0.6742 | 0.7398 | 0.7429 | 0.7414 | 0.5616 | 0.5330 | 0.5470 |
| 0.6866 | 6.01 | 4056 | 0.7794 | 0.7169 | 0.6916 | 0.6527 | 0.6648 | 0.6527 | 0.7169 | 0.7892 | 0.8817 | 0.8329 | 0.6514 | 0.6435 | 0.6474 | 0.6391 | 0.3450 | 0.4481 | 0.7964 | 0.7295 | 0.7615 | 0.7066 | 0.6306 | 0.6664 | 0.7380 | 0.7580 | 0.7478 | 0.5209 | 0.5807 | 0.5492 |
| 0.6675 | 6.51 | 4394 | 0.7683 | 0.7185 | 0.6741 | 0.6623 | 0.6666 | 0.6623 | 0.7185 | 0.8392 | 0.8479 | 0.8435 | 0.6157 | 0.6605 | 0.6373 | 0.5502 | 0.4377 | 0.4875 | 0.7325 | 0.7520 | 0.7422 | 0.6869 | 0.6719 | 0.6793 | 0.7151 | 0.7676 | 0.7404 | 0.5793 | 0.4984 | 0.5358 |
| 0.6627 | 7.01 | 4732 | 0.7740 | 0.7128 | 0.6803 | 0.6512 | 0.6578 | 0.6512 | 0.7128 | 0.8170 | 0.8717 | 0.8435 | 0.5925 | 0.6912 | 0.6381 | 0.6258 | 0.3259 | 0.4286 | 0.7723 | 0.7439 | 0.7578 | 0.7017 | 0.6141 | 0.6550 | 0.7269 | 0.7580 | 0.7421 | 0.5263 | 0.5536 | 0.5396 |
| 0.649 | 7.51 | 5070 | 0.7704 | 0.7188 | 0.6870 | 0.6538 | 0.6644 | 0.6538 | 0.7188 | 0.8133 | 0.8679 | 0.8397 | 0.6294 | 0.6548 | 0.6418 | 0.6257 | 0.3578 | 0.4553 | 0.7530 | 0.7684 | 0.7606 | 0.6890 | 0.6596 | 0.6740 | 0.7144 | 0.7664 | 0.7395 | 0.5839 | 0.5016 | 0.5396 |
| 0.6506 | 8.01 | 5408 | 0.7820 | 0.7166 | 0.6728 | 0.6580 | 0.6631 | 0.6580 | 0.7166 | 0.7959 | 0.8824 | 0.8369 | 0.6361 | 0.6403 | 0.6382 | 0.4980 | 0.4026 | 0.4452 | 0.7887 | 0.7418 | 0.7645 | 0.7194 | 0.6196 | 0.6658 | 0.6969 | 0.7905 | 0.7408 | 0.5741 | 0.5287 | 0.5505 |
| 0.6054 | 8.51 | 5746 | 0.7668 | 0.7207 | 0.6778 | 0.6675 | 0.6717 | 0.6675 | 0.7207 | 0.8272 | 0.8640 | 0.8452 | 0.6337 | 0.6475 | 0.6405 | 0.5490 | 0.4473 | 0.4930 | 0.7557 | 0.7480 | 0.7518 | 0.6937 | 0.6555 | 0.6740 | 0.7328 | 0.7610 | 0.7466 | 0.5523 | 0.5493 | 0.5508 |
| 0.6044        | 9.01  | 6084 | 0.7678          | 0.7206   | 0.6729 | 0.6652 | 0.6679 | 0.6652 | 0.7206   | 0.8235   | 0.8634     | 0.8430 | 0.6180     | 0.6564       | 0.6366   | 0.5110       | 0.4441         | 0.4752     | 0.7216    | 0.7807      | 0.7500  | 0.6861       | 0.6811         | 0.6836     | 0.7553       | 0.7249         | 0.7398     | 0.5949        | 0.5060          | 0.5468      |
| 0.571 | 9.51 | 6422 | 0.7741 | 0.7210 | 0.6832 | 0.6609 | 0.6698 | 0.6609 | 0.7210 | 0.8221 | 0.8682 | 0.8445 | 0.6207 | 0.6629 | 0.6411 | 0.5579 | 0.4153 | 0.4762 | 0.8032 | 0.7111 | 0.7543 | 0.7040 | 0.6528 | 0.6774 | 0.7279 | 0.7634 | 0.7452 | 0.5466 | 0.5525 | 0.5496 |
### Framework versions
- Transformers 4.32.0.dev0
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.11.0