# twitter-roberta-base-sentiment-latest-trump-stance-1
This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-sentiment-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-sentiment-latest) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.1168
- Accuracy: 0.6667
- Precision: 0.5698
- Recall: 0.7302
- F1 Score: 0.6401
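As a quick sanity check on the numbers above, the reported F1 score is the harmonic mean of the reported precision and recall (a minimal sketch using the unrounded values behind these metrics):

```python
# Evaluation-set precision and recall reported for this model
precision = 0.5697940503432495
recall = 0.7302052785923754

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.6401, matching the reported F1 score
```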
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
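Collected into one place, the hyperparameters above would map onto `transformers.TrainingArguments` fields roughly as follows (a sketch only; the original training script is not part of this card, and the keyword names are the standard `TrainingArguments` parameters, not taken from the source):

```python
# Hyperparameters reported above, keyed by their usual
# transformers.TrainingArguments parameter names (sketch, not the
# actual training script used for this model).
training_config = {
    "learning_rate": 1e-3,
    "per_device_train_batch_size": 4,
    "per_device_eval_batch_size": 4,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 50,
}
```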
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 Score |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:---------:|:------:|:--------:|
| 0.583 | 1.0 | 3600 | 0.3772 | 0.8388 | 0.8129 | 0.8800 | 0.8451 |
| 0.5621 | 2.0 | 7200 | 0.3725 | 0.8531 | 0.9407 | 0.7538 | 0.8369 |
| 0.5813 | 3.0 | 10800 | 1.0373 | 0.6256 | 0.5719 | 0.9988 | 0.7274 |
| 0.5317 | 4.0 | 14400 | 0.3697 | 0.8756 | 0.8918 | 0.8550 | 0.8730 |
| 0.5498 | 5.0 | 18000 | 0.4457 | 0.8525 | 0.8552 | 0.8488 | 0.8519 |
| 0.5388 | 6.0 | 21600 | 0.4715 | 0.8294 | 0.9137 | 0.7275 | 0.8100 |
| 0.5885 | 7.0 | 25200 | 0.3773 | 0.8588 | 0.8837 | 0.8263 | 0.8540 |
| 0.4961 | 8.0 | 28800 | 0.3819 | 0.8694 | 0.9053 | 0.8250 | 0.8633 |
| 0.5421 | 9.0 | 32400 | 0.4011 | 0.8588 | 0.8239 | 0.9125 | 0.8660 |
| 0.5123 | 10.0 | 36000 | 0.3404 | 0.8813 | 0.9034 | 0.8538 | 0.8779 |
| 0.5996 | 11.0 | 39600 | 0.3435 | 0.8806 | 0.8801 | 0.8813 | 0.8807 |
| 0.4871 | 12.0 | 43200 | 0.2972 | 0.8906 | 0.9022 | 0.8763 | 0.8890 |
| 0.5272 | 13.0 | 46800 | 0.3629 | 0.8744 | 0.9424 | 0.7975 | 0.8639 |
| 0.5897 | 14.0 | 50400 | 0.3164 | 0.8800 | 0.9075 | 0.8463 | 0.8758 |
| 0.4963 | 15.0 | 54000 | 0.3343 | 0.8763 | 0.9228 | 0.8213 | 0.8690 |
| 0.5132 | 16.0 | 57600 | 0.5593 | 0.8556 | 0.9330 | 0.7663 | 0.8415 |
| 0.447 | 17.0 | 61200 | 0.3651 | 0.8744 | 0.8544 | 0.9025 | 0.8778 |
| 0.5189 | 18.0 | 64800 | 0.3919 | 0.8781 | 0.9315 | 0.8163 | 0.8701 |
| 0.4835 | 19.0 | 68400 | 0.5706 | 0.8469 | 0.9542 | 0.7288 | 0.8264 |
| 0.455 | 20.0 | 72000 | 0.3523 | 0.8819 | 0.8814 | 0.8825 | 0.8819 |
| 0.4791 | 21.0 | 75600 | 0.3292 | 0.8844 | 0.8547 | 0.9263 | 0.8890 |
| 0.512 | 22.0 | 79200 | 0.4456 | 0.8700 | 0.9392 | 0.7913 | 0.8589 |
| 0.4783 | 23.0 | 82800 | 0.3283 | 0.8806 | 0.9188 | 0.8350 | 0.8749 |
| 0.4699 | 24.0 | 86400 | 0.3399 | 0.8850 | 0.9074 | 0.8575 | 0.8817 |
| 0.4485 | 25.0 | 90000 | 0.3156 | 0.8900 | 0.8949 | 0.8838 | 0.8893 |
| 0.4305 | 26.0 | 93600 | 0.3105 | 0.8944 | 0.9092 | 0.8763 | 0.8924 |
| 0.4704 | 27.0 | 97200 | 0.3528 | 0.8794 | 0.8635 | 0.9013 | 0.8820 |
| 0.4589 | 28.0 | 100800 | 0.3534 | 0.8794 | 0.8697 | 0.8925 | 0.8809 |
| 0.4831 | 29.0 | 104400 | 0.3315 | 0.8919 | 0.9109 | 0.8688 | 0.8893 |
| 0.4931 | 30.0 | 108000 | 0.3200 | 0.8919 | 0.9186 | 0.8600 | 0.8883 |
| 0.4286 | 31.0 | 111600 | 0.3488 | 0.8825 | 0.9180 | 0.8400 | 0.8773 |
| 0.4309 | 32.0 | 115200 | 0.3192 | 0.8919 | 0.8875 | 0.8975 | 0.8925 |
| 0.3896 | 33.0 | 118800 | 0.3294 | 0.8819 | 0.8633 | 0.9075 | 0.8848 |
| 0.4327 | 34.0 | 122400 | 0.3003 | 0.8994 | 0.9347 | 0.8588 | 0.8951 |
| 0.4179 | 35.0 | 126000 | 0.3189 | 0.8981 | 0.9369 | 0.8538 | 0.8934 |
| 0.4023 | 36.0 | 129600 | 0.3284 | 0.8775 | 0.8409 | 0.9313 | 0.8837 |
| 0.4285 | 37.0 | 133200 | 0.3221 | 0.8944 | 0.9281 | 0.8550 | 0.8900 |
| 0.3988 | 38.0 | 136800 | 0.2861 | 0.8969 | 0.8905 | 0.9050 | 0.8977 |
| 0.4034 | 39.0 | 140400 | 0.3501 | 0.8956 | 0.9439 | 0.8413 | 0.8896 |
| 0.3743 | 40.0 | 144000 | 0.3654 | 0.8869 | 0.9177 | 0.8500 | 0.8825 |
| 0.3979 | 41.0 | 147600 | 0.3230 | 0.8994 | 0.9312 | 0.8625 | 0.8955 |
| 0.3808 | 42.0 | 151200 | 0.2978 | 0.9038 | 0.9206 | 0.8838 | 0.9018 |
| 0.3777 | 43.0 | 154800 | 0.2805 | 0.8994 | 0.9221 | 0.8725 | 0.8966 |
| 0.3631 | 44.0 | 158400 | 0.2984 | 0.8981 | 0.9163 | 0.8763 | 0.8958 |
| 0.3674 | 45.0 | 162000 | 0.2924 | 0.9038 | 0.9377 | 0.8650 | 0.8999 |
| 0.3539 | 46.0 | 165600 | 0.3158 | 0.8938 | 0.8997 | 0.8863 | 0.8929 |
| 0.3557 | 47.0 | 169200 | 0.2861 | 0.9000 | 0.9145 | 0.8825 | 0.8982 |
| 0.38 | 48.0 | 172800 | 0.2962 | 0.8944 | 0.9029 | 0.8838 | 0.8932 |
| 0.3754 | 49.0 | 176400 | 0.2905 | 0.9000 | 0.9167 | 0.8800 | 0.8980 |
| 0.3717 | 50.0 | 180000 | 0.2880 | 0.8988 | 0.9154 | 0.8788 | 0.8967 |
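The step counts in the table also imply the rough size of the training set. Each epoch spans 3,600 optimizer steps at a batch size of 4, so, assuming one optimizer step per batch (i.e. no gradient accumulation, which the hyperparameters do not mention), the training set holds about 14,400 examples, and 50 epochs yield the 180,000 total steps seen in the final row:

```python
# Values taken from the hyperparameters and results table above
steps_per_epoch = 3600
batch_size = 4
epochs = 50

# Assuming no gradient accumulation (one optimizer step per batch)
train_examples = steps_per_epoch * batch_size
total_steps = steps_per_epoch * epochs
print(train_examples, total_steps)  # → 14400 180000
```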
### Framework versions
- PEFT 0.10.0
- Transformers 4.38.2
- Pytorch 2.2.1
- Datasets 2.18.0
- Tokenizers 0.15.2