---
language:
- hu
tags:
- text-classification
license: mit
metrics:
- accuracy
widget:
- text: "Jó reggelt! majd küldöm az élményhozókat :)."
---
# Hungarian Sentence-level Sentiment Analysis Model with XLM-RoBERTa
For further models, scripts and details, see our repository or our demo site.
- Pretrained model used: XLM-RoBERTa base
- Finetuned on Hungarian Twitter Sentiment (HTS) Corpus
- Labels: 0 (very negative), 1 (negative), 2 (neutral), 3 (positive), 4 (very positive)
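
The snippet below is a minimal usage sketch with the Hugging Face `transformers` library. The model identifier `NYTK/sentiment-hts5-xlm-roberta-hungarian` is an assumption used for illustration; replace it with this model's actual repository name.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

# Assumed repository name for illustration only; use this model's actual ID.
model_id = "NYTK/sentiment-hts5-xlm-roberta-hungarian"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# The "text-classification" pipeline returns the highest-scoring label for each input;
# the label indices follow the 0-4 scheme listed above.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Jó reggelt! majd küldöm az élményhozókat :)."))
```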
## Limitations
- max_seq_length = 128
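
Because the model was finetuned with a maximum sequence length of 128, longer inputs should be truncated to 128 tokens at inference time. The sketch below shows this with standard tokenizer arguments; the model identifier is again an assumption.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repository name, as in the sketch above.
model_id = "NYTK/sentiment-hts5-xlm-roberta-hungarian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Truncate inputs to 128 tokens to match the finetuning setting (max_seq_length = 128).
encoded = tokenizer(
    "Jó reggelt! majd küldöm az élményhozókat :).",
    truncation=True,
    max_length=128,
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**encoded).logits
print(int(logits.argmax(dim=-1)))  # index on the 0 (very negative) .. 4 (very positive) scale
```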
## Results
| Model | HTS2 (accuracy, %) | HTS5 (accuracy, %) |
|---|---|---|
| huBERT | 85.56 | 68.99 |
| XLM-RoBERTa | 85.56 | 66.50 |
## Citation
If you use this model, please cite the following paper:
```bibtex
@article{laki-yang-sentiment,
  author = {Laki, László János and Yang, Zijian Győző},
  title = {Sentiment Analysis with Neural Models for Hungarian},
  journal = {Acta Polytechnica Hungarica},
  year = {2023},
  publisher = {Obuda University},
  volume = {20},
  number = {5},
  doi = {10.12700/APH.20.5.2023.5.8},
  pages = {109--128},
  url = {https://acta.uni-obuda.hu/Laki_Yang_134.pdf}
}
```