---
license: cc
language:
- en
library_name: transformers
tags:
- social media
- contrastive learning
---
# The Skipped Beat: A Study of Sociopragmatic Understanding in LLMs for 64 Languages
<p align="center"> <a href="https://chiyuzhang94.github.io/" target="_blank">Chiyu Zhang</a>, Khai Duy Doan, Qisheng Liao, <a href="https://mageed.arts.ubc.ca/" target="_blank">Muhammad Abdul-Mageed</a></p>
<p align="center" float="left">
The University of British Columbia, Mohamed bin Zayed University of Artificial Intelligence
</p>
<p align="center">Published at the Main Conference of EMNLP 2023</p>
<p align="center"> <a href="https://arxiv.org/abs/2310.14557" target="_blank">Paper</a></p>
[![Code License](https://img.shields.io/badge/Code%20License-Apache_2.0-green.svg)]()
[![Data License](https://img.shields.io/badge/Data%20License-CC%20By%20NC%204.0-red.svg)]()
## Checkpoints of Models Pre-Trained with InfoDCL
We further pretrained XLM-R and RoBERTa with the InfoDCL framework ([Zhang et al., 2023](https://aclanthology.org/2023.findings-acl.152/)).
Multilingual Model:
* InfoDCL-XLMR trained with multilingual TweetEmoji-multi: https://huggingface.co/UBC-NLP/InfoDCL-Emoji-XLMR-Base
English Models:
* InfoDCL-RoBERTa trained with TweetEmoji-EN: https://huggingface.co/UBC-NLP/InfoDCL-emoji
* InfoDCL-RoBERTa trained with TweetHashtag-EN: https://huggingface.co/UBC-NLP/InfoDCL-hashtag
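
All of the checkpoints above can be loaded with the standard `transformers` Auto classes. As a minimal sketch (assuming you only want sentence-level features; the mean-pooling helper below is our illustration, not part of the released models):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Any of the InfoDCL checkpoints listed above can be loaded the same way;
# here we use the English emoji-supervised model as an example.
model_name = "UBC-NLP/InfoDCL-emoji"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

def embed(texts):
    """Mean-pool the last hidden states into one vector per input text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state   # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)    # (batch, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

vecs = embed(["so happy today 😊", "worst day ever 😡"])
```

The resulting vectors can be used directly for retrieval or fed to a lightweight classifier for downstream sociopragmatic tasks.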
## Citation
Please cite us if you find our data or models useful.
```bibtex
@inproceedings{zhang-etal-2023-skipped,
title = "The Skipped Beat: A Study of Sociopragmatic Understanding in LLMs for 64 Languages",
author = "Zhang, Chiyu and
Doan, Khai Duy and
Liao, Qisheng and
Abdul-Mageed, Muhammad",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
year = "2023",
publisher = "Association for Computational Linguistics",
}
```