---
license: cc-by-sa-4.0
---
|
# Flair-abbr-roberta-pubmed-plos-filtered |
|
|
|
This is a stacked model that combines embeddings from [roberta-large](https://huggingface.co/FacebookAI/roberta-large), the [HunFlair PubMed models](https://github.com/flairNLP/flair/blob/master/resources/docs/HUNFLAIR.md), and [character-level language models trained on PLOS](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection/tree/main/clm), fine-tuned on the [PLODv2 filtered dataset](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection) for abbreviation and long-form detection.
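
A minimal sketch of how such an embedding stack can be assembled in Flair is shown below. The PLOS checkpoint paths are placeholders for the character-level LM checkpoints from the PLODv2-CLM4AbbrDetection repository, not files shipped with this model card:

```python
from flair.embeddings import (
    FlairEmbeddings,
    StackedEmbeddings,
    TransformerWordEmbeddings,
)

# Contextual subword embeddings from roberta-large
roberta = TransformerWordEmbeddings("FacebookAI/roberta-large")

# HunFlair character-level LMs pre-trained on PubMed abstracts
pubmed_forward = FlairEmbeddings("pubmed-forward")
pubmed_backward = FlairEmbeddings("pubmed-backward")

# Character-level LMs trained on PLOS; the paths below are
# placeholders for the CLM checkpoints released in the
# PLODv2-CLM4AbbrDetection repository.
plos_forward = FlairEmbeddings("clm/plos-forward.pt")
plos_backward = FlairEmbeddings("clm/plos-backward.pt")

# Concatenate all embeddings into a single stacked representation
embeddings = StackedEmbeddings(
    [roberta, pubmed_forward, pubmed_backward, plos_forward, plos_backward]
)
```

In Flair, a `StackedEmbeddings` instance like this is passed to a `SequenceTagger`, which is then fine-tuned on the PLODv2 corpus with Flair's `ModelTrainer`.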
|
It is released with our LREC-COLING 2024 publication [Using character-level models for efficient abbreviation and long-form detection](https://aclanthology.org/2024.lrec-main.270/). It achieves the following results on the test set: |
|
|
|
Results on abbreviations:

- Precision: 0.8924
- Recall: 0.9375
- F1: 0.9144

Results on long forms:

- Precision: 0.8750
- Recall: 0.9225
- F1: 0.8981
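
For inference, the model can be loaded as a standard Flair `SequenceTagger`. The model identifier below is a placeholder for the actual Hugging Face repo id (or a local path to the trained model), and the printed tags assume the PLODv2 label set (AC for abbreviations, LF for long forms):

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# The identifier below is a placeholder -- replace it with the
# actual Hugging Face repo id of this model, or a local model path.
tagger = SequenceTagger.load("user/flair-abbr-roberta-pubmed-plos-filtered")

sentence = Sentence(
    "Light dissolved inorganic carbon (DIC) resulted from the oxidation of hydrocarbons."
)
tagger.predict(sentence)

# Print detected spans; tags assume the PLODv2 label set
# (AC = abbreviation, LF = long form)
for span in sentence.get_spans():
    print(span.text, span.tag)
```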