---
license: cc-by-sa-4.0
---
# Flair-abbr-roberta-pubmed-plos-filtered

This model stacks embeddings from [roberta-large](https://huggingface.co/FacebookAI/roberta-large), the [HunFlair PubMed models](https://github.com/flairNLP/flair/blob/master/resources/docs/HUNFLAIR.md), and [character-level language models trained on PLOS](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection/tree/main/clm), fine-tuned on the [PLODv2 filtered dataset](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection) for abbreviation and long-form detection.
It is released with our LREC-COLING 2024 publication (coming soon). It achieves the following results on the test set:

Results on abbreviations:
- Precision: 0.8924
- Recall: 0.9375
- F1: 0.9144


Results on long forms:
- Precision: 0.8750
- Recall: 0.9225
- F1: 0.8981