Commit 3ea7e16 by dipteshkanojia (parent: e31fb85)

Update README.md

Files changed (1): README.md (+8 -4)
@@ -9,7 +9,7 @@ licenses:
 - cc-by-sa4.0
 multilinguality:
 - monolingual
-paperswithcode_id: acronym-identification
+paperswithcode_id: plod-filtered
 pretty_name: 'PLOD: An Abbreviation Detection Dataset'
 size_categories:
 - 100K<n<1M
@@ -156,10 +156,14 @@ Now, you can use the notebook to reproduce the experiments.
 
 ### Model(s)
 
-The working model(s) are present here at these links:<br/>
-
-- [RoBERTa-Large (Unfiltered)](https://huggingface.co/surrey-nlp/roberta-large-finetuned-abbr)
-- [RoBERTa-Base (Filtered)](https://huggingface.co/surrey-nlp/roberta-base-finetuned-abbr)
+Our best-performing models are hosted on the HuggingFace models repository:
+
+| Models | [`PLOD - Unfiltered`](https://huggingface.co/datasets/surrey-nlp/PLOD-unfiltered) | [`PLOD - Filtered`](https://huggingface.co/datasets/surrey-nlp/PLOD-filtered) | Description |
+| --- | :---: | :---: | --- |
+| [RoBERTa<sub>large</sub>](https://huggingface.co/roberta-large) | [RoBERTa<sub>large</sub>-finetuned-abbr](https://huggingface.co/surrey-nlp/roberta-large-finetuned-abbr) | -soon- | Fine-tuning on the RoBERTa<sub>large</sub> language model |
+| [RoBERTa<sub>base</sub>](https://huggingface.co/roberta-base) | -soon- | [RoBERTa<sub>base</sub>-finetuned-abbr](https://huggingface.co/surrey-nlp/roberta-base-finetuned-abbr) | Fine-tuning on the RoBERTa<sub>base</sub> language model |
+| [AlBERT<sub>large-v2</sub>](https://huggingface.co/albert-large-v2) | [AlBERT<sub>large-v2</sub>-finetuned-abbDet](https://huggingface.co/surrey-nlp/albert-large-v2-finetuned-abbDet) | -soon- | Fine-tuning on the AlBERT<sub>large-v2</sub> language model |
 
 On the link provided above, the model(s) can be used with the help of the Inference API via the web-browser itself. We have placed some examples with the API for testing.<br/>
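Beyond the browser-based Inference API mentioned in the diff, the fine-tuned checkpoints in the table can also be run locally. A minimal sketch using the `transformers` token-classification pipeline, with the model name taken from the table above; the `aggregation_strategy` setting and the sample sentence are illustrative choices, not part of the dataset card:

```python
from transformers import pipeline

# Load the fine-tuned abbreviation-detection model from the HuggingFace Hub.
# Note: this downloads the model weights (roughly 1.4 GB) on first use.
ner = pipeline(
    "token-classification",
    model="surrey-nlp/roberta-large-finetuned-abbr",
    aggregation_strategy="simple",  # merge sub-word tokens into whole-word spans
)

text = "Light dissolved inorganic carbon (DIC) resulted in a high growth rate."
predictions = ner(text)

# Each prediction is a dict carrying the detected span, its label, and a score.
for p in predictions:
    print(p["word"], p["entity_group"], round(float(p["score"]), 3))
```

With `aggregation_strategy="simple"`, the pipeline returns one entry per contiguous labelled span rather than one per sub-word token, which is usually what you want when inspecting detected abbreviations and their long forms.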