Simone Tedeschi committed 4cc3026 (parent: 132b616): added "how to use"

README.md CHANGED
@@ -39,6 +39,25 @@ task_ids:
- **Official Repository:** [https://github.com/Babelscape/wikineural](https://github.com/Babelscape/wikineural)
- **Paper:** [https://aclanthology.org/2021.findings-emnlp.215/](https://aclanthology.org/2021.findings-emnlp.215/)

#### How to use

You can use this model with the Transformers *pipeline* for NER.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline

# Load the tokenizer and the multilingual NER model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("Babelscape/wikineural-multilingual-ner")
model = AutoModelForTokenClassification.from_pretrained("Babelscape/wikineural-multilingual-ner")

# Build a token-classification (NER) pipeline and run it on an example sentence
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "My name is Wolfgang and I live in Berlin"

ner_results = nlp(example)
print(ner_results)
```
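By default, the pipeline above returns one prediction per sub-word token. If you prefer predictions grouped into whole entities, recent versions of `transformers` accept an aggregation strategy on the token-classification pipeline; the variant below is an optional sketch (not part of the original card) that reuses the same model name.

```python
# Optional variant: group sub-word predictions into whole entities.
# aggregation_strategy="simple" is available in recent transformers releases;
# older versions used the (now deprecated) grouped_entities=True flag instead.
from transformers import pipeline

nlp_grouped = pipeline(
    "ner",
    model="Babelscape/wikineural-multilingual-ner",
    aggregation_strategy="simple",
)
print(nlp_grouped("My name is Wolfgang and I live in Berlin"))
```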
#### Limitations and bias

This model is trained on WikiNEuRal, a state-of-the-art dataset for multilingual NER automatically derived from Wikipedia. It may therefore not generalize well to all textual genres (e.g., news). Conversely, models trained only on news articles (e.g., only on CoNLL03) have been shown to obtain much lower scores on encyclopedic articles. To obtain a more robust system, we encourage training on the combination of WikiNEuRal and CoNLL.
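As a rough illustration of that suggestion, the sketch below builds a combined training set from the two corpora. It is not from the original card and assumes both corpora are available as two-column token-per-line files ("token<TAB>tag", blank line between sentences); the file paths are hypothetical placeholders.

```python
# Hypothetical sketch: merge WikiNEuRal and CoNLL-style training files into one
# corpus before fine-tuning. Assumes two-column "token<TAB>tag" files with a
# blank line between sentences; the paths below are placeholders.
def read_two_column_conll(path):
    """Read a token-per-line file into a list of (tokens, tags) sentences."""
    sentences, tokens, tags = [], [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line.strip():          # blank line marks a sentence boundary
                if tokens:
                    sentences.append((tokens, tags))
                    tokens, tags = [], []
                continue
            token, tag = line.split("\t")[:2]
            tokens.append(token)
            tags.append(tag)
    if tokens:                            # flush the final sentence
        sentences.append((tokens, tags))
    return sentences

# Concatenate the two corpora into a single training set.
combined = (
    read_two_column_conll("data/wikineural_en_train.tsv")
    + read_two_column_conll("data/conll2003_train.tsv")
)
print(f"{len(combined)} combined training sentences")
```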

## Licensing Information

Contents of this repository are restricted to only non-commercial research purposes under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/). Copyright of the dataset contents and models belongs to the original copyright holders.