
roberta-base-NER

Model description

This is a Named Entity Recognition (NER) model based on a fine-tuned XLM-RoBERTa base model. It has been trained to recognize three types of entities: location (LOC), organisation (ORG), and person (PER). Specifically, it is an xlm-roberta-base model fine-tuned on an aggregation of NER data covering 10 high-resource languages.

Intended uses & limitations

How to use

You can use this model with the Transformers NER pipeline:

from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("Tirendaz/multilingual-xlm-roberta-for-ner")
model = AutoModelForTokenClassification.from_pretrained("Tirendaz/multilingual-xlm-roberta-for-ner")

# Build a token-classification (NER) pipeline
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "My name is Wolfgang and I live in Berlin"

ner_results = nlp(example)
print(ner_results)
Abbreviation  Description
O             Outside of a named entity
B-PER         Beginning of a person's name right after another person's name
I-PER         Person's name
B-ORG         Beginning of an organisation right after another organisation
I-ORG         Organisation
B-LOC         Beginning of a location right after another location
I-LOC         Location
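The B-/I- prefixes follow the IOB2 tagging scheme. As an illustrative sketch (the token/label pairs below are hypothetical, not actual model output), consecutive tags can be grouped into entity spans like this:

```python
# Group IOB2-tagged tokens into entity spans.
# The tokens and labels below are a hypothetical example for illustration.
def group_entities(tokens, labels):
    entities = []
    current = None  # (entity_type, [tokens]) for the span being built
    for token, label in zip(tokens, labels):
        if label.startswith("B-"):
            if current:
                entities.append(current)
            current = (label[2:], [token])
        elif label.startswith("I-") and current and current[0] == label[2:]:
            current[1].append(token)
        else:  # "O", or an I- tag that does not continue the current entity
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(etype, " ".join(toks)) for etype, toks in entities]

tokens = ["My", "name", "is", "Wolfgang", "and", "I", "live", "in", "Berlin"]
labels = ["O", "O", "O", "B-PER", "O", "O", "O", "O", "B-LOC"]
print(group_entities(tokens, labels))  # [('PER', 'Wolfgang'), ('LOC', 'Berlin')]
```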

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 24
  • eval_batch_size: 24
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
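These hyperparameters correspond to a standard Hugging Face Trainer setup. A configuration sketch mirroring the values above might look as follows (the output directory is a placeholder, not taken from this card):

```python
from transformers import TrainingArguments

# Configuration sketch mirroring the hyperparameters listed above.
# "xlm-roberta-ner" is a hypothetical output directory.
training_args = TrainingArguments(
    output_dir="xlm-roberta-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    # Adam betas/epsilon match the listed values (also the Transformers defaults)
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```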

Training results

Training Loss Epoch Step Validation Loss Precision Recall F1 Accuracy
No log 1.0 417 0.3359 0.7286 0.7675 0.7476 0.8991
0.4227 2.0 834 0.2951 0.7711 0.7980 0.7843 0.9131
0.2818 3.0 1251 0.2824 0.7852 0.8076 0.7962 0.9174
0.2186 4.0 1668 0.2853 0.7934 0.8150 0.8041 0.9193
0.1801 5.0 2085 0.2935 0.8004 0.8111 0.8057 0.9194
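As a quick consistency check, the reported F1 scores follow from the precision and recall columns via F1 = 2PR/(P+R); for example, for the final epoch:

```python
# Verify F1 = 2PR/(P+R) for the epoch-5 row of the table above.
precision, recall = 0.8004, 0.8111
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.8057
```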

Framework versions

  • Transformers 4.33.0
  • Pytorch 2.0.0
  • Datasets 2.1.0
  • Tokenizers 0.13.3