DistilBERT base uncased, fine-tuned for NER using the CoNLL-2003 English dataset. Note that this model is not sensitive to capitalization: "english" is treated the same as "English". For the case-sensitive version, please use elastic/distilbert-base-cased-finetuned-conll03-english.
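
For reference, a minimal inference sketch using the token-classification pipeline from transformers; the aggregation_strategy argument assumes a recent transformers release (the 4.3.1 version pinned below used grouped_entities=True instead):

from transformers import pipeline

# "ner" is an alias for the token-classification pipeline task.
ner = pipeline(
    "ner",
    model="elastic/distilbert-base-uncased-finetuned-conll03-english",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

# Capitalization does not matter for this uncased model.
print(ner("george washington lived in virginia"))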

Versions

  • Transformers version: 4.3.1
  • Datasets version: 1.3.0

Training

The model was trained with the run_ner.py token-classification example script from Transformers:

$ python run_ner.py \
  --model_name_or_path distilbert-base-uncased \
  --label_all_tokens True \
  --return_entity_level_metrics True \
  --dataset_name conll2003 \
  --output_dir /tmp/distilbert-base-uncased-finetuned-conll03-english \
  --do_train \
  --do_eval

After training, we update the labels to match the NER-specific labels from the conll2003 dataset.
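
The exact relabeling step is not shown here; a minimal sketch of one way to do it, assuming the tag order of the conll2003 dataset's ner_tags feature and the output directory used above:

from transformers import AutoConfig

# CoNLL-2003 NER tags, in the order used by the conll2003 dataset's ner_tags feature.
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

# Load the fine-tuned checkpoint's config, replace the generic LABEL_0..LABEL_8
# entries with the NER-specific names, and save the config back in place.
model_dir = "/tmp/distilbert-base-uncased-finetuned-conll03-english"
config = AutoConfig.from_pretrained(model_dir)
config.id2label = {i: label for i, label in enumerate(labels)}
config.label2id = {label: i for i, label in enumerate(labels)}
config.save_pretrained(model_dir)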
