FacebookAI/xlm-roberta-large-finetuned-conll03-english
Tags: Token Classification · Transformers · PyTorch · Rust · ONNX · Safetensors · 94 languages · xlm-roberta · Inference Endpoints
Papers: arxiv:1911.02116 · arxiv:2008.03415 · arxiv:1910.09700
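Given the Token Classification and Transformers tags above, a minimal usage sketch follows. It assumes a recent `transformers` release is installed; the `aggregation_strategy` argument merges sub-word pieces into whole entity spans, and the example sentence and printed output shape are illustrative only.

```python
# Minimal NER sketch for this checkpoint (assumes a recent `transformers` install).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="FacebookAI/xlm-roberta-large-finetuned-conll03-english",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

# Returns a list of dicts with keys such as entity_group, score, word, start, end.
print(ner("Hugging Face is based in New York City."))
```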
Files and versions
xlm-roberta-large-finetuned-conll03-english · 6 contributors · History: 12 commits
Latest commit 18f95e9 (verified) · 10 months ago · lysandre (HF staff): Adds the tokenizer configuration file (#11)
onnx/ · Adding ONNX file of this model (#8) · about 1 year ago
.gitattributes · Safe · 577 Bytes · Adding `safetensors` variant of this model (#10) · about 1 year ago
README.md · Safe · 7.68 kB · Preliminary model card (#3) · over 2 years ago
config.json · Safe · 852 Bytes · Update config.json · over 4 years ago
model.safetensors · Safe · 2.24 GB · LFS · Adding `safetensors` variant of this model (#10) · about 1 year ago
pytorch_model.bin · Safe, pickle (detected pickle imports: collections.OrderedDict, torch.FloatStorage, torch._utils._rebuild_tensor_v2) · 2.24 GB · LFS · Update pytorch_model.bin · almost 5 years ago
rust_model.ot · Safe · 2.24 GB · LFS · Update rust_model.ot · over 4 years ago
sentencepiece.bpe.model · Safe · 5.07 MB · Update sentencepiece.bpe.model · almost 5 years ago
tokenizer.json · Safe · 9.1 MB · Update tokenizer.json · about 4 years ago
tokenizer_config.json · Safe · 25 Bytes · Adds the tokenizer configuration file (#11) · 10 months ago
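The listing carries the same weights in several formats (model.safetensors, pytorch_model.bin, rust_model.ot, an onnx/ folder) alongside the SentencePiece and tokenizer files. A hedged sketch of how these files are typically consumed from Python, assuming current `transformers` and `huggingface_hub` releases; the `use_safetensors` flag is an assumption about the installed `transformers` version:

```python
# Sketch: loading this checkpoint from the files listed above.
# Assumes `transformers` and `huggingface_hub` are installed.
from huggingface_hub import hf_hub_download
from transformers import AutoModelForTokenClassification, AutoTokenizer

repo_id = "FacebookAI/xlm-roberta-large-finetuned-conll03-english"

# from_pretrained resolves config.json, tokenizer.json / sentencepiece.bpe.model
# and the weights; use_safetensors=True requests model.safetensors rather than
# the pickle-based pytorch_model.bin.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForTokenClassification.from_pretrained(repo_id, use_safetensors=True)

# Individual files from the listing can also be fetched directly, e.g. the config:
config_path = hf_hub_download(repo_id=repo_id, filename="config.json")
print(config_path)
```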