bowphs/testid
Tags: Token Classification · Transformers · Safetensors · roberta · Inference Endpoints · arXiv:1910.09700
Files and versions
Branch: main · 1 contributor · History: 3 commits
Latest commit: Upload tokenizer by bowphs (11437fc, verified, 16 days ago)
File                     Size       Last commit message                    When
.gitattributes           1.52 kB    initial commit                         16 days ago
README.md                5.17 kB    Upload RobertaForTokenClassification   16 days ago
config.json              1.2 kB     Upload RobertaForTokenClassification   16 days ago
merges.txt               1.34 MB    Upload tokenizer                       16 days ago
model.safetensors (LFS)  502 MB     Upload RobertaForTokenClassification   16 days ago
special_tokens_map.json  957 Bytes  Upload tokenizer                       16 days ago
tokenizer.json           5.42 MB    Upload tokenizer                       16 days ago
tokenizer_config.json    1.37 kB    Upload tokenizer                       16 days ago
vocab.json               1.69 MB    Upload tokenizer                       16 days ago
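
Since the repository holds a RobertaForTokenClassification checkpoint together with its tokenizer files (vocab.json, merges.txt, tokenizer.json), it can be loaded with the Transformers library. The snippet below is a minimal sketch, assuming the checkpoint is publicly accessible and that the label names stored in config.json are meaningful; the input sentence is purely illustrative.

```python
# Minimal sketch: load bowphs/testid for token classification with Transformers.
# Assumes the repo is public and the labels in config.json are meaningful.
from transformers import pipeline

token_classifier = pipeline(
    "token-classification",
    model="bowphs/testid",          # repo id shown on this page
    aggregation_strategy="simple",  # merge sub-token predictions into word-level spans
)

# Illustrative input only; the actual label set depends on how the model was trained.
print(token_classifier("Hugging Face was founded in New York City."))
```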