santacoder / tokenizer_config.json
ncoop57 · Add max length to tokenizer · aaeed52 · 363 Bytes
{
"name_or_path": "bigcode/digit-bytelevel-bpe-jss-v1.1-49152",
"special_tokens_map_file": "/Users/leandro/.cache/huggingface/hub/models--bigcode--digit-bytelevel-bpe-jss-v1.1-49152/snapshots/fa09b77949689a484afafc5f89534e6b6ba2c151/special_tokens_map.json",
"tokenizer_class": "PreTrainedTokenizerFast",
"vocab_size": 49152,
"model_max_length": 2048
}
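A minimal sketch (not part of the file) of how this config is consumed, assuming the tokenizer is loaded from the bigcode/santacoder repository: transformers reads tokenizer_config.json at load time, so the loaded tokenizer reflects the fields above.

from transformers import AutoTokenizer

# Hub id assumed for illustration; this tokenizer_config.json ships with the repository.
tokenizer = AutoTokenizer.from_pretrained("bigcode/santacoder")

# Values resolved from tokenizer_config.json:
print(type(tokenizer).__name__)    # PreTrainedTokenizerFast (tokenizer_class)
print(tokenizer.model_max_length)  # 2048 (model_max_length)
print(tokenizer.vocab_size)        # 49152 (vocab_size)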