camembert-large / special_tokens_map.json

Commit History

Added Fast tokenizer files with model_max_length
bc571e2
verified

wissamantoun committed on
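
For reference, a minimal sketch of how this change surfaces when loading the tokenizer, assuming the repo id is camembert/camembert-large and a recent transformers install (the repo id and printed attributes are illustrative, not taken from the commit itself):

    from transformers import AutoTokenizer

    # Load the fast (Rust-backed) tokenizer; use_fast=True is the
    # default in recent transformers versions.
    tokenizer = AutoTokenizer.from_pretrained(
        "camembert/camembert-large", use_fast=True
    )

    # With tokenizer files that carry model_max_length, the limit is
    # read from the repo instead of falling back to a large sentinel.
    print(tokenizer.model_max_length)

    # The special token mapping mirrors special_tokens_map.json.
    print(tokenizer.special_tokens_map)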