camembert-large / added_tokens.json

Commit History

Added Fast tokenizer files with model_max_length
Commit: bc571e2 (verified)
Committed by wissamantoun
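
The commit adds the fast tokenizer files, which declare model_max_length. A minimal sketch of loading the tokenizer and inspecting that value, assuming the model is published on the Hub under the camembert/camembert-large repo id:

    from transformers import AutoTokenizer

    # Load the fast (Rust-backed) tokenizer; the repo id here is an assumption.
    tokenizer = AutoTokenizer.from_pretrained("camembert/camembert-large", use_fast=True)

    # True when the fast tokenizer files added in this commit are picked up.
    print(tokenizer.is_fast)

    # Maximum sequence length declared by the tokenizer files, usable for truncation.
    print(tokenizer.model_max_length)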