german-gpt2 / tokenizer_config.json
vocab: add vocab (incl. merges) for German GPT-2 model. Vocab was created with the Tokenizers library and includes the tokenizer configuration
a5f1180
{"special_tokens_map_file": null, "full_tokenizer_file": null}
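As a quick sanity check, the one-line configuration above can be parsed with Python's standard `json` module. Both fields deserialize to `None`, meaning the config references neither a separate special-tokens map nor a standalone serialized tokenizer file; this is only a sketch of reading the fragment, not part of the repository itself:

```python
import json

# Contents of tokenizer_config.json for german-gpt2, copied from above.
config_text = '{"special_tokens_map_file": null, "full_tokenizer_file": null}'
config = json.loads(config_text)

# JSON null maps to Python None: no external special-tokens map or
# standalone tokenizer file is referenced by this config.
print(config["special_tokens_map_file"])  # → None
print(config["full_tokenizer_file"])      # → None
```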