gpt2-multiqg / special_tokens_map.json
Commit be92179 — marksverdhei: add tokenizer
{
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}