The model doesn't report the ChatML EOS token
#1 · by Kenshiro-28 · opened
When running the model it reports the '</s>' EOS token instead of '<|im_end|>':

llm_load_print_meta: EOS token = 2 '</s>'
Yes, but that is because Microsoft defined no ChatML special tokens in their tokenizer_config.json file, so loaders fall back to the default SentencePiece EOS token '</s>'.
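A minimal sketch of what that looks like and how one could patch it locally, assuming a tokenizer_config.json that only carries the SentencePiece defaults (the field names below follow the Hugging Face tokenizer_config conventions; the exact contents of the real file may differ):

```python
import json

# As shipped (assumed): only the SentencePiece defaults are declared,
# which is why llama.cpp prints EOS token = 2 '</s>'.
config = {
    "bos_token": "<s>",
    "eos_token": "</s>",
    "unk_token": "<unk>",
}

# Declaring the ChatML markers as additional special tokens and pointing
# eos_token at '<|im_end|>' lets loaders pick up the intended ChatML EOS:
config["additional_special_tokens"] = ["<|im_start|>", "<|im_end|>"]
config["eos_token"] = "<|im_end|>"

print(json.dumps(config, indent=2))
```

This only changes what the tokenizer config declares; for an already-converted GGUF file the metadata would need to be regenerated or edited for llama.cpp to pick the change up.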
Ok, thx! :)
Kenshiro-28 changed discussion status to closed