ChatML format?
#1 opened by Shqmil
Is the expected chat format ChatML?
The original uploader didn't specify. I tested with Alpaca, but I think it's a base model that needs to be fine-tuned before it does well with any chat format.
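For reference, the Alpaca layout mentioned above usually follows the common stanford_alpaca convention; here is a minimal sketch (the exact field wording is an assumption, not something taken from this repo's config):

```python
# Sketch of the common Alpaca-style prompt layout (assumed stanford_alpaca wording,
# not read from this repo's files).
def alpaca_prompt(instruction: str, user_input: str = "") -> str:
    if user_input:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{user_input}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(alpaca_prompt("Summarize what GQA changes in attention."))
```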
The original tokenizer_config.json suggests it doesn't support system prompts; it wraps user messages in Llama-2/Mistral instruction tokens ([INST], [/INST]):
https://huggingface.co/BEE-spoke-data/Mixtral-GQA-400m-v2/blob/6f8c51d1bf60da6f8e64ba7fb75fb747d9b124cf/tokenizer_config.json#L32
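If you want to see what that chat_template actually renders, something like this should print the formatted prompt (a minimal sketch, assuming the transformers library and the pinned revision linked above):

```python
from transformers import AutoTokenizer

# Load the tokenizer at the commit linked above so the chat_template matches.
tok = AutoTokenizer.from_pretrained(
    "BEE-spoke-data/Mixtral-GQA-400m-v2",
    revision="6f8c51d1bf60da6f8e64ba7fb75fb747d9b124cf",
)

# Render a single user turn with the repo's own chat_template.
# If it really is the Llama-2/Mistral style, the output should wrap the user
# message in [INST] ... [/INST], and a "system" role would not be handled.
messages = [{"role": "user", "content": "Hello, what format do you expect?"}]
print(tok.apply_chat_template(messages, tokenize=False))
```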