Update chat template

#53
by Rocketknight1 - opened

This new chat template adds proper support for tool calling, and also fixes the missing support for add_generation_prompt.

We also ran into issues yesterday with apply_chat_template not accepting the add_generation_prompt flag. Glad that this is getting patched!
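To illustrate what the add_generation_prompt flag does, here is a minimal sketch of a chat template rendered with Jinja2 (the engine apply_chat_template uses). The template below is a simplified, illustrative Llama-3-style template, not the actual one from this PR: when the flag is set, the rendered prompt ends with an open assistant header so the model continues as the assistant.

```python
from jinja2 import Template

# Simplified, illustrative chat template (Llama-3-style header tokens).
# Real templates ship with the tokenizer; this one only demonstrates the flag.
template_src = (
    "{% for message in messages %}"
    "<|start_header_id|>{{ message['role'] }}<|end_header_id|>\n\n"
    "{{ message['content'] }}<|eot_id|>"
    "{% endfor %}"
    "{% if add_generation_prompt %}"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
    "{% endif %}"
)

template = Template(template_src)
messages = [{"role": "user", "content": "Hello!"}]

with_prompt = template.render(messages=messages, add_generation_prompt=True)
without_prompt = template.render(messages=messages, add_generation_prompt=False)

# With the flag, the prompt ends with an open assistant turn;
# without it, the prompt simply ends after the last message.
print(with_prompt.endswith("<|start_header_id|>assistant<|end_header_id|>\n\n"))  # True
print(without_prompt.endswith("<|eot_id|>"))  # True
```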

@Rocketknight1 thank you for your great work! I read https://llama.meta.com/docs/model-cards-and-prompt-formats/llama3_1#user-defined-custom-tool-calling, and it seems that custom tool calls should be emitted in the format:
<function= {function_name} >{parameters}<|eom_id|>
However, I don't see "<function" or "</function" anywhere in your chat template — can you clarify this?

Please bump the version if you're making changes, so that this is 3.1.2 or whatever.

Hi @khaimai, there are actually two tool-calling formats in that doc. The chat template uses the "JSON based tool calling" format. We may consider adding a flag to the template to let it use either, but for now, if you want the <function=> syntax, you'll need to write the prompt manually.
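For reference, with the JSON-based format the model emits a JSON object naming the function and its arguments, rather than the <function=> tag syntax, so the tool call can be recovered with a plain JSON parse. A minimal sketch (the function name and parameters below are illustrative, not from the actual template):

```python
import json

# Hypothetical model output in the JSON-based tool-calling format:
# a JSON object with the function name and its parameters.
raw_model_output = '{"name": "get_current_weather", "parameters": {"city": "Paris"}}'

tool_call = json.loads(raw_model_output)
function_name = tool_call["name"]
arguments = tool_call["parameters"]

print(function_name)      # get_current_weather
print(arguments["city"])  # Paris
```

This is why the template contains no "<function" markers: the JSON object itself carries the call.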

Also @JoeSmith245, unfortunately model names don't work like that! Bugfixes and updates to the tokenizer or chat template that don't change the model weights are usually treated as the same model, especially in cases like this, where the change just adds template functionality without changing the model's behaviour.

pcuenq changed pull request status to merged
