
Failed to regenerate message

#1 by PeterCastler - opened

I downloaded this model via LM Studio, and after loading it up, every attempt I make at prompting yields no results.
What am I missing? What can I do to get this to work? :)

Every prompt returns the same error message:
"Failed to regenerate message
Cannot call something that is not a function: got UndefinedValue"

"Assistant
This message contains no content.
The AI has nothing to say."

Thank you in advance!

Go to My Models > Gear icon next to this model > Prompt

Change the Prompt Template from

{%- set ns = namespace(found=false) -%}{%- for message in messages -%}{%- if message['role'] == 'system' -%}{%- set ns.found = true -%}{%- endif -%}{%- endfor -%}{%- for message in messages %}{%- if message['role'] == 'system' -%}{{- '<|im_start|>system\n' + message['content'].rstrip() + '<|im_end|>\n' -}}{%- else -%}{%- if message['role'] == 'user' -%}{{-'<|im_start|>user\n' + message['content'].rstrip() + '<|im_end|>\n'-}}{%- else -%}{{-'<|im_start|>assistant\n' + message['content'] + '<|im_end|>\n' -}}{%- endif -%}{%- endif -%}{%- endfor -%}{%- if add_generation_prompt -%}{{-'<|im_start|>assistant\n'-}}{%- endif -%}

to

{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}

Then go back to the chat and it should work. LM Studio seemingly can't handle the prompt template that's embedded in this model by default; the likely culprit is the namespace() call at the start of that template, which LM Studio's template engine doesn't recognize, hence the "Cannot call something that is not a function" error.
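If you want to double-check the replacement template outside LM Studio, here is a minimal sketch (not part of the original answer) that renders it with the Python jinja2 package, which implements the templating language these chat templates are written in. The example messages are made up purely for illustration; it assumes you have jinja2 installed.

```python
# Minimal sketch: render the replacement chat template with Python's jinja2
# package to confirm it produces plain ChatML output. Example messages are
# illustrative only.
from jinja2 import Environment

# The replacement template, copied verbatim from the post above.
CHAT_TEMPLATE = (
    r"{% if not add_generation_prompt is defined %}"
    r"{% set add_generation_prompt = false %}{% endif %}"
    r"{% for message in messages %}"
    r"{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}"
    r"{% endfor %}"
    r"{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"
)

template = Environment().from_string(CHAT_TEMPLATE)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

print(template.render(messages=messages, add_generation_prompt=True))
# Expected output:
# <|im_start|>system
# You are a helpful assistant.<|im_end|>
# <|im_start|>user
# Hello!<|im_end|>
# <|im_start|>assistant
```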
