
Failed to parse Jinja template: Parser Error: Expected closing statement token. OpenSquareBracket !== CloseStatement.

#1
by ljupco - opened

Trying to run it on MacOS LMStudio. I put the chat template from

https://huggingface.co/Menlo/Jan-nano?chat_template=default&format=true

into LMStudio's "Model default parameters" - "Prompt" - "Template (Jinja)" box, but I'm getting

Failed to parse Jinja template: Parser Error: Expected closing statement token. OpenSquareBracket !== CloseStatement.

Help! :-)

Menlo Research org

Hi, you can use the Qwen3 template from another LM Studio-compatible model, but remember to disable "thinking" and add this system prompt when using it:

In this environment you have access to a set of tools you can use to answer the user's question. You can use one tool per message, and will receive the result of that tool use in the user's response. You use tools step-by-step to accomplish a given task, with each tool use informed by the result of the previous tool use.

Tool Use Rules
Here are the rules you should always follow to solve your task:
1. Always use the right arguments for the tools. Never use variable names as the action arguments, use the value instead.
2. Call a tool only when needed: do not call the search agent if you do not need information, try to solve the task yourself.
3. If no tool call is needed, just answer the question directly.
4. Never re-do a tool call that you previously did with the exact same parameters.
5. For tool use, MAKE SURE to use the XML tag format as shown in the examples above. Do not use any other format.
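To make rule 5 concrete, here is a minimal sketch of what an XML-tagged tool call and its parsing might look like. The `<tool_call>` tag, the `web_search` tool name, and the helper functions are illustrative assumptions; the real tag names come from the examples in the full system prompt, which are not reproduced in this thread.

```python
import json

# Hypothetical sketch of the XML tag format rule 5 refers to: the model
# emits a tool call as a <tool_call> block containing JSON, and the host
# app parses it back out. Tag and tool names are illustrative only.
def format_tool_call(name, arguments):
    """Render a tool call as an XML-tagged JSON payload."""
    payload = json.dumps({"name": name, "arguments": arguments})
    return f"<tool_call>\n{payload}\n</tool_call>"

def parse_tool_call(text):
    """Extract the JSON payload from a <tool_call> block, if present."""
    start = text.find("<tool_call>")
    end = text.find("</tool_call>")
    if start == -1 or end == -1:
        return None  # rule 3: no tool call means answer directly
    payload = text[start + len("<tool_call>"):end].strip()
    return json.loads(payload)

call = format_tool_call("web_search", {"query": "Jan-nano chat template"})
parsed = parse_tool_call(call)
print(parsed["name"])  # web_search
```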

In the meantime, we will see if we can fix the GGUF.

Enjoy

Thanks! I can confirm the model is working great in Jan (the app). Thanks for your help.

Great! May I know which version of Jan are you using?

Certainly! I'm using

Jan Version v0.5.17

on an oldish MBP M2 with 96GB of RAM.

Thanks for your help and all your work on this. Really enjoying messing about with the local models. :-)

Forgive the noob question, but how does one disable "thinking" for a model?

Menlo Research org

Only Qwen3-family models have the ability to disable thinking, by passing enable_thinking=False to the tokenizer's apply_chat_template.

Hi, I just found this model and it looks really promising. Only one thing: could you please paste here the Qwen3 chat template with thinking disabled?
I'm kind of a noob, wasn't able to figure out how to do it myself. Thanks!

Menlo Research org


Try this

https://huggingface.co/Menlo/Jan-nano-128k-gguf/discussions/1#6862fe2375cb85f79b28d69c
