Llama.cpp server chat template

#4 opened by softwareweaver

What is the correct chat template for this model for Llama.cpp server? Do I need to use a jinja template?

  • Thanks, Ash
Unsloth AI org


It's in our model card and blog post! :) https://unsloth.ai/blog/deepseek-r1

Thanks for replying and creating the GGUF file. I did check the blog post earlier :-)

I could not figure out how to convert the prompt format below into something that the llama.cpp server app can use. I am using the Open WebUI client with the llama.cpp server.
--prompt '<|User|>What is 1+1?<|Assistant|>' \
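
For context, Open WebUI talks to the server's OpenAI-compatible chat endpoint rather than sending a raw prompt string, so my understanding is that the server itself has to apply the chat template to the messages. A rough sketch of such a request (assuming the default port 8080 and the /v1/chat/completions route):

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "What is 1+1?"}]}'

If the template is picked up correctly, I assume this ends up formatted as the <|User|>What is 1+1?<|Assistant|> prompt shown above.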

For Mistral-based models, I pass --chat-template llama2 as a parameter to it.
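
My current guess for launching the server is something like the lines below. This is only a sketch: the --jinja flag and the deepseek3 built-in template name are assumptions on my part from the llama.cpp docs, and the model filename is a placeholder, so please correct me if either is wrong.

# rely on the Jinja chat template embedded in the GGUF metadata (assumed to be present)
./llama-server -m DeepSeek-R1.gguf --port 8080 --jinja

# or name a built-in template explicitly (assuming llama.cpp ships a deepseek3 preset)
./llama-server -m DeepSeek-R1.gguf --port 8080 --chat-template deepseek3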
