Prompt format
Hi! I would like to confirm that the prompt format is:
<|begin▁of▁sentence|>{system_prompt}<|User|>{prompt}<|Assistant|><|end▁of▁sentence|><|Assistant|>
and not:
<|begin▁of▁sentence|>{system_prompt}<|User|>{prompt}<|Assistant|><|end▁of▁sentence|><|User|>
Thank you! :)
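One way to check this yourself is to render the saved template with transformers; a minimal sketch, assuming the tokenizer is available locally (the repo id below is just a placeholder for whichever model you are using):

```python
from transformers import AutoTokenizer

# Placeholder repo id -- substitute the model you are actually using.
tok = AutoTokenizer.from_pretrained("deepseek-ai/DeepSeek-R1-Distill-Qwen-7B")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# add_generation_prompt=True appends the trailing assistant marker,
# so the printout shows exactly what the model is fed before generating.
print(tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```

Whatever that prints is the ground truth for the special tokens and their order.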
Have you managed to make them think? Mine can't; it just gives me the answer like a normal model unless I explicitly ask for step-by-step, as usual. I want to make it think inside <...> tags or something.
I'm waiting until llama-cpp-python works on this. In general, to push reasoning, I use a system prompt like this:
You are a friendly AI assistant. Reason through the query, then reflect on your reasoning, and finally provide your response.
You can replace the "You are a friendly AI assistant." part with whatever personality traits you want :)
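Once llama-cpp-python handles the model, passing that system prompt is straightforward; here is a minimal sketch (the GGUF file name is a placeholder):

```python
from llama_cpp import Llama

# Placeholder file name -- point this at whichever quant you downloaded.
llm = Llama(model_path="DeepSeek-R1-Distill-Qwen-7B-Q4_K_M.gguf", n_ctx=4096)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": (
            "You are a friendly AI assistant. Reason through the query, "
            "then reflect on your reasoning, and finally provide your response."
        )},
        {"role": "user", "content": "How many prime numbers are there below 30?"},
    ],
    max_tokens=1024,
)
print(out["choices"][0]["message"]["content"])
```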
Yes, that's what I did too; I told it not to give me the answer until it finished its reasoning. Guess we really have to wait for updates. Thanks for answering :)
Ah no, looks like the prompt extraction went weird for my page.
What you see on my page is what I rendered, not what is saved in the template itself; use the template itself if you can.
I'll update the pages, though.
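For anyone who wants the saved template rather than the rendered page, one way to pull it straight from the repo (a sketch; the repo id is a placeholder):

```python
import json
from huggingface_hub import hf_hub_download

# Placeholder repo id -- point this at the actual model/quant repo.
path = hf_hub_download("deepseek-ai/DeepSeek-R1-Distill-Qwen-7B", "tokenizer_config.json")
with open(path, encoding="utf-8") as f:
    config = json.load(f)

# The Jinja chat template is stored here verbatim, special characters and all.
print(config["chat_template"])
```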
Ok, thank you! Also thank you for your quants :)
The model only does the thinking if you use the proper instruct format. If you wrote the format yourself, make sure you're using the "｜" (fullwidth vertical bar) and "▁" characters, which are different from the plain "|" and "_".
When done correctly, it should output something like this:
[think]
The user wants to [do something]
bunch of steps
[/think]
Actual answer
Expect some (bad) front-ends to "eat" the 'think' tags (and sometimes everything between them), because apparently people can't code properly anymore.
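If you are writing your own front-end, splitting the reasoning from the final answer only takes a small regex; a minimal sketch, assuming the model emits the tags in angle-bracket form (<think>...</think>, shown with square brackets above):

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Separate the <think>...</think> block from the final answer.

    Returns (reasoning, answer); reasoning is empty if no think block is found.
    """
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if not match:
        return "", text.strip()
    reasoning = match.group(1).strip()
    answer = text[match.end():].strip()
    return reasoning, answer

raw = "<think>\nThe user wants to know 2 + 2.\n</think>\n2 + 2 = 4."
thinking, answer = split_reasoning(raw)
print(thinking)  # The user wants to know 2 + 2.
print(answer)    # 2 + 2 = 4.
```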