llama.cpp...
Can someone explain to me why I have to use this: -p "[|system|]You are EXAONE model from LG AI Research, a helpful assistant.[|endofturn|]\n[|user|]Hello!\n[|assistant|]"
instead of this -p "You are EXAONE model from LG AI Research, a helpful assistant." ?
Otherwise the model answers all my reasoning questions incorrectly. I thought llama.cpp picked up the prompt format automatically...
Yeah, llama.cpp's handling of prompt formats is confusing, not gonna lie. For some reason llama-cli doesn't pull in the model's chat template automatically (it would be nice if it did), so you need to apply the template yourself through the chat-template API.
You can find an example of pulling in the proper chat template in simple-chat.cpp: https://github.com/ggerganov/llama.cpp/blob/b685daf3867c54e42a9db484d7b92619021d4510/examples/simple-chat/simple-chat.cpp#L164
Maybe it would be good to include this in ./llama-cli as well; that would certainly be nice.