Make sure to enable repeat-penalty for this model (the latest llama.cpp has it disabled by default). (a2a7102, verified, CISCai, committed on Apr 7)
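For reference, a minimal sketch of turning the repeat penalty on when running this model through llama-cpp-python; the GGUF filename and the penalty value are placeholders, and with the llama.cpp CLI the equivalent setting is the `--repeat-penalty` flag.

```python
# Minimal sketch, assuming llama-cpp-python is installed and the GGUF path is adjusted.
from llama_cpp import Llama

llm = Llama(model_path="model.Q4_K_M.gguf", n_ctx=4096)  # hypothetical filename

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    repeat_penalty=1.1,  # explicitly enable the penalty; 1.0 leaves it disabled
)
print(out["choices"][0]["message"]["content"])
```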
Corrected the tool_choice parameter in the llama-cpp-python example. (0e8e952, verified, CISCai, committed on Apr 5)
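For context, a hedged sketch of how tool_choice is typically passed to llama-cpp-python's create_chat_completion; the tool schema, function name, and model path below are illustrative placeholders, not the repository's actual corrected example.

```python
# Illustrative sketch only; the tool definition and filename are assumptions.
from llama_cpp import Llama

llm = Llama(model_path="model.Q4_K_M.gguf", n_ctx=4096)  # hypothetical filename

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
    tools=tools,
    # Directs the call to a specific tool using the OpenAI-style schema
    # that llama-cpp-python accepts for tool_choice.
    tool_choice={"type": "function", "function": {"name": "get_weather"}},
)
print(out["choices"][0]["message"])
```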
Minor chat template fix and requantized for IQ3_S improvements. (ac17bfd, verified, CISCai, committed on Mar 2)