Spaces: Running on A10G
mamba codestral support? #118
by Daemontatox - opened
Error: Error converting to fp16:
INFO:hf-to-gguf:Loading model: Mamba-Codestral-7B-v0.1
ERROR:hf-to-gguf:Model Mamba2ForCausalLM is not supported
The Mamba2ForCausalLM architecture is not yet supported by llama.cpp. Please follow PR: https://github.com/ggerganov/llama.cpp/pull/9126
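The failure above happens because the conversion script checks the model's declared architecture before doing any work. A minimal sketch of that kind of pre-flight check is below; the `SUPPORTED_ARCHS` set here is a hypothetical illustrative subset, not the authoritative list (which lives in llama.cpp's convert script upstream):

```python
import json

# Hypothetical subset of architectures a converter might handle.
# The real list is maintained in llama.cpp's convert_hf_to_gguf.py.
SUPPORTED_ARCHS = {"LlamaForCausalLM", "MistralForCausalLM", "MambaForCausalLM"}

def conversion_supported(config_json: str) -> bool:
    """Return True only if every architecture listed in the
    Hugging Face config.json is known to the converter."""
    config = json.loads(config_json)
    archs = config.get("architectures", [])
    return bool(archs) and all(a in SUPPORTED_ARCHS for a in archs)

# Mamba-Codestral-7B-v0.1 declares Mamba2ForCausalLM, which is
# absent from the set above, so conversion would be refused.
print(conversion_supported('{"architectures": ["Mamba2ForCausalLM"]}'))
```

Checking the `architectures` field of the model's `config.json` on the Hub before launching a conversion Space avoids burning GPU time on a model the converter will reject anyway.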
Oh ok thnx
Daemontatox changed discussion status to closed