Unable to load mmproj: `load_hparams: unknown projector type: kimivl`

#1
by ddh0 - opened

Thank you for uploading this!

When I try to load the model with the mmproj using this command:

llama-server -t 8 -tb 8 --host 0.0.0.0 --port 12800 -m ~/gguf/Kimi-VL-A3B-Thinking-2506-Q8_0-FFN-Q4_K-Q4_K-Q8_0.gguf --mmproj ~/gguf/mmproj-Kimi-VL-A3B-Thinking-2506-Q8_0.gguf -b 2048 -ub 2048 -c 25600 -fa -ngl 999

I get this error:

common_init_from_params: added [EOS] logit bias = -inf
common_init_from_params: added <|im_end|> logit bias = -inf
common_init_from_params: setting dry_penalty_last_n to ctx_size = 25600
common_init_from_params: warming up the model with an empty run - please wait ... (--no-warmup to disable)
clip_model_loader: model name:   
clip_model_loader: description:  
clip_model_loader: GGUF version: 3
clip_model_loader: alignment:    32
clip_model_loader: n_tensors:    443
clip_model_loader: n_kv:         25

clip_model_loader: has vision encoder
clip_ctx: CLIP using CUDA0 backend
clip_init: failed to load model '/home/dylan/gguf/mmproj-Kimi-VL-A3B-Thinking-2506-Q8_0.gguf': load_hparams: unknown projector type: kimivl

mtmd_init_from_file: error: Failed to load CLIP model from /home/dylan/gguf/mmproj-Kimi-VL-A3B-Thinking-2506-Q8_0.gguf

srv    load_model: failed to load multimodal model, '/home/dylan/gguf/mmproj-Kimi-VL-A3B-Thinking-2506-Q8_0.gguf'
srv    operator(): operator(): cleaning up before exit...
main: exiting due to model loading error

Ultimately the cause is `unknown projector type: kimivl`, which is strange because I'm on the very latest llama.cpp (build: 6220 (5682a374)).
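As a sanity check independent of llama.cpp, you can read the projector type straight out of the mmproj file's GGUF header (llama.cpp stores it under a key like `clip.projector_type`). Below is a minimal, self-contained sketch of the GGUF v3 key/value layout that only handles string-valued entries; real files also contain non-string KVs and tensor info, which this skips, and the `make_minimal_mmproj_header` helper exists only so the example is runnable without a real file:

```python
import struct

GGUF_TYPE_STRING = 8  # GGUF value-type id for strings

def _pack_str(s: str) -> bytes:
    # GGUF string: uint64 length followed by UTF-8 bytes
    b = s.encode("utf-8")
    return struct.pack("<Q", len(b)) + b

def make_minimal_mmproj_header(kvs: dict) -> bytes:
    """Build just a GGUF v3 header with string KVs (no tensors) for testing."""
    out = b"GGUF" + struct.pack("<IQQ", 3, 0, len(kvs))
    for k, v in kvs.items():
        out += _pack_str(k) + struct.pack("<I", GGUF_TYPE_STRING) + _pack_str(v)
    return out

def read_string_kvs(data: bytes) -> dict:
    """Parse string-valued KV pairs from the start of a GGUF blob."""
    assert data[:4] == b"GGUF", "not a GGUF file"
    version, n_tensors, n_kv = struct.unpack_from("<IQQ", data, 4)
    off = 4 + 4 + 8 + 8  # magic + version + tensor count + KV count
    kvs = {}
    for _ in range(n_kv):
        (klen,) = struct.unpack_from("<Q", data, off); off += 8
        key = data[off:off + klen].decode("utf-8"); off += klen
        (vtype,) = struct.unpack_from("<I", data, off); off += 4
        if vtype != GGUF_TYPE_STRING:
            break  # this sketch stops at the first non-string value
        (vlen,) = struct.unpack_from("<Q", data, off); off += 8
        kvs[key] = data[off:off + vlen].decode("utf-8"); off += vlen
    return kvs

blob = make_minimal_mmproj_header({"clip.projector_type": "kimivl"})
print(read_string_kvs(blob)["clip.projector_type"])  # prints: kimivl
```

Against the actual mmproj file you would pass the first few KB of its bytes instead of the synthetic blob; if the value printed is `kimivl`, the file is fine and the loader simply doesn't know that projector type yet.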

Any potential fix would be greatly appreciated. Thank you for your time @ngxson

Oh, I just saw that https://github.com/ggml-org/llama.cpp/pull/15458 is not merged yet.

Currently not usable for images on my end.

The model's vision often reports "staircases" and "pixelated images" when no such artifacts are present.

llama-server version: 6810 (84bf3c67), built Oct 2025
