Can't create GGUF quants from the safetensors

#2
by Doctor-Chad-PhD - opened

Hi Mistral,

I'm aware that there is an official BF16 GGUF, but when I try to create my own I get this error:

    raise ValueError(f"Can not map tensor {name!r}")
ValueError: Can not map tensor 'model.layers.0.mlp.down_proj.activation_scale'

Do you happen to know why that is?
I'm curious how you made the official BF16 quants.

Thank you for your time.

I got a little further by installing the development release of transformers from git, but now I'm facing this error:

https://github.com/ggml-org/llama.cpp/issues/17691

Mistral AI org

Hey, use the BF16 weights instead; it should work better: https://huggingface.co/mistralai/Ministral-3-14B-Instruct-2512-BF16
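For anyone landing here later, a minimal sketch of the workflow being suggested: download the BF16 repo and point llama.cpp's converter at it. The local directory and output filename are illustrative, and this assumes you have a llama.cpp checkout with its Python requirements installed.

```shell
# Sketch only: paths and filenames are placeholders, not from the thread.
# 1. Fetch the BF16 weights (the repo linked above).
huggingface-cli download mistralai/Ministral-3-14B-Instruct-2512-BF16 \
    --local-dir ./Ministral-3-14B-BF16

# 2. Convert with llama.cpp's HF-to-GGUF script, keeping BF16 precision.
python llama.cpp/convert_hf_to_gguf.py ./Ministral-3-14B-BF16 \
    --outtype bf16 \
    --outfile ministral-3-14b-bf16.gguf
```

The original FP8 repo carries extra tensors such as `model.layers.0.mlp.down_proj.activation_scale` that the converter's tensor map doesn't recognize, which is what triggers the `Can not map tensor` error; the BF16 repo avoids them.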

Thank you both!

Doctor-Chad-PhD changed discussion status to closed
