Quantized GLM-4 9B q4_0

Quantized with the convert.py script from chatglm.cpp.
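
For reference, a q4_0 file like this one is typically produced with a command of the following shape. The flags (-i source model, -t quantization type, -o output path) follow the chatglm.cpp README; the exact model ID and output filename shown here are assumptions, not the author's recorded command.

```sh
# Assumed reproduction sketch: run from a chatglm.cpp checkout with its Python
# requirements installed; verify the flags against your chatglm.cpp version.
python3 chatglm_cpp/convert.py -i THUDM/glm-4-9b-chat -t q4_0 -o chatglm4-ggml-int4.bin
```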

Download: chatglm4-ggml-int4.bin

Download via shell CLI:

https://huggingface.co/npc0/chatglm-4-9b-int4/resolve/main/chatglm4-ggml-int4.bin
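
For example, either of the following fetches the file (curl needs -L to follow Hugging Face's redirect):

```sh
wget https://huggingface.co/npc0/chatglm-4-9b-int4/resolve/main/chatglm4-ggml-int4.bin
# or
curl -L -O https://huggingface.co/npc0/chatglm-4-9b-int4/resolve/main/chatglm4-ggml-int4.bin
```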

License

Use of the GLM-4 model weights must comply with the LICENSE.
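
Once downloaded, the file is intended to be run with chatglm.cpp. A minimal sketch, assuming you have built chatglm.cpp with GLM-4 support so that its main binary sits at build/bin/main and accepts -m (model path) and -p (prompt) as in the chatglm.cpp README:

```sh
# Assumed invocation; adjust the binary path and flags to your chatglm.cpp build.
./build/bin/main -m chatglm4-ggml-int4.bin -p "你好"
```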
