
A GGUF-format model produced by IQ4_XS quantization. The original model is https://huggingface.co/Classical/Yinka.

- Format: GGUF
- Model size: 324M params
- Architecture: bert
- Quantization: 4-bit (IQ4_XS)

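A minimal usage sketch for computing embeddings with llama-cpp-python, since the model is a BERT-architecture GGUF file. The local file name passed to `model_path` is an assumption; substitute whichever `.gguf` file you downloaded from HiDolen/Classical-Yinka-IQ4_XS-gguf.

```python
# Minimal embedding sketch with llama-cpp-python (pip install llama-cpp-python).
# The file name below is an assumption; point model_path at the .gguf file
# you downloaded from HiDolen/Classical-Yinka-IQ4_XS-gguf.
from llama_cpp import Llama

llm = Llama(
    model_path="Classical-Yinka-IQ4_XS.gguf",  # assumed local file name
    embedding=True,  # load the model in embedding mode
)

# create_embedding() returns an OpenAI-style response; the vector itself
# is under data[0]["embedding"].
response = llm.create_embedding("这是一个测试句子。")
vector = response["data"][0]["embedding"]
print(len(vector))
```

The same `.gguf` file should also work with other llama.cpp-based runtimes that support embedding models.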
