qwp4w3hyb/codegeex4-all-9b-iMat-GGUF
Tags: GGUF · Inference Endpoints
1 contributor · 2 commits
Latest commit a9b2d22 (verified) by qwp4w3hyb: "Upload folder using huggingface_hub", 5 months ago
All files share the same last commit: "Upload folder using huggingface_hub", 5 months ago.

File                                  Size      LFS
.gitattributes                        2.2 kB
codegeex4-all-9b-bf16.gguf            18.8 GB   LFS
codegeex4-all-9b-imat-IQ1_S.gguf      3.1 GB    LFS
codegeex4-all-9b-imat-IQ2_XXS.gguf    3.43 GB   LFS
codegeex4-all-9b-imat-IQ3_XXS.gguf    4.26 GB   LFS
codegeex4-all-9b-imat-IQ4_XS.gguf     5.25 GB   LFS
codegeex4-all-9b-imat-Q4_K_L.gguf     7.88 GB   LFS
codegeex4-all-9b-imat-Q5_K_L.gguf     8.69 GB   LFS
codegeex4-all-9b-imat-Q6_K_L.gguf     9.73 GB   LFS
codegeex4-all-9b-imat-Q8_0_L.gguf     11.2 GB   LFS
codegeex4-all-9b.imatrix              4.16 MB   LFS
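
Any file in the listing above can be fetched programmatically with the huggingface_hub client, the same library named in the commit messages. A minimal download sketch, assuming you pick the IQ4_XS quant (the choice is only an example; any file in the table works the same way):

```python
# Minimal sketch: download one GGUF file from this repo with huggingface_hub
# (pip install huggingface_hub). Repo id and filename are taken from the
# listing above; the IQ4_XS quant is an arbitrary example choice.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="qwp4w3hyb/codegeex4-all-9b-iMat-GGUF",
    filename="codegeex4-all-9b-imat-IQ4_XS.gguf",
)
print(f"GGUF file downloaded to: {local_path}")
```

The returned path points into the local Hugging Face cache; the downloaded file can then be loaded by any GGUF-compatible runtime such as llama.cpp.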