afrideva/Mixtral-GQA-400m-v2-GGUF
Tags: Text Generation, GGUF, English, ggml, quantized, conversational
Quantizations: q2_k, q3_k_m, q4_k_m, q5_k_m, q6_k, q8_0
License: apache-2.0
Files and versions (main branch)
1 contributor · History: 9 commits
Latest commit: afrideva, "Upload README.md with huggingface_hub" (b460d08, 11 months ago)
| File | Size | LFS | Last commit | Uploaded |
|---|---|---|---|---|
| .gitattributes | 1.99 kB | | Upload mixtral-gqa-400m-v2.q8_0.gguf with huggingface_hub | 11 months ago |
| README.md | 2.34 kB | | Upload README.md with huggingface_hub | 11 months ago |
| mixtral-gqa-400m-v2.fp16.gguf | 4.01 GB | LFS | Upload mixtral-gqa-400m-v2.fp16.gguf with huggingface_hub | 11 months ago |
| mixtral-gqa-400m-v2.q2_k.gguf | 703 MB | LFS | Upload mixtral-gqa-400m-v2.q2_k.gguf with huggingface_hub | 11 months ago |
| mixtral-gqa-400m-v2.q3_k_m.gguf | 900 MB | LFS | Upload mixtral-gqa-400m-v2.q3_k_m.gguf with huggingface_hub | 11 months ago |
| mixtral-gqa-400m-v2.q4_k_m.gguf | 1.15 GB | LFS | Upload mixtral-gqa-400m-v2.q4_k_m.gguf with huggingface_hub | 11 months ago |
| mixtral-gqa-400m-v2.q5_k_m.gguf | 1.39 GB | LFS | Upload mixtral-gqa-400m-v2.q5_k_m.gguf with huggingface_hub | 11 months ago |
| mixtral-gqa-400m-v2.q6_k.gguf | 1.65 GB | LFS | Upload mixtral-gqa-400m-v2.q6_k.gguf with huggingface_hub | 11 months ago |
| mixtral-gqa-400m-v2.q8_0.gguf | 2.13 GB | LFS | Upload mixtral-gqa-400m-v2.q8_0.gguf with huggingface_hub | 11 months ago |
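
Each quantized variant can be pulled individually from the repo. A minimal sketch, assuming the huggingface_hub Python package (the same tool named in the upload commit messages above) and using the q4_k_m file from the listing as an example; any other listed .gguf filename can be substituted:

```python
from huggingface_hub import hf_hub_download

# Download one quantized GGUF variant from this repository.
# repo_id and filename are taken from the file listing above.
model_path = hf_hub_download(
    repo_id="afrideva/Mixtral-GQA-400m-v2-GGUF",
    filename="mixtral-gqa-400m-v2.q4_k_m.gguf",
)

print(model_path)  # local path to the cached .gguf file
```

The returned path points at a GGUF file, so it can be handed to any GGML/llama.cpp-compatible runtime; the listing itself does not prescribe a particular runtime or generation settings.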