mradermacher/Mixtral-8x22B-v0.1-GGUF
Transformers · 5 languages · Mixture of Experts · Inference Endpoints · License: apache-2.0
1 contributor · History: 14 commits
Latest commit 15d7c52 (verified) by mradermacher: uploaded from nethype/db3 · 3 months ago
File                                       Size     LFS  Last commit                Updated
.gitattributes                             2.94 kB       uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.IQ3_M.gguf.part1of2     33.3 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.IQ3_M.gguf.part2of2     31.2 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.IQ3_S.gguf.part1of2     31.1 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.IQ3_S.gguf.part2of2     30.4 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q2_K.gguf.part1of2      26.8 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q2_K.gguf.part2of2      25.3 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q3_K_M.gguf.part1of2    34.4 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q3_K_M.gguf.part2of2    33.4 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q3_K_S.gguf.part1of2    31.1 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q3_K_S.gguf.part2of2    30.4 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q4_K_S.gguf.part1of2    40.8 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q4_K_S.gguf.part2of2    39.7 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q6_K.gguf.part1of3      38.7 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q6_K.gguf.part2of3      38.7 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q6_K.gguf.part3of3      38.2 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q8_0.gguf.part1of4      37.6 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q8_0.gguf.part2of4      37.6 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q8_0.gguf.part3of4      37.6 GB  LFS  uploaded from nethype/db3  3 months ago
Mixtral-8x22B-v0.1.Q8_0.gguf.part4of4      36.7 GB  LFS  uploaded from nethype/db3  3 months ago
README.md                                  2.81 kB       auto-patch README.md       3 months ago
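
Each quant above is split into .partNofM pieces, presumably to stay under the Hub's per-file size limit, so all parts of a quant must be downloaded and reassembled before use. Below is a minimal sketch of doing that with huggingface_hub, assuming the parts are plain byte splits that can simply be concatenated in order (check the model card for the exact instructions); the repo id and file names are taken from the listing above.

```python
# Minimal sketch: download and reassemble one split quant from this repo.
# Assumption: the .partNofM files are plain byte splits of a single GGUF file,
# so concatenating them in order restores the original file.
from huggingface_hub import hf_hub_download
import shutil

REPO_ID = "mradermacher/Mixtral-8x22B-v0.1-GGUF"
PARTS = [
    "Mixtral-8x22B-v0.1.Q2_K.gguf.part1of2",
    "Mixtral-8x22B-v0.1.Q2_K.gguf.part2of2",
]

# Fetch each part into the local huggingface_hub cache.
part_paths = [hf_hub_download(repo_id=REPO_ID, filename=name) for name in PARTS]

# Concatenate the parts in order into a single GGUF file.
with open("Mixtral-8x22B-v0.1.Q2_K.gguf", "wb") as out:
    for path in part_paths:
        with open(path, "rb") as part:
            shutil.copyfileobj(part, out)
```

On the command line, the same reassembly can be done by concatenating the downloaded parts in order with cat; the resulting .gguf should then load like any single-file quant.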