# dranger003/c4ai-command-r-plus-iMat.GGUF

Text Generation · GGUF · Inference Endpoints
License: cc-by-nc-4.0

2 contributors · 108 commits
Latest commit: dranger003 — "Upload ggml-c4ai-command-r-plus-104b-f16-00004-of-00005.gguf with huggingface_hub" (335e6f4, verified, 7 months ago)
| File | Size | LFS |
|---|---|---|
| .gitattributes | 5.04 kB | – |
| README.md | 3.38 kB | – |
| ggml-c4ai-command-r-plus-104b-f16-00001-of-00005.gguf | 48.4 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-f16-00002-of-00005.gguf | 42.9 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-f16-00003-of-00005.gguf | 42.1 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-f16-00004-of-00005.gguf | 42.9 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-f16-imatrix.dat | 27.5 MB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq1_m.gguf | 25.2 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq1_s.gguf | 23.2 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq2_m.gguf | 36 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq2_s.gguf | 33.3 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq2_xs.gguf | 31.6 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq2_xxs.gguf | 28.6 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq3_m.gguf | 47.7 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq3_s.gguf | 46 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq3_xs.gguf | 43.6 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq3_xxs.gguf | 40.7 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq4_xs-00001-of-00002.gguf | 48.7 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-iq4_xs-00002-of-00002.gguf | 7.55 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-q5_k_s-00001-of-00002.gguf | 48.3 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-q5_k_s-00002-of-00002.gguf | 23.5 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-q6_k-00001-of-00002.gguf | 49 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-q6_k-00002-of-00002.gguf | 36.1 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-q8_0-00001-of-00003.gguf | 48.9 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-q8_0-00002-of-00003.gguf | 46.3 GB | ✓ |
| ggml-c4ai-command-r-plus-104b-q8_0-00003-of-00003.gguf | 15.1 GB | ✓ |

Each LFS file's most recent commit is "Upload <filename> with huggingface_hub"; README.md was last changed by "Update README.md". All files were last updated 7 months ago.
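The larger quants above are split into numbered shards following the `-NNNNN-of-NNNNN.gguf` naming produced by llama.cpp's `gguf-split` tool. A minimal sketch of fetching one quant with `huggingface_hub` (the `shard_names` helper is illustrative and not part of this repo; `snapshot_download` with `allow_patterns` is standard `huggingface_hub` API):

```python
def shard_names(prefix: str, n_shards: int) -> list[str]:
    """Build the llama.cpp-style split filenames for a sharded GGUF,
    e.g. <prefix>-00001-of-00003.gguf through <prefix>-00003-of-00003.gguf."""
    return [f"{prefix}-{i:05d}-of-{n_shards:05d}.gguf" for i in range(1, n_shards + 1)]

# The q8_0 quant in the listing above is split into three shards:
q8_0_shards = shard_names("ggml-c4ai-command-r-plus-104b-q8_0", 3)

# To actually download them (network access required, so commented out here):
# from huggingface_hub import snapshot_download
# snapshot_download(
#     repo_id="dranger003/c4ai-command-r-plus-iMat.GGUF",
#     allow_patterns=["ggml-c4ai-command-r-plus-104b-q8_0-*.gguf"],
# )
```

Recent llama.cpp builds can load a split model when pointed at the first shard (`-00001-of-NNNNN.gguf`), picking up the remaining shards from the same directory automatically.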