akx/Poro-34B-gguf (3 likes)
Tags: GGUF · Inference Endpoints
License: apache-2.0
Poro-34B-gguf · 2 contributors · History: 7 commits
Latest commit: 6065d77 (verified) "1000B" by akx, 9 months ago
File                  Size       LFS  Last commit message                                Updated
.gitattributes        1.69 kB         Upload ggml-model-Q5_K.gguf with huggingface_hub   about 1 year ago
README.md             494 Bytes       1000B                                              9 months ago
ggml-model-Q3_K.gguf  18.6 GB    LFS  1000B                                              9 months ago
ggml-model-Q4_K.gguf  22.4 GB    LFS  1000B                                              9 months ago
ggml-model-Q5_K.gguf  26.1 GB    LFS  1000B                                              9 months ago