some1nostr / Ostrich-70B
Likes: 5
Tags: Text Generation · GGUF · Inference Endpoints · conversational
License: apache-2.0
Files and versions
Ostrich-70B · 1 contributor · History: 118 commits
Latest commit 6bc1e75 (verified): "Upload ostrich-70b-8614-Q4_0.gguf" by some1nostr, 6 months ago
| File | Size | LFS | Last commit | Updated |
|---|---|---|---|---|
| .gitattributes | 5.96 kB | – | Upload ostrich-70b-8614-Q4_0.gguf | 6 months ago |
| README.md | 1.57 kB | – | Update README.md | 6 months ago |
| ostrich-70b-7314-Q5_K_M.gguf | 50 GB | LFS | Upload ostrich-70b-7314-Q5_K_M.gguf | 6 months ago |
| ostrich-70b-7914-iq1s.gguf | 15.3 GB | LFS | Upload ostrich-70b-7914-iq1s.gguf | 6 months ago |
| ostrich-70b-7914-iq3xs.gguf | 29.3 GB | LFS | Upload ostrich-70b-7914-iq3xs.gguf | 6 months ago |
| ostrich-70b-7914-q8_0-00001-of-00002.gguf | 43 GB | LFS | Upload ostrich-70b-7914-q8_0-00001-of-00002.gguf | 6 months ago |
| ostrich-70b-7914-q8_0-00002-of-00002.gguf | 32 GB | LFS | Upload ostrich-70b-7914-q8_0-00002-of-00002.gguf | 6 months ago |
| ostrich-70b-8014-iq2xs.gguf | 21.1 GB | LFS | Upload ostrich-70b-8014-iq2xs.gguf | 6 months ago |
| ostrich-70b-8014-iq4xs.gguf | 37.9 GB | LFS | Upload ostrich-70b-8014-iq4xs.gguf | 6 months ago |
| ostrich-70b-8064-q4.gguf | 40 GB | LFS | Upload ostrich-70b-8064-q4.gguf | 6 months ago |
| ostrich-70b-8114-Q4_0.gguf | 40 GB | LFS | Upload ostrich-70b-8114-Q4_0.gguf | 6 months ago |
| ostrich-70b-8314-Q4_0.gguf | 40 GB | LFS | Upload ostrich-70b-8314-Q4_0.gguf | 6 months ago |
| ostrich-70b-8314-Q6_K-00001-of-00002.gguf | 42.8 GB | LFS | Upload 2 files | 6 months ago |
| ostrich-70b-8314-Q6_K-00002-of-00002.gguf | 15.1 GB | LFS | Upload 2 files | 6 months ago |
| ostrich-70b-8464-Q4_0.gguf | 40 GB | LFS | Upload ostrich-70b-8464-Q4_0.gguf | 6 months ago |
| ostrich-70b-8614-Q4_0.gguf | 40 GB | LFS | Upload ostrich-70b-8614-Q4_0.gguf | 6 months ago |

All files are marked "Safe" by Hugging Face's file scanner.
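
The listing above contains several GGUF quantizations of the same 70B model. A minimal sketch, assuming the `huggingface_hub` Python package is installed, of pulling one of the listed files locally (here the ~40 GB Q4_0 build) for use with a GGUF-compatible runtime such as llama.cpp; the repo id and filename are taken from the listing, everything else is illustrative:

```python
# Minimal sketch: download one of the listed GGUF quantizations from the repo.
# Assumes `pip install huggingface_hub`; the destination path is managed by the
# Hugging Face cache unless you pass local_dir.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="some1nostr/Ostrich-70B",
    filename="ostrich-70b-8614-Q4_0.gguf",  # filename copied from the file table above
)
print("Downloaded to:", local_path)
```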