some1nostr / Ostrich-70B (5 likes)

Text Generation · GGUF · Inference Endpoints · conversational
License: apache-2.0
Files and versions

Ostrich-70B · 1 contributor · History: 126 commits
Latest commit: c898cc7 (verified) · some1nostr · Upload ostrich-70b-9320-IQ2_XS.gguf · 6 months ago
File                                         Size      Last commit message                                Updated
.gitattributes                               6.22 kB   Upload ostrich-70b-9320-IQ2_XS.gguf                6 months ago
README.md                                    1.57 kB   Update README.md                                   6 months ago
ostrich-70b-7314-Q5_K_M.gguf                 50 GB     Upload ostrich-70b-7314-Q5_K_M.gguf                6 months ago
ostrich-70b-7914-iq1s.gguf                   15.3 GB   Upload ostrich-70b-7914-iq1s.gguf                  6 months ago
ostrich-70b-7914-iq3xs.gguf                  29.3 GB   Upload ostrich-70b-7914-iq3xs.gguf                 6 months ago
ostrich-70b-7914-q8_0-00001-of-00002.gguf    43 GB     Upload ostrich-70b-7914-q8_0-00001-of-00002.gguf   6 months ago
ostrich-70b-7914-q8_0-00002-of-00002.gguf    32 GB     Upload ostrich-70b-7914-q8_0-00002-of-00002.gguf   6 months ago
ostrich-70b-8014-iq2xs.gguf                  21.1 GB   Upload ostrich-70b-8014-iq2xs.gguf                 6 months ago
ostrich-70b-8014-iq4xs.gguf                  37.9 GB   Upload ostrich-70b-8014-iq4xs.gguf                 6 months ago
ostrich-70b-8314-Q6_K-00001-of-00002.gguf    42.8 GB   Upload 2 files                                     6 months ago
ostrich-70b-8314-Q6_K-00002-of-00002.gguf    15.1 GB   Upload 2 files                                     6 months ago
ostrich-70b-8614-Q4_0.gguf                   40 GB     Upload ostrich-70b-8614-Q4_0.gguf                  6 months ago
ostrich-70b-8914-IQ1_S.gguf                  15.3 GB   Upload ostrich-70b-8914-IQ1_S.gguf                 6 months ago
ostrich-70b-8914-Q4_0.gguf                   40 GB     Upload ostrich-70b-8914-Q4_0.gguf                  6 months ago
ostrich-70b-9320-IQ1_S.gguf                  15.3 GB   Upload ostrich-70b-9320-IQ1_S.gguf                 6 months ago
ostrich-70b-9320-IQ2_XS.gguf                 21.1 GB   Upload ostrich-70b-9320-IQ2_XS.gguf                6 months ago

All files are flagged Safe by the repository scan; the .gguf files are stored with Git LFS.
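The repository ships GGUF quantizations only, so the files above are intended to be downloaded individually and run with a GGUF-capable runtime. A minimal sketch, assuming the huggingface_hub and llama-cpp-python packages (neither is prescribed by this repository; they are simply one common way to fetch and load a GGUF file, and the quality/size trade-off of the chosen quantization is up to you):

    # Minimal sketch; assumes `pip install huggingface_hub llama-cpp-python` (not part of this repo).
    from huggingface_hub import hf_hub_download
    from llama_cpp import Llama

    # Fetch one of the smaller quantizations listed above (~21 GB) into the local HF cache.
    model_path = hf_hub_download(
        repo_id="some1nostr/Ostrich-70B",
        filename="ostrich-70b-9320-IQ2_XS.gguf",
    )

    # Load the GGUF file; context size and other parameters are illustrative assumptions.
    llm = Llama(model_path=model_path, n_ctx=4096)
    out = llm("Explain what a Nostr relay does in one sentence.", max_tokens=128)
    print(out["choices"][0]["text"])

For the split files (for example the q8_0 and Q6_K shards named 00001-of-00002 / 00002-of-00002), both shards need to be downloaded into the same directory; recent llama.cpp builds can load a sharded GGUF by pointing at the first part.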