some1nostr / Ostrich-70B

Text Generation · GGUF · conversational · Inference Endpoints
License: apache-2.0
Files and versions

1 contributor · History: 137 commits
Latest commit 0ec56d6 (verified, 8 months ago) by some1nostr: Upload ostrich-70b-9320-Q5_K_M.gguf
| File | Size | LFS | Last commit | Updated |
|---|---|---|---|---|
| .gitattributes | 6.63 kB | | Upload ostrich-70b-9320-Q5_K_M.gguf | 8 months ago |
| README.md | 1.57 kB | | Update README.md | 8 months ago |
| ostrich-70b-7314-Q5_K_M.gguf | 50 GB | LFS | Upload ostrich-70b-7314-Q5_K_M.gguf | 8 months ago |
| ostrich-70b-7914-q8_0-00001-of-00002.gguf | 43 GB | LFS | Upload ostrich-70b-7914-q8_0-00001-of-00002.gguf | 8 months ago |
| ostrich-70b-7914-q8_0-00002-of-00002.gguf | 32 GB | LFS | Upload ostrich-70b-7914-q8_0-00002-of-00002.gguf | 8 months ago |
| ostrich-70b-8314-Q6_K-00001-of-00002.gguf | 42.8 GB | LFS | Upload 2 files | 8 months ago |
| ostrich-70b-8314-Q6_K-00002-of-00002.gguf | 15.1 GB | LFS | Upload 2 files | 8 months ago |
| ostrich-70b-8914-Q4_0.gguf | 40 GB | LFS | Upload ostrich-70b-8914-Q4_0.gguf | 8 months ago |
| ostrich-70b-9320-IQ1_S.gguf | 15.3 GB | LFS | Upload ostrich-70b-9320-IQ1_S.gguf | 8 months ago |
| ostrich-70b-9320-IQ2_XS.gguf | 21.1 GB | LFS | Upload ostrich-70b-9320-IQ2_XS.gguf | 8 months ago |
| ostrich-70b-9320-IQ3_XS.gguf | 29.3 GB | LFS | Upload ostrich-70b-9320-IQ3_XS.gguf | 8 months ago |
| ostrich-70b-9320-IQ4_XS.gguf | 37.9 GB | LFS | Upload ostrich-70b-9320-IQ4_XS.gguf | 8 months ago |
| ostrich-70b-9320-Q4_0.gguf | 40 GB | LFS | Upload ostrich-70b-9320-Q4_0.gguf | 8 months ago |
| ostrich-70b-9320-Q5_K_M.gguf | 50 GB | LFS | Upload ostrich-70b-9320-Q5_K_M.gguf | 8 months ago |
| ostrich-70b-9320-q8_0-00001-of-00002.gguf | 43 GB | LFS | Upload 2 files | 8 months ago |
| ostrich-70b-9320-q8_0-00002-of-00002.gguf | 32 GB | LFS | Upload 2 files | 8 months ago |

All files are marked "Safe" by the Hugging Face file scanner; the .gguf quantized weights are stored via Git LFS.
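For reference, a minimal sketch of fetching one of the quantized GGUF files listed above with the huggingface_hub Python client. The repo id and filename are taken from this listing; choosing the Q4_0 quant and printing the cache path are illustrative assumptions, not guidance from the model author.

```python
# Minimal sketch: download one GGUF quant from this repo via huggingface_hub.
# Assumes `pip install huggingface_hub`; the filename is copied verbatim from
# the file listing above, and the Q4_0 quant is an arbitrary example choice.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="some1nostr/Ostrich-70B",
    filename="ostrich-70b-9320-Q4_0.gguf",  # ~40 GB, stored via Git LFS
)

# The returned path points into the local Hugging Face cache; load it with a
# GGUF-capable runtime (for example llama.cpp) to run the model.
print(gguf_path)
```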