some1nostr / Ostrich-70B

Text Generation · GGUF · Inference Endpoints · conversational
License: apache-2.0
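The repository publishes Ostrich-70B only as GGUF quantizations, so the natural way to try it is with a llama.cpp-based runtime. The sketch below is a minimal, unofficial example rather than a documented workflow from the model card: it pulls one of the smaller single-file quantizations listed further down (ostrich-70b-9880-IQ1_S.gguf) with huggingface_hub and chats with it through llama-cpp-python. The context size, GPU-offload setting, and prompt are placeholder assumptions.

```python
# Minimal sketch (assumptions noted in comments): download a single-file
# quantization and chat with it locally.
# Requires: pip install huggingface_hub llama-cpp-python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch the 15.3 GB IQ1_S quantization listed in the file table below.
model_path = hf_hub_download(
    repo_id="some1nostr/Ostrich-70B",
    filename="ostrich-70b-9880-IQ1_S.gguf",
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,       # context window -- an assumption, adjust as needed
    n_gpu_layers=-1,  # offload all layers when a GPU-enabled build is available
)

# The model is tagged "conversational", so use the chat-completion API;
# llama-cpp-python picks up a chat template from the GGUF metadata if present.
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello! Briefly introduce yourself."}],
    max_tokens=256,
)
print(result["choices"][0]["message"]["content"])
```

Even the IQ1_S file needs roughly its on-disk size in memory for the weights, plus room for the KV cache, so pick a quantization from the table whose size matches your available RAM/VRAM.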
Files and versions
1 contributor · History: 160 commits
Latest commit b262d1c (verified, 6 months ago) by some1nostr: Delete ostrich-70b-9406-Q4_0.gguf
| File | Size | LFS | Last commit message | Last updated |
|------|------|-----|---------------------|--------------|
| .gitattributes | 7.33 kB | | Upload ostrich-70b-9880-IQ1_S.gguf | 6 months ago |
| README.md | 2.12 kB | | Update README.md | 6 months ago |
| ostrich-70b-8314-Q6_K-00001-of-00002.gguf | 42.8 GB | LFS | Upload 2 files | 6 months ago |
| ostrich-70b-8314-Q6_K-00002-of-00002.gguf | 15.1 GB | LFS | Upload 2 files | 6 months ago |
| ostrich-70b-9320-IQ4_XS.gguf | 37.9 GB | LFS | Upload ostrich-70b-9320-IQ4_XS.gguf | 6 months ago |
| ostrich-70b-9320-Q5_K_M.gguf | 50 GB | LFS | Upload ostrich-70b-9320-Q5_K_M.gguf | 6 months ago |
| ostrich-70b-9320-q8_0-00001-of-00002.gguf | 43 GB | LFS | Upload 2 files | 6 months ago |
| ostrich-70b-9320-q8_0-00002-of-00002.gguf | 32 GB | LFS | Upload 2 files | 6 months ago |
| ostrich-70b-9638-IQ1_S.gguf | 15.3 GB | LFS | Upload ostrich-70b-9638-IQ1_S.gguf | 6 months ago |
| ostrich-70b-9638-IQ2_XS.gguf | 21.1 GB | LFS | Rename ostrich-70b-9638-iq2xs.gguf to ostrich-70b-9638-IQ2_XS.gguf | 6 months ago |
| ostrich-70b-9638-IQ3_XS.gguf | 29.3 GB | LFS | Upload ostrich-70b-9638-IQ3_XS.gguf | 6 months ago |
| ostrich-70b-9638-Q4_0.gguf | 40 GB | LFS | Rename ostrich-70b-9638-q4.gguf to ostrich-70b-9638-Q4_0.gguf | 6 months ago |
| ostrich-70b-9880-IQ1_S.gguf | 15.3 GB | LFS | Upload ostrich-70b-9880-IQ1_S.gguf | 6 months ago |
| ostrich-70b-9880-Q4_0.gguf | 40 GB | LFS | Upload ostrich-70b-9880-Q4_0.gguf | 6 months ago |
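The Q6_K and Q8_0 builds are stored as split GGUF shards (-00001-of-00002 / -00002-of-00002). Reasonably recent llama.cpp-based loaders can open such a model when pointed at the first shard, as long as the remaining shard sits in the same directory. The sketch below assumes such a build: it downloads both Q6_K pieces into the same Hugging Face cache snapshot and loads the model via the first shard.

```python
# Sketch (assumes a llama.cpp build recent enough to follow multi-part GGUF
# files automatically): download both Q6_K shards, then load via the first one.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

shards = [
    "ostrich-70b-8314-Q6_K-00001-of-00002.gguf",
    "ostrich-70b-8314-Q6_K-00002-of-00002.gguf",
]

# hf_hub_download places files from the same repo revision in one cached
# snapshot directory, which lets the loader resolve the second shard.
paths = [
    hf_hub_download(repo_id="some1nostr/Ostrich-70B", filename=name)
    for name in shards
]

llm = Llama(model_path=paths[0], n_ctx=4096)  # point at shard 00001 only
```

Older builds that do not handle split files may instead need the shards recombined into a single .gguf first, e.g. with llama.cpp's gguf-split tool in merge mode.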