mradermacher/Hermes-2-Pro-Llama-3-70B-i1-GGUF
Tags: Transformers · GGUF · teknium/OpenHermes-2.5 · English · Llama-3 · instruct · finetune · chatml · DPO · RLHF · gpt4 · synthetic data · distillation · function calling · json mode · axolotl · Inference Endpoints · imatrix · conversational
1 contributor · History: 27 commits
Latest commit d539584 (verified) by mradermacher: uploaded from nethype/kaos, 4 months ago
.gitattributes                                    3.25 kB          uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ1_M.gguf           16.8 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ1_S.gguf           15.3 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ2_M.gguf           24.1 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ2_S.gguf           22.2 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ2_XS.gguf          21.1 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ2_XXS.gguf         19.1 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ3_M.gguf           31.9 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ3_S.gguf           30.9 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ3_XS.gguf          29.3 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ3_XXS.gguf         27.5 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-IQ4_XS.gguf          37.9 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q2_K.gguf            26.4 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q3_K_L.gguf          37.1 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q3_K_M.gguf          34.3 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q3_K_S.gguf          30.9 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q4_0.gguf            40.1 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q4_K_M.gguf          42.5 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q4_K_S.gguf          40.3 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q5_K_M.gguf          49.9 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q5_K_S.gguf          48.7 GB    LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q6_K.gguf.part1of2   29 GB      LFS   uploaded from nethype/kaos   4 months ago
Hermes-2-Pro-Llama-3-70B.i1-Q6_K.gguf.part2of2   28.9 GB    LFS   uploaded from nethype/kaos   4 months ago
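The Q6_K quant is stored as two `.partNofM` files because it exceeds the per-file upload limit. These parts are typically plain byte splits (not self-describing llama.cpp shards), so a minimal sketch of reassembly is simple concatenation; filenames below are taken from the listing above:

```shell
# Reassemble the split Q6_K quant by concatenating the raw parts in order.
# The result is the original single .gguf file.
cat Hermes-2-Pro-Llama-3-70B.i1-Q6_K.gguf.part1of2 \
    Hermes-2-Pro-Llama-3-70B.i1-Q6_K.gguf.part2of2 \
    > Hermes-2-Pro-Llama-3-70B.i1-Q6_K.gguf
```

If the repo's README specifies a different reassembly procedure, follow that instead.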
README.md                                        5.04 kB          auto-patch README.md         4 months ago
imatrix.dat                                      24.9 MB    LFS   uploaded from nethype/kaos   4 months ago