how can I use this jartine/wizardcoder-13b-python model?
#3 opened by udhugg
I found this code in a video for the huge Mozilla/WizardCoder-Python-34B-V1.0-llamafile model.
```python
from ctransformers import AutoModelForCausalLM
import os

model_path = os.path.abspath("./wizardcoder-python-13b.llamafile")
llm = AutoModelForCausalLM.from_pretrained(
    "jartine/wizardcoder-13b-python",
    model_file=model_path,
    model_type="llama",
    # model_format="gguf",
    gpu_layers=50,
)
print(llm("AI is going to"))
```
But I get this error:

```
error loading model: unknown (magic, version) combination: 46715a4d, 273d4470; is this really a GGML file?
```
It's a zip file. Try extracting the gguf file from inside `./wizardcoder-python-13b.llamafile`.
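Since a llamafile is a zip archive with the GGUF weights inside, the extraction step could be scripted. This is a minimal sketch, not the official workflow; the `extract_gguf` helper and the assumption that exactly the first `.gguf` member is wanted are mine:

```python
import os
import zipfile

def extract_gguf(llamafile_path, dest_dir="."):
    """Extract the first .gguf member found inside a llamafile (zip) archive.

    Returns the path of the extracted file, or None if no .gguf member exists.
    """
    with zipfile.ZipFile(llamafile_path) as zf:
        for name in zf.namelist():
            if name.endswith(".gguf"):
                zf.extract(name, dest_dir)
                return os.path.join(dest_dir, name)
    return None
```

The returned path could then be passed as `model_file=` in the `from_pretrained` call instead of the llamafile itself.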
You can also say:

```shell
chmod +x ./wizardcoder-python-13b.llamafile
./wizardcoder-python-13b.llamafile
```
jartine changed discussion status to closed
OK, thanks for your fast help. Great!
I wasn't aware that it was so easy; I only knew how to execute it.