Docker image llama.cpp

#1
by ThaiAn - opened

I'm using the llama.cpp Docker image `ghcr.io/ggerganov/llama.cpp:server`, but I can't run inference with my model. Do you have the prompt format?
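A minimal sketch of how the server image is typically started and queried, assuming a GGUF model at `/path/to/models/model.gguf` (the path and model name are placeholders, not from this thread). The server exposes an HTTP `/completion` endpoint that takes a raw prompt string, so any chat template has to be baked into the prompt yourself:

```shell
# Start the server, mounting the host model directory into the container
# (requires Docker and a GGUF model file; adjust paths to your setup)
docker run -v /path/to/models:/models -p 8080:8080 \
  ghcr.io/ggerganov/llama.cpp:server \
  -m /models/model.gguf --host 0.0.0.0 --port 8080

# Query the completion endpoint with a plain prompt
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Building a website can be done in 10 simple steps:", "n_predict": 128}'
```

If the model was fine-tuned for chat, the prompt usually needs to follow that model's chat template (for example, instruction tags around the user message) rather than being sent as plain text.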

ThaiAn changed discussion title from Docker infer to Docker image llama.cpp
