Problem running with vLLM

#4 opened by babakgh

Hi,
The command vllm serve "bartowski/Llama-3.3-70B-Instruct-GGUF" returns the following error. How can I fix it?

raise ValueError(f"No supported config format found in {model}")
ValueError: No supported config format found in bartowski/Llama-3.3-70B-Instruct-GGUF
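For reference, this error is raised when vLLM cannot find a config.json in the repo, and GGUF-only repos like this one don't ship that file. As far as I know, vLLM's GGUF support expects a path to a single .gguf file rather than a repo ID, so something along these lines might also be worth trying (a rough, untested sketch; the quant filename and the tokenizer repo below are assumptions, so check the repo's file list first):

    # Download one quant file from the repo (the filename is a guess; pick one that actually exists)
    huggingface-cli download bartowski/Llama-3.3-70B-Instruct-GGUF \
        Llama-3.3-70B-Instruct-Q4_K_M.gguf --local-dir .

    # Serve the local GGUF file; --tokenizer points at the original (non-GGUF) model
    vllm serve ./Llama-3.3-70B-Instruct-Q4_K_M.gguf \
        --tokenizer meta-llama/Llama-3.3-70B-Instruct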

I think you need to add a config.json file with:

{
    "model_type": "llama"
}

Can you confirm whether this works?
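One way to test that idea locally might look like this (sketch only, not verified; whether vLLM accepts a local directory of GGUF files plus this minimal config.json is exactly what needs confirming, and the --include pattern is an assumption about the repo layout):

    # Download only one quant variant to a local directory
    huggingface-cli download bartowski/Llama-3.3-70B-Instruct-GGUF \
        --include "*Q4_K_M*" --local-dir llama-3.3-gguf

    # Add the minimal config.json suggested above, then point vLLM at the directory
    echo '{"model_type": "llama"}' > llama-3.3-gguf/config.json
    vllm serve ./llama-3.3-gguf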
