Error when using vLLM as per original OpenChat instructions
#2 by jtsaint346
Entry Not Found for url: https://huggingface.co/NurtureAI/openchat_3.5-16k/resolve/main/openchat.json.
Sorry about your trouble. Unfortunately, I didn't make vLLM, so I'm not even sure what an openchat.json is.
Oh, sorry. It's because the model card describes how to run it, and I couldn't get it to work. I guess it's a clone of the original card.
I was very keen on trying this with long context but just can't work out what to do.
OpenChat made some useful changes to embed vLLM.
I'll check them out; if I figure it out, I'll let you know. I still want to help, I just don't know much about it. Have you tried the 16k AWQ version that TheBloke made from this model?
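In the meantime, here's a minimal sketch of loading this repo directly with vLLM's Python API, skipping the OpenChat serving script (which is presumably what is looking for openchat.json). The prompt template and settings below are assumptions on my part, not the official instructions:

```python
# Minimal sketch: serve/generate with plain vLLM instead of the OpenChat wrapper.
# Assumptions: the repo loads as a standard HF model, and the OpenChat 3.5
# "GPT4 Correct" chat format applies; adjust if the model card says otherwise.
from vllm import LLM, SamplingParams

llm = LLM(model="NurtureAI/openchat_3.5-16k", max_model_len=16384)

prompt = (
    "GPT4 Correct User: How long is your context window?<|end_of_turn|>"
    "GPT4 Correct Assistant:"
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate([prompt], params)
print(outputs[0].outputs[0].text)
```

If that works, the OpenAI-compatible server (`python -m vllm.entrypoints.openai.api_server --model NurtureAI/openchat_3.5-16k`) should load it the same way, but I haven't verified that with this particular repo.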