config.json is missing
config.json is missing, which makes it impossible to use this model with vLLM.
this is a GGUF model; that's what makes it impossible to use with vLLM.
That's not what the vLLM docs seem to say:
See: https://docs.vllm.ai/en/stable/quantization/supported_hardware.html
and: https://docs.vllm.ai/en/latest/getting_started/examples/gguf_inference.html
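For reference, the GGUF example in those vLLM docs boils down to something like the sketch below: you point `model` at a local .gguf file and borrow the tokenizer from the original (non-GGUF) base repo, since the GGUF repo has no config.json or tokenizer files. The local filename here is an assumption (check the repo's file list); `CohereForAI/aya-23-8B` is the base model this GGUF quant was made from.

```python
from vllm import LLM, SamplingParams

# Sketch, assuming a GGUF file was downloaded locally from the
# bartowski/aya-23-8B-GGUF repo (filename is hypothetical):
llm = LLM(
    model="./aya-23-8B-Q4_K_M.gguf",    # local GGUF checkpoint
    tokenizer="CohereForAI/aya-23-8B",  # tokenizer/config come from the base repo
)

outputs = llm.generate(
    ["Hello, how are you?"],
    SamplingParams(temperature=0.8, max_tokens=32),
)
print(outputs[0].outputs[0].text)
```

This sidesteps the missing config.json because vLLM reads the model metadata out of the GGUF file itself and only needs the base repo for the tokenizer.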
But maybe I misunderstand a subtlety...
well shit, maybe you're right. I'll look into this, sorry for dismissing it!
Just to let you know, an issue was opened on the vLLM project. See: https://github.com/vllm-project/vllm/issues/4416#issuecomment-2316593886
Hoping it helps.
yeah, it would be nice if they supported a missing config.json; otherwise I think I could look into adding them..
Same kind of error when trying to use it with transformers:

  File "/home/pcarceller/softs/anaconda3/envs/llmdoc/lib/python3.10/site-packages/transformers/utils/hub.py", line 456, in cached_file
    raise EnvironmentError(
OSError: bartowski/aya-23-8B-GGUF does not appear to have a file named config.json.
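For what it's worth, recent transformers versions (4.41+) can load a GGUF checkpoint directly via the `gguf_file` argument, which avoids the config.json lookup that raises the OSError above. A minimal sketch, assuming the quant filename below exists in the repo (it's a guess, check the repo's file list):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bartowski/aya-23-8B-GGUF"
gguf_file = "aya-23-8B-Q4_K_M.gguf"  # hypothetical filename, verify against the repo

# transformers reads the config out of the GGUF metadata itself,
# so no config.json is needed in the repo (the weights are dequantized on load).
tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=gguf_file)
model = AutoModelForCausalLM.from_pretrained(model_id, gguf_file=gguf_file)
```

Note this dequantizes the weights to full precision in memory, so it needs considerably more RAM than running the GGUF through llama.cpp.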