I can't load this model. I tried two ways and both fail: the first with "does not appear to have a file named preprocessor_config.json", the second with "Unrecognized configuration class ... for this kind of AutoModel: AutoModelForCausalLM".
Load model directly
from transformers import AutoProcessor, AutoModelForCausalLM
processor = AutoProcessor.from_pretrained("wisdomik/Quilt-Llava-v1.5-7b")
model = AutoModelForCausalLM.from_pretrained("wisdomik/Quilt-Llava-v1.5-7b")
Result:
EntryNotFoundError: 404 Client Error. (Request ID: Root=1-6703d1be-5df2e1e0074cfe3e4006e3bc;22c223b3-0748-4404-bdd2-0165de0f047b)
Entry Not Found for url: https://hf-mirror.com/wisdomik/Quilt-Llava-v1.5-7b/resolve/main/preprocessor_config.json.
The above exception was the direct cause of the following exception:
OSError: wisdomik/Quilt-Llava-v1.5-7b does not appear to have a file named preprocessor_config.json. Checkout 'https://huggingface.co/wisdomik/Quilt-Llava-v1.5-7b/tree/main' for available files.
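Since the repo is missing preprocessor_config.json, I thought about assembling the processor by hand. A minimal sketch of what I'd try, assuming the vision tower is openai/clip-vit-large-patch14-336 (the one LLaVA-v1.5 normally uses; I haven't confirmed that for this checkpoint) and that the tokenizer files are present in the repo:

from transformers import AutoTokenizer, CLIPImageProcessor, LlavaProcessor

# Assumption: reuse the CLIP ViT-L/14-336 image processor in place of the
# missing preprocessor_config.json; I'm not sure this matches what the
# authors intended.
tokenizer = AutoTokenizer.from_pretrained("wisdomik/Quilt-Llava-v1.5-7b")
image_processor = CLIPImageProcessor.from_pretrained("openai/clip-vit-large-patch14-336")
processor = LlavaProcessor(image_processor=image_processor, tokenizer=tokenizer)

I don't know whether this is the intended way to get a working processor, though.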
Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="wisdomik/Quilt-Llava-v1.5-7b")
Result:
ValueError: Could not load model wisdomik/Quilt-Llava-v1.5-7b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,). See the original errors:
while loading with AutoModelForCausalLM, an error is thrown:
Traceback (most recent call last):
File "xxxxxx/python3.11/site-packages/transformers/pipelines/base.py", line 288, in infer_framework_load_model
model = model_class.from_pretrained(model, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data4/liuxingyu/.conda/envs/lxy/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 567, in from_pretrained
raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.llava.configuration_llava.LlavaConfig'> for this kind of AutoModel: AutoModelForCausalLM.
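From this traceback, transformers does recognize the config as LlavaConfig, it just isn't supported by AutoModelForCausalLM. Should I be using LlavaForConditionalGeneration instead? A minimal sketch, assuming the checkpoint's weights are in the Hugging Face LLaVA layout (which I'm not sure about, since the repo may use the original LLaVA codebase format):

from transformers import LlavaForConditionalGeneration

# Assumption: LlavaConfig pairs with LlavaForConditionalGeneration; this may
# still fail if the weights use the original LLaVA naming rather than the
# converted Hugging Face format.
model = LlavaForConditionalGeneration.from_pretrained("wisdomik/Quilt-Llava-v1.5-7b")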
What should I do to load and use this model? Thanks.