Repository Not Found
Hi, I was trying to run the inference example but I'm getting a Repository Not Found error when loading the snapshot:
```python
from huggingface_hub import snapshot_download

model_url = "Apollo-LMMs/Apollo-3B-t32"
model_path = snapshot_download(model_url, repo_type="model")
```
```
Repository Not Found for url: https://huggingface.co/api/models/Apollo-LMMs/Apollo-3B-t32/revision/main.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.
```
Was the repo recently moved? Thanks in advance.
Replacing the model URL with `GoodiesHere/Apollo-LMMs-Apollo-7B-t32` solves that issue, but now I'm getting the following when trying to load the model:

```
ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has None and you passed <class 'transformers_modules.6f910e1fb9dc0fe202f83e7815aa40575de73b5c.configuration_apollo.ApolloConfig'>. Fix one of those so they match!
```

I did some searching, but the solutions mentioned there didn't work for me. Any help is appreciated!
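For context, the check behind that ValueError can be sketched roughly like this. This is a simplified stand-in, not the actual transformers source, and the Apollo class names below are placeholders: the dynamically loaded model class ends up with `config_class = None`, which no longer matches the `ApolloConfig` instance that `from_pretrained` resolved.

```python
# Simplified illustration of the config_class consistency check
# (placeholder classes; not the real Apollo or transformers code).
class ApolloConfig:
    pass

class ApolloModel:
    # In the failing setup the remote model class never gets its
    # config_class wired up, so it stays None and mismatches ApolloConfig.
    config_class = None

def validate(model_cls, config):
    """Raise the same kind of mismatch error that transformers reports."""
    if model_cls.config_class is not type(config):
        raise ValueError(
            "The model class you are passing has a `config_class` attribute "
            "that is not consistent with the config class you passed "
            f"(model has {model_cls.config_class} and you passed {type(config)}). "
            "Fix one of those so they match!"
        )

try:
    validate(ApolloModel, ApolloConfig())
except ValueError as err:
    print(f"raises: {err}")
```

In other words, the remote modeling code and the installed transformers version disagree about how `config_class` gets set, which is why matching the library version the remote code was written against can resolve it.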
Getting the same error, @zifanwangsteven. Did you manage to find a solution? I have tried specifying an explicit config, and using both a reference to a cloned repo and the cached model downloaded with snapshot_download, but haven't been able to make it work.
`pip install transformers==4.37` worked for me.
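In case it's useful, here's a tiny self-contained check (just a sketch; the 4.37 pin is simply what worked above) to confirm the installed transformers version is in the pinned series before retrying the load:

```python
def in_series(installed: str, pin: str = "4.37") -> bool:
    """True if `installed` (e.g. "4.37.2") is in the pinned major.minor series."""
    return installed.split(".")[:2] == pin.split(".")[:2]

# Against the live environment you could do:
#   import importlib.metadata
#   in_series(importlib.metadata.version("transformers"))
print(in_series("4.37.2"))  # True
print(in_series("4.47.0"))  # False
```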