The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed
While loading the model, I am facing this issue:
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py in register(cls, config_class, model_class, exist_ok)
    534         """
    535         if hasattr(model_class, "config_class") and model_class.config_class != config_class:
--> 536             raise ValueError(
    537                 "The model class you are passing has a config_class attribute that is not consistent with the "
    538                 f"config class you passed (model has {model_class.config_class} and you passed {config_class}. Fix "

ValueError: The model class you are passing has a config_class attribute that is not consistent with the config class you passed (model has <class 'transformers.models.bert.configuration_bert.BertConfig'> and you passed <class 'transformers_modules.zhihan1996.DNABERT-2-117M.81ac6a98387cf94bc283553260f3fa6b88cef2fa.configuration_bert.BertConfig'>. Fix one of those so they match!
I encountered this error at first and was able to resolve it by downgrading to transformers version 4.29 or 4.28.
You may also be able to solve it by loading the config first (but I haven't tried that). See here: https://github.com/Zhihan1996/DNABERT_2/issues/22
Thank you @meganbkratz, the above problem was resolved by using transformers version 4.29.
While loading the model, I am facing this issue:
OSError: Go4miii/DISC-FinLLM does not appear to have a file named baichuan-inc/Baichuan-13B-Chat--modeling_baichuan.py. Checkout 'https://huggingface.co/Go4miii/DISC-FinLLM/main' for available files.
I load the model with the following code to avoid downgrading my transformers library (not happening!):
from transformers import AutoModel
from transformers.models.bert.configuration_bert import BertConfig

# Load the built-in BertConfig explicitly, then pass it in so it matches
# the config_class pinned on the model class.
config = BertConfig.from_pretrained("zhihan1996/DNABERT-2-117M")
model = AutoModel.from_pretrained("zhihan1996/DNABERT-2-117M", trust_remote_code=True, config=config)
Let me know if it works for you :)
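For context on why this works: the `register` check in the traceback compares the config *class objects*, not their names. With `trust_remote_code=True`, the repo's own `configuration_bert.py` is loaded under `transformers_modules.…`, so its `BertConfig` is a different class object from the built-in one, even though the names are identical. A minimal sketch of that comparison (hypothetical stand-in classes, not the real transformers code):

```python
# Stand-in for transformers.models.bert.configuration_bert.BertConfig
class BertConfig:
    pass

def make_remote_config_class():
    # Simulates the copy that trust_remote_code loads under transformers_modules:
    # same class name, but a distinct class object.
    class BertConfig:
        pass
    return BertConfig

RemoteBertConfig = make_remote_config_class()

class BertModel:
    config_class = BertConfig  # the model class pins the built-in config

def register(config_class, model_class):
    # Same shape as the check in auto_factory.py that raises in the traceback above
    if hasattr(model_class, "config_class") and model_class.config_class != config_class:
        raise ValueError(
            f"model has {model_class.config_class} and you passed {config_class}"
        )

register(BertConfig, BertModel)  # OK: same class object
try:
    register(RemoteBertConfig, BertModel)  # same name, different class -> mismatch
except ValueError as e:
    print("mismatch:", e)
```

Loading the built-in `BertConfig` first (as in the snippet above) makes both sides of the comparison the same object, which is why that workaround avoids the error without downgrading.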