Error when I load the model
Hello,
I don't know why: the model was working well two months ago, but when I try to use it again I get this error message (from simply loading the model):
from transformers import AutoModelForMaskedLM
model = AutoModelForMaskedLM.from_pretrained("kiddothe2b/hierarchical-transformer-base-4096", trust_remote_code=True)
ValueError Traceback (most recent call last)
in <cell line: 3>()
1 # Load model directly
2 from transformers import AutoModelForMaskedLM
----> 3 model = AutoModelForMaskedLM.from_pretrained("kiddothe2b/hierarchical-transformer-base-4096", trust_remote_code=True)
1 frames
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
485 model_class.register_for_auto_class(cls.__name__)
486 else:
--> 487 cls.register(config.__class__, model_class, exist_ok=True)
488 return model_class.from_pretrained(
489 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py in register(cls, config_class, model_class, exist_ok)
511 """
512 if hasattr(model_class, "config_class") and model_class.config_class != config_class:
--> 513 raise ValueError(
514 "The model class you are passing has a config_class
attribute that is not consistent with the "
515 f"config class you passed (model has {model_class.config_class} and you passed {config_class}. Fix "
ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers_modules.kiddothe2b.hierarchical-transformer-base-4096.9638d60e28ef77b8794dccc73969e9155271ff35.modelling_hat.HATConfig'> and you passed <class 'transformers_modules.kiddothe2b.hierarchical-transformer-base-4096.9638d60e28ef77b8794dccc73969e9155271ff35.configuration_hat.HATConfig'>. Fix one of those so they match!
Hi @mathiasmarciano, I'm running into the same error:
doc_classifier = AutoModelForSequenceClassification.from_pretrained("kiddothe2b/hierarchical-transformer-base-4096", trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/Prasanthi/myenv/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 555, in from_pretrained
cls.register(config.__class__, model_class, exist_ok=True)
File "/data/Prasanthi/myenv/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 581, in register
raise ValueError(
ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers_modules.kiddothe2b.hierarchical-transformer-base-4096.1578b2b5e0d8ad6f79f07ad13b27513db8ffe99e.modelling_hat.HATConfig'> and you passed <class 'transformers_modules.kiddothe2b.hierarchical-transformer-base-4096.1578b2b5e0d8ad6f79f07ad13b27513db8ffe99e.configuration_hat.HATConfig'>. Fix one of those so they match!
I was able to resolve it by using transformers version 4.29 or 4.28.
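In case it helps anyone else, here is a minimal sketch of that workaround. The exact patch release (4.29.2) is my own pick; per the report above, any 4.29.x or 4.28.x build should behave the same.

# Downgrade first, e.g.: pip install "transformers==4.29.2"
import transformers
from transformers import AutoModelForMaskedLM

# Sanity check that the downgraded version is actually the one in use.
assert transformers.__version__.startswith(("4.29", "4.28")), transformers.__version__

# With the older release, the config_class consistency check shown in the
# tracebacks above no longer raises, and the repo's custom HAT code loads.
model = AutoModelForMaskedLM.from_pretrained(
    "kiddothe2b/hierarchical-transformer-base-4096",
    trust_remote_code=True,  # required: the model repo ships custom modeling code
)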