Error in loading model.
#2 by iambulb - opened
Hi! Thanks for this initiative. I tried to load your model but could not.
There seems to be a trivial issue with the configuration.
My code:
from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("susnato/phi-2")
Error in short: the key phi does not exist in the CONFIG_MAPPING dictionary in configuration_auto.py.
Full error stacktrace:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/path/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1064, in from_pretrained
config_class = CONFIG_MAPPING[config_dict["model_type"]]
~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 761, in __getitem__
raise KeyError(key)
KeyError: 'phi'
iambulb changed discussion title from Error in Loading model. to Error in loading model.
Hi @iambulb, could you please update your transformers version and rerun it?
pip install -U transformers
You must have transformers version >= 4.36 for it to run.
Let me know if it solves your issue.
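As a side note, a minimal sketch of a guard that fails early with a clearer message than the KeyError, assuming (as stated above) that the phi model type first shipped in transformers 4.36. The helper names are illustrative, not part of the transformers API:

```python
def parse_version(version: str) -> tuple:
    """Convert a version string like "4.36.2" into a comparable tuple (4, 36, 2).

    Pre-release suffixes are ignored for simplicity; missing parts default to 0.
    """
    parts = [int(p) for p in version.split(".")[:3] if p.isdigit()]
    while len(parts) < 3:
        parts.append(0)
    return tuple(parts)


def supports_phi(installed_version: str) -> bool:
    # The "phi" model type was added to CONFIG_MAPPING in transformers 4.36,
    # so older versions raise KeyError: 'phi' on from_pretrained.
    return parse_version(installed_version) >= parse_version("4.36")


# Example: check (e.g. transformers.__version__) before calling from_pretrained.
if not supports_phi("4.35.2"):
    print("transformers >= 4.36 is required for model_type 'phi'; "
          "run: pip install -U transformers")
```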
Hi. Thanks for the quick reply. I updated and it now works.
I was on the most recent version before 4.36, i.e. 4.35, so I assumed I would not need to update. But it makes sense now, since your model card was updated so recently.
iambulb changed discussion status to closed