No module named 'transformers.models.chameleon.configuration_chameleon'

#1
by Hrithik2212 - opened

Installations

!pip install torch==2.0.1 torchvision==0.15.2 transformers==4.37.2 tiktoken==0.6.0 verovio==4.3.1 accelerate==0.28.0

Code

from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('srimanth-d/GOT_CPU', trust_remote_code=True)
model = AutoModel.from_pretrained('srimanth-d/GOT_CPU', trust_remote_code=True, low_cpu_mem_usage=True, use_safetensors=True, pad_token_id=tokenizer.eos_token_id)
model = model.eval()

Error

RuntimeError: Failed to import transformers.models.chameleon.configuration_chameleon because of the following error (look up to see its traceback):
No module named 'transformers.models.chameleon.configuration_chameleon'
  1. How can I solve this error and perform inference on CPU?
  2. BTW, is there any way to do batch inference with GOT_CPU or the model from the original repo?

I am running this on a Kaggle CPU notebook.

The full error is

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1603, in _get_module(self, module_name)
      0 <Error retrieving source code with stack_data see ipython/ipython#13598>

File /opt/conda/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
    125         level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)

File <frozen importlib._bootstrap>:1050, in _gcd_import(name, package, level)

File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)

File <frozen importlib._bootstrap>:1004, in _find_and_load_unlocked(name, import_)

ModuleNotFoundError: No module named 'transformers.models.chameleon.configuration_chameleon'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
Cell In[54], line 4
      1 from transformers import AutoModel, AutoTokenizer
      3 tokenizer = AutoTokenizer.from_pretrained('srimanth-d/GOT_CPU', trust_remote_code=True)
----> 4 model = AutoModel.from_pretrained('srimanth-d/GOT_CPU', trust_remote_code=True, low_cpu_mem_usage=True, use_safetensors=True, pad_token_id=tokenizer.eos_token_id)
      5 model = model.eval()

File /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:541, in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
      0 <Error retrieving source code with stack_data see ipython/ipython#13598>

File /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:752, in keys(self)
    750 def _load_attr_from_module(self, model_type, attr):
    751     module_name = model_type_to_module_name(model_type)
--> 752     if module_name not in self._modules:
    753         self._modules[module_name] = importlib.import_module(f".{module_name}", "transformers.models")
    754     return getattribute_from_module(self._modules[module_name], attr)

File /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:753, in <listcomp>(.0)
    751 module_name = model_type_to_module_name(model_type)
    752 if module_name not in self._modules:
--> 753     self._modules[module_name] = importlib.import_module(f".{module_name}", "transformers.models")
    754 return getattribute_from_module(self._modules[module_name], attr)

File /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:749, in _load_attr_from_module(self, model_type, attr)
      0 <Error retrieving source code with stack_data see ipython/ipython#13598>

File /opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:693, in getattribute_from_module(module, attr)
    688             result.append(model)
    690     return result
--> 693 def getattribute_from_module(module, attr):
    694     if attr is None:
    695         return None

File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1593, in __getattr__(self, name)
      0 <Error retrieving source code with stack_data see ipython/ipython#13598>

File /opt/conda/lib/python3.10/site-packages/transformers/utils/import_utils.py:1605, in _get_module(self, module_name)
      0 <Error retrieving source code with stack_data see ipython/ipython#13598>

RuntimeError: Failed to import transformers.models.chameleon.configuration_chameleon because of the following error (look up to see its traceback):
No module named 'transformers.models.chameleon.configuration_chameleon'
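For what it's worth, the chameleon model family does not exist in transformers 4.37.2 at all; the module the traceback complains about ships only in newer releases, so pinning 4.37.2 while code paths touch the chameleon registry triggers exactly this import failure. Upgrading transformers (e.g. `pip install -U transformers`) is the usual fix. A small standard-library check, as a sketch, to confirm whether your installed version provides the module before loading the model:

```python
import importlib.util

def has_chameleon_config():
    """Return True if the installed transformers provides the module the
    traceback above fails to import (it ships only in newer releases)."""
    try:
        spec = importlib.util.find_spec(
            "transformers.models.chameleon.configuration_chameleon"
        )
    except ModuleNotFoundError:
        # transformers itself (or its chameleon subpackage) is missing
        return False
    return spec is not None

print(has_chameleon_config())
```

If this prints False, the installed transformers is too old (or absent) and the AutoModel call will fail the same way.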

It seems to run fine when I tested it out. Please check.
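On the batching question: as far as I can tell the remote GOT code only exposes a per-image helper, so on CPU the practical approach is a plain loop. A hypothetical sketch — the `chat(tokenizer, image_file, ocr_type=...)` signature is assumed from the GOT-OCR2.0 model card, so adjust it to whatever the remote code actually exposes:

```python
# Hypothetical batching sketch: the remote GOT code (per the model card)
# offers a single-image chat() helper, so "batch inference" on CPU is
# simply a sequential loop over image paths.
def ocr_batch(model, tokenizer, image_files):
    # model.chat(tokenizer, image_file, ocr_type='ocr') is assumed from
    # the GOT-OCR2.0 model card, not verified against the remote code.
    return [model.chat(tokenizer, path, ocr_type="ocr") for path in image_files]
```

There is no real speedup over one-at-a-time calls here; it just keeps results ordered and in one place.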
