PEFT error when loading an adapter
Greetings,
I get a KeyError when trying to run PeftModel.from_pretrained(), and I hope someone can provide insights. The code and error are below. Thank you.
Code:
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the base model, then attach the per-group adapter
model = AutoModelForCausalLM.from_pretrained(
    "haoranxu/X-ALMA-13B-Pretrain",
    torch_dtype=torch.float16,
    offload_folder="offload/",
    device_map="auto",
)
model = PeftModel.from_pretrained(
    model,
    f"haoranxu/X-ALMA-13B-Group{group_id}",
    offload_folder="offload/",
)
Error:
File "/Users/joelbranch/Python Projects/slm-experimentation/demo.py", line 24, in <module>
    model = PeftModel.from_pretrained(model, f"haoranxu/X-ALMA-13B-Group{group_id}", offload_folder="offload/")
File "/opt/anaconda3/envs/slm-experimentation/lib/python3.12/site-packages/peft/peft_model.py", line 581, in from_pretrained
    load_result = model.load_adapter(
File "/opt/anaconda3/envs/slm-experimentation/lib/python3.12/site-packages/peft/peft_model.py", line 1290, in load_adapter
    self._update_offload(offload_index, adapters_weights)
File "/opt/anaconda3/envs/slm-experimentation/lib/python3.12/site-packages/peft/peft_model.py", line 1121, in _update_offload
    safe_module = dict(self.named_modules())[extended_prefix]
KeyError: 'base_model.model.model.model.embed_tokens'
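For context, the line that fails inside peft's _update_offload is a plain dictionary lookup over named_modules(), and the key it builds appears to contain one "model." segment too many for this architecture. Below is a minimal sketch of that mismatch; the "actual" module name is an assumption based on how PeftModel usually wraps a causal LM (base_model -> the *ForCausalLM -> its decoder -> embed_tokens), not something I verified against this 13B checkpoint:

```python
# Illustrative reconstruction of the failing lookup (not peft's exact code).
modules = {
    # Assumed real module name: one "model" level fewer than the error key.
    "base_model.model.model.embed_tokens": "<embedding module>",
}

# The prefix peft builds, taken verbatim from the traceback above.
extended_prefix = "base_model.model.model.model.embed_tokens"

try:
    modules[extended_prefix]
except KeyError as exc:
    # Reproduces the shape of the reported error.
    print(f"KeyError: {exc}")
```

If that guess is right, the adapter-loading code is prepending an extra "model." when it resolves offloaded weights, which would make this a prefix-construction issue in the offload path rather than a problem with the checkpoint itself.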