Error when using `.save_pretrained` due to a non-contiguous tensor
I loaded my checkpoint with `AutoPeftModelForCausalLM.from_pretrained`, then merged the adapter into the base model with `merge_and_unload`. However, when I tried to save the result using `model.save_pretrained(output_merged_dir, safe_serialization=True)`, an error occurred.
Here is my code snippet:
```python
import os

import torch
from peft import AutoPeftModelForCausalLM

# Load the fine-tuned checkpoint with its LoRA adapter attached.
model = AutoPeftModelForCausalLM.from_pretrained(
    "./final_checkpoint",
    device_map="auto",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

# Merge the adapter weights into the base model and drop the PEFT wrappers.
model = model.merge_and_unload()

# Save the merged model in safetensors format.
output_merged_dir = "./final_merged_checkpoint"
os.makedirs(output_merged_dir, exist_ok=True)
model.save_pretrained(output_merged_dir, safe_serialization=True)
```
It throws this error:
```
ValueError: You are trying to save a non contiguous tensor: transformer.h.0.attn.c_attn.weight
which is not allowed. It either means you are trying to save tensors which are reference of
each other in which case it's recommended to save only the full tensors, and reslice at load
time, or simply call .contiguous() on your tensor to pack it before saving.
```
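
For reference, here is the workaround the error message itself suggests, repacking any non-contiguous parameter with `.contiguous()` before saving. This loop is my own untested sketch, not an official PEFT/transformers API:

```python
# Sketch of the fix hinted at by the error: repack any non-contiguous
# parameter tensors before serializing to safetensors.
# (My own workaround, assuming the merged model is already in memory.)
for param in model.parameters():
    if not param.data.is_contiguous():
        param.data = param.data.contiguous()

model.save_pretrained(output_merged_dir, safe_serialization=True)
```

Is this the recommended fix, or is the non-contiguous `transformer.h.0.attn.c_attn.weight` a sign that `merge_and_unload` produced aliased tensors that should be handled differently?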