Getting a size mismatch error while loading the PEFT model
RuntimeError: Error(s) in loading state_dict for PeftModelForCausalLM:
size mismatch for base_model.model.transformer.h.0.self_attention.query_key_value.lora_A.default.weight: copying a param with shape torch.Size([32, 4096]) from checkpoint, the shape in current model is torch.Size([16, 4096]).
size mismatch for base_model.model.transformer.h.0.self_attention.query_key_value.lora_B.default.weight: copying a param with shape torch.Size([8192, 16, 1]) from checkpoint, the shape in current model is torch.Size([12288, 16]).
size mismatch for base_model.model.transformer.h.1.self_attention.query_key_value.lora_A.default.weight: copying a param with shape torch.Size([32, 4096]) from checkpoint, the shape in current model is torch.Size([16, 4096]).
size mismatch for base_model.model.transformer.h.1.self_attention.query_key_value.lora_B.default.weight: copying a param with shape torch.Size([8192, 16, 1]) from checkpoint, the shape in current model is torch.Size([12288, 16]).
size mismatch for base_model.model.transformer.h.2.self_attention.query_key_value.lora_A.default.weight: copying a param
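For context, a mismatch like this (checkpoint lora_A of shape [32, 4096] vs. [16, 4096] in the current model) usually means the LoRA rank r used when rebuilding the model differs from the rank stored in the checkpoint's adapter_config.json, and the differing lora_B output dimension (8192 vs. 12288) suggests the adapter may be attached to a different base model than it was trained on. Below is a minimal loading sketch that reads the saved adapter config instead of re-creating a LoraConfig by hand; the adapter directory and base model are placeholders, not taken from the original report:

```python
# Minimal sketch, assuming the adapter was saved with save_pretrained().
# "path/to/adapter" is a placeholder, not a path from the original post.
from transformers import AutoModelForCausalLM
from peft import PeftConfig, PeftModel

adapter_dir = "path/to/adapter"  # directory containing adapter_config.json and adapter weights

# Read the LoRA settings (r, target_modules, ...) the checkpoint was trained with,
# rather than constructing a new LoraConfig with a possibly different rank.
peft_config = PeftConfig.from_pretrained(adapter_dir)

# The base model must match the one the adapter was trained on; here it is
# taken from the saved config instead of being hard-coded.
base_model = AutoModelForCausalLM.from_pretrained(peft_config.base_model_name_or_path)

# Attach the adapter; the LoRA weight shapes now follow adapter_config.json.
model = PeftModel.from_pretrained(base_model, adapter_dir)
```

If the model is instead rebuilt with get_peft_model() and a hand-written LoraConfig, the r value (and the base model) must match what is recorded in the checkpoint's adapter_config.json, otherwise the lora_A/lora_B tensors will not line up with the saved state_dict.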