Can't find 'adapter_config.json' at 'IlyaGusev/saiga_llama3_8b'
#2 opened by SergeyOvchinnikov
Hello Ilya!
First of all, thank you for your work and for the Saiga models!
I am trying to load the model with PeftConfig, but it fails when it tries to download the "adapter_config.json" file.
There is such a file in the saiga2 repository, but I don't see one in saiga3.
My code:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftConfig, PeftModel

model_base = AutoModelForCausalLM.from_pretrained(
    base_model_name,  # here I have meta-llama/Meta-Llama-3-8B-Instruct
    torch_dtype=torch.float16,
    device_map="auto",
    token=auth_token,
    config=generation_config,
)
peft_config = PeftConfig.from_pretrained(model_name)  # this line fails
model = PeftModel.from_pretrained(
    model_base,
    model_name,  # here I have IlyaGusev/saiga_llama3_8b
    torch_dtype=torch.float16,
    config=peft_config,
)
```
Please advise.
Hi! This repo contains only the final merged model; I didn't publish the adapters this time.
So you should just use AutoModelForCausalLM.from_pretrained("IlyaGusev/saiga_llama3_8b", ...) directly, without PEFT.
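A minimal sketch of loading the merged model directly, without any PEFT classes. The `token` parameter is only an assumption carried over from the question's `auth_token`; it is optional if the repo is public:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Merged checkpoint: loads like any regular causal LM, no adapter_config.json needed.
MODEL_ID = "IlyaGusev/saiga_llama3_8b"

def load_saiga(token=None):
    # Tokenizer and full merged weights come straight from the model repo.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, token=token)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,
        device_map="auto",
        token=token,
    )
    return tokenizer, model
```

Since the adapters were already merged into the base weights, wrapping the result in PeftModel would be redundant even if an adapter_config.json existed.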
IlyaGusev changed discussion status to closed