Cannot load finetuned model due to changes in *_RW.py files
#6
by marcovirgolin - opened
Hi there.
I finetuned the Falcon model prior to the changes to the configuration and modelling files (from `*_RW.py` to `*_falcon.py`).
Now, when I try to load my local model, I get: `Could not locate the configuration_RW.py inside tiiuae/falcon-rw-1b.`
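For concreteness, here is a minimal sketch of the failing load (`local_path` is a hypothetical path to my finetuned checkpoint directory; I assume the checkpoint's `auto_map` still points at the old `*_RW.py` file names on the Hub):

```python
from transformers import AutoModelForCausalLM

local_path = "path/to/my-finetuned-falcon"  # hypothetical checkpoint directory

# Presumably the checkpoint's auto_map still references configuration_RW.py
# in tiiuae/falcon-rw-1b, so this now fails with:
# "Could not locate the configuration_RW.py inside tiiuae/falcon-rw-1b."
model = AutoModelForCausalLM.from_pretrained(local_path, trust_remote_code=True)
```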
I tried passing the `revision` field, pointing to the commit hash that precedes the changes, as shown in the snippet below. But I still get the same error, and via debugging I can see that `revision` is `None` by the time the call goes through the transformers library code.
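Put together, the workaround attempt looks like this (same hypothetical `local_path` as above):

```python
from transformers import AutoModelForCausalLM

local_path = "path/to/my-finetuned-falcon"  # hypothetical checkpoint directory

# Pin the commit that precedes the *_RW.py -> *_falcon.py rename.
# The same error is raised anyway; debugging shows revision arrives
# as None inside the transformers library code.
model = AutoModelForCausalLM.from_pretrained(
    local_path,
    trust_remote_code=True,
    revision="7fb349a0a7b09213458a45a2861342c7f2d2d3fc",
)
```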
Can you help me with this? It'd be great if backward compatibility is not lost.