Transformers Issue?

#1 opened by jgsmcmahon

Please can you advise on the following issue:

The checkpoint you are trying to load has model type idefics3 but Transformers does not recognize this architecture.

I am using
Name: transformers
Version: 4.44.0

With GPU RTX 3090
NVIDIA-SMI 535.104.05 Driver Version: 535.104.05 CUDA Version: 12.2
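
For context, a minimal sketch of the kind of loading code that triggers this error (the model id and the use of AutoModelForVision2Seq are assumptions; the exact script may differ):

from transformers import AutoModelForVision2Seq

# Model id is an assumption; substitute the Idefics3 checkpoint being loaded.
model_id = "HuggingFaceM4/Idefics3-8B-Llama3"

# On transformers 4.44.0 this raises:
# "The checkpoint you are trying to load has model type idefics3 but
#  Transformers does not recognize this architecture."
model = AutoModelForVision2Seq.from_pretrained(model_id)
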
Kind regards

You need the dev version with a pull request merged into it if you want to test the model right now, as stated at the top of the model card:

Transformers version: until the next Transformers pypi release, please install Transformers from source and use this PR to be able to use Idefics3. TODO: change when new version.

If done correctly, the model works for inference. I'm currently testing LoRA fine-tuning; QLoRA didn't work (instant OOM).
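
For reference, a minimal sketch of the kind of LoRA setup meant here, using peft; the rank, alpha, and target-module names are assumptions, not a tested Idefics3 recipe:

import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForVision2Seq

# Model id is an assumption; substitute the Idefics3 checkpoint you use.
model = AutoModelForVision2Seq.from_pretrained(
    "HuggingFaceM4/Idefics3-8B-Llama3",
    torch_dtype=torch.bfloat16,
)

# LoRA hyperparameters and target modules below are illustrative assumptions.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
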

HuggingFaceM4 org

Thanks @ayyylemao. Note that @merve is also working on a QLoRA fine-tuning setup, which she will release when it's ready.

HuggingFaceM4 org

Try the following.

# clone the fork that contains the Idefics3 support branch
git clone https://github.com/andimarafioti/transformers.git
cd transformers
git checkout idefics3
# install Transformers from this checkout
pip install -q "."
cd ..
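
A quick way to check that the source install worked (a sketch; the model id is an assumption):

import transformers
from transformers import AutoConfig

# Should print a dev/source version rather than 4.44.0.
print(transformers.__version__)

# Once the idefics3 branch is installed, this no longer raises the
# "does not recognize this architecture" error.
config = AutoConfig.from_pretrained("HuggingFaceM4/Idefics3-8B-Llama3")
print(config.model_type)  # "idefics3"
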
