Found keys that are not in the model state dict but in the checkpoint: ['encoder.model.embeddings.position_ids']

#7
by RuochenZhang - opened

I got an issue when running this model; the message is:

PS C:\Users\v-ruozha> & C:/Users/v-ruozha/AppData/Local/Programs/Python/Python311/python.exe c:/Users/v-ruozha/Desktop/en_zh_data/en_zh_CometSrc.py
Lightning automatically upgraded your loaded checkpoint from v1.8.2 to v2.2.3. To apply the upgrade to your files permanently, run python -m pytorch_lightning.utilities.upgrade_checkpoint C:\Users\v-ruozha\.cache\huggingface\hub\models--Unbabel--wmt22-cometkiwi-da\snapshots\b3a8aea5a5fc22db68a554b92b3d96eb6ea75cc9\checkpoints\model.ckpt
Encoder model frozen.
C:\Users\v-ruozha\AppData\Local\Programs\Python\Python311\Lib\site-packages\pytorch_lightning\core\saving.py:188: Found keys that are not in the model state dict but in the checkpoint: ['encoder.model.embeddings.position_ids']

I already logged in to Hugging Face with the CLI command, but I still get this issue.
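For context, a minimal sketch of how the model is loaded and run with the comet library (the segments below are placeholders, not my actual en-zh data):

from comet import download_model, load_from_checkpoint

# Download the checkpoint from the Hugging Face Hub (the model is gated,
# so this requires being logged in) and load it.
model_path = download_model("Unbabel/wmt22-cometkiwi-da")
model = load_from_checkpoint(model_path)

# wmt22-cometkiwi-da is a reference-free (QE) model: each sample needs only
# the source ("src") and the machine translation ("mt").
data = [
    {"src": "The weather is nice today.", "mt": "今天天气很好。"},
]
output = model.predict(data, batch_size=8, gpus=0)  # gpus=1 if a GPU is available
print(output.scores)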

RicardoRei (Unbabel org)

This is just a warning, right? Everything still runs?

If so, you can ignore it. This has to do with some changes between HF transformers versions: the version used to train the model had a slightly different name for the positional embeddings. I tested both the previous transformers version and the new one, and the results are exactly the same. Note that for most COMET models the embedding layers are frozen during training, so they do not change... That's the case for wmt22-cometkiwi-da. The only exception is XCOMET-XXL, but even for that one I was not able to find any difference in results with this warning.
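If you want to double-check on your side, a quick sketch (assuming the model was loaded with load_from_checkpoint as usual; the attribute path mirrors the key shown in the warning):

# Confirm the encoder parameters (embeddings included) are frozen,
# i.e. requires_grad is False for all of them.
frozen = all(not p.requires_grad for p in model.encoder.parameters())
print("Encoder frozen:", frozen)

# The key in the warning points at this embeddings submodule; the missing entry
# is a buffer of position ids, not a learned weight.
print(type(model.encoder.model.embeddings).__name__)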

The other warning you get is similar. PyTorch Lightning used to follow a different structure to save checkpoints, but this is not a breaking change and it converts internally from one structure to the other.
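If you have verified the scores are unaffected and just want cleaner logs, the checkpoint-key message can typically be filtered with Python's standard warnings module (a sketch; it assumes the message really is raised via warnings, which the saving.py:188: prefix suggests):

import warnings

# Silence the "Found keys that are not in the model state dict" warning only;
# other warnings are left untouched.
warnings.filterwarnings(
    "ignore",
    message=r".*Found keys that are not in the model state dict.*",
)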

RicardoRei changed discussion status to closed
