OPT has `max_embedding_size` 2050
#3 · by TimeRobber · opened
If you check that torch checkpoint file and load the tensors, you'll notice that the position embedding size is actually 2050 instead of 2048. Should we update that value in the config?
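For reference, a minimal way to check this yourself (assuming the checkpoint is the standard `pytorch_model.bin` and the weight key is `model.decoder.embed_positions.weight`, as in the transformers OPT port; adjust the path and key for your checkpoint):

```python
import torch

# Load the raw checkpoint on CPU; no model class is needed to inspect shapes.
state_dict = torch.load("pytorch_model.bin", map_location="cpu")

# Key name assumed from the transformers OPT port.
pos_emb = state_dict["model.decoder.embed_positions.weight"]
print(pos_emb.shape)  # e.g. torch.Size([2050, hidden_size]) -- 2 more than 2048
```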
I think this is due to the offset of the position embeddings: https://github.com/huggingface/transformers/blob/d0acc9537829e7d067edbb791473bbceb2ecf056/src/transformers/models/opt/modeling_opt.py#L96-L97

The embedding table is allocated with 2 extra rows (the offset), which is where the 2050 comes from, so 2048 should be correct in the config.
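To make the offset concrete, here is a sketch of what the linked `OPTLearnedPositionalEmbedding` does (simplified from memory of that file; see the URL above for the exact code): the table gets `num_embeddings + 2` rows, so a config value of 2048 produces a weight of shape `(2050, hidden_size)`.

```python
import torch
from torch import nn

class OPTLearnedPositionalEmbedding(nn.Embedding):
    """Learned position embeddings with OPT's offset-by-2 quirk (sketch)."""

    def __init__(self, num_embeddings: int, embedding_dim: int):
        # OPT reserves 2 extra rows, so the table has num_embeddings + 2
        # entries even though only num_embeddings positions are addressed
        # after the offset is applied.
        self.offset = 2
        super().__init__(num_embeddings + self.offset, embedding_dim)

    def forward(self, positions: torch.Tensor) -> torch.Tensor:
        # Shift position ids by the offset before the lookup
        # (the real forward derives positions from the attention mask).
        return super().forward(positions + self.offset)

# 2048 usable positions -> a (2050, hidden) weight, matching the checkpoint.
emb = OPTLearnedPositionalEmbedding(2048, 768)
print(emb.weight.shape)  # torch.Size([2050, 768])
```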
TimeRobber changed discussion status to closed