DeBERTa max length

#12
by dataminer1 - opened

inputs = tokenizer(
    text,
    add_special_tokens=True,
    max_length=1024,
    padding='max_length',
    truncation=True,
)
If I pass max_length=1024, the tokenizer accepts it and doesn't throw an error, even though max_position_embeddings is 512. Does that mean the model can take any input size? The model runs perfectly, though.
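
For reference, here is a minimal sketch (the checkpoint name "microsoft/deberta-v3-base" is only an assumption for illustration) that compares the tokenizer's declared limit with the model config's position limit. The tokenizer simply pads/truncates to whatever max_length you request and never validates it against the model's configured limit, which is why the call above raises no error:

from transformers import AutoConfig, AutoTokenizer

model_name = "microsoft/deberta-v3-base"   # assumed checkpoint, for illustration only
tokenizer = AutoTokenizer.from_pretrained(model_name)
config = AutoConfig.from_pretrained(model_name)

print("tokenizer.model_max_length:", tokenizer.model_max_length)
print("config.max_position_embeddings:", config.max_position_embeddings)

# The tokenizer only pads/truncates to the requested max_length;
# it does not check the model's configured position limit.
inputs = tokenizer(
    "some long text " * 400,
    add_special_tokens=True,
    max_length=1024,
    padding="max_length",
    truncation=True,
    return_tensors="pt",
)
print("input_ids shape:", inputs["input_ids"].shape)   # (1, 1024)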

Hi,
Can we increase max_length to, say, 2048 if we increase the positional embeddings to 2048? Is that possible? Or is it possible to increase max_length by fine-tuning the model on a larger context size?
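
One way to experiment with this is sketched below, under the assumption that overriding the config value at load time is enough to get a longer sequence through the model; the checkpoint name and the 2048 value are illustrative, and whether the result is actually usable without further fine-tuning on long inputs is exactly the open question:

from transformers import AutoModel, AutoTokenizer

model_name = "microsoft/deberta-v3-base"   # assumed checkpoint, for illustration only
tokenizer = AutoTokenizer.from_pretrained(model_name)

# from_pretrained forwards unrecognized kwargs to the config, so
# max_position_embeddings can be overridden here; ignore_mismatched_sizes
# tolerates any position-related weights whose shapes no longer match.
model = AutoModel.from_pretrained(
    model_name,
    max_position_embeddings=2048,
    ignore_mismatched_sizes=True,
)

enc = tokenizer(
    "a very long document " * 800,
    max_length=2048,
    padding="max_length",
    truncation=True,
    return_tensors="pt",
)
out = model(**enc)
print(out.last_hidden_state.shape)   # expected (1, 2048, hidden_size)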
