Fix wrong model_max_length
#4 by andstor - opened
The model has a context window of 2048 (n_positions). The tokenizer should also support the same length.
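For anyone hitting this before the fix lands on the Hub, a minimal sketch (the repo id below is a placeholder, not the actual model) of checking the tokenizer's model_max_length against the model's n_positions and overriding it locally:

```python
from transformers import AutoConfig, AutoTokenizer

repo_id = "your-org/your-model"  # hypothetical placeholder for the affected repo

config = AutoConfig.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)

print("n_positions:", config.n_positions)                # model context window, e.g. 2048
print("model_max_length:", tokenizer.model_max_length)   # value from tokenizer_config.json

# Work around a wrong tokenizer_config.json until the hub file is corrected
if tokenizer.model_max_length != config.n_positions:
    tokenizer.model_max_length = config.n_positions
```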