Context length
#9
by nedrad88s - opened
Thanks for the model!
What is the model's context length?
In config.json:
"max_position_embeddings": 4096
Is it the context length?
Thanks!
No problem!
We SFTed the model with a maximum sequence length of 2048 tokens.
The model may still perform beyond that sequence length (up to the 4096-token position-embedding limit), but we haven't run any tests in that regard.
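For anyone else checking this: the position-embedding limit can be read directly from the model's config.json. A minimal sketch using only the standard library (the config content below is an illustrative fragment, not the full file):

```python
import json

# Illustrative fragment of a config.json; real configs have many more fields.
config_text = '{"max_position_embeddings": 4096, "model_type": "llama"}'
config = json.loads(config_text)

# max_position_embeddings is the maximum sequence length the position
# embeddings support -- an upper bound on the usable context length,
# which may differ from the sequence length used during fine-tuning.
context_length = config["max_position_embeddings"]
print(context_length)  # 4096
```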
Thanks for the quick response!
nedrad88s changed discussion status to closed