Context length
#1
by mrfakename
What’s the context length?
8K, with a theoretical attention span of 128K tokens (inherited from Mistral-7B's sliding-window attention)
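To see where the ~128K figure comes from: Mistral-7B uses sliding-window attention, where each layer attends to a fixed window of tokens, and information can propagate back one window per layer. A minimal sketch, assuming the values from the Mistral 7B paper (4096-token window, 32 layers):

```python
# Assumptions (from the Mistral 7B paper, not this thread):
# each layer attends over a 4096-token sliding window, and the model
# has 32 transformer layers.
WINDOW_SIZE = 4096   # tokens each layer can attend to directly
NUM_LAYERS = 32      # transformer layers in Mistral-7B

# Information can hop back one window per layer, so the theoretical
# attention span grows linearly with depth.
theoretical_span = WINDOW_SIZE * NUM_LAYERS
print(theoretical_span)  # 131072, i.e. ~128K tokens
```

In practice the usable context is still bounded by the 8K the model was configured with; the 128K is the theoretical reach of stacked sliding windows, not a trained context length.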