Context window
#9
by MeleeMech - opened
I've been reading through the literature and can't find an answer for this. How do I determine the context window size of this model? Is it 4K tokens? Additionally, is there a way to determine the context window of other models as well? Forgive me if this is a newbie question.
I think it's 2000 tokens:
https://huggingface.co/NousResearch/Nous-Hermes-13b/blob/main/tokenizer_config.json#L21
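If it helps, here's a rough sketch of how you can check this programmatically with the `transformers` library (the values are read straight from the repo's config files, so they're only as reliable as what's published there):

```python
from transformers import AutoConfig, AutoTokenizer

repo = "NousResearch/Nous-Hermes-13b"

# The model config's max_position_embeddings is the positional-embedding limit
# the architecture was trained with.
config = AutoConfig.from_pretrained(repo)
print("max_position_embeddings:", config.max_position_embeddings)

# The tokenizer's model_max_length comes from tokenizer_config.json
# (the line linked above); it may differ from the model config.
tokenizer = AutoTokenizer.from_pretrained(repo)
print("tokenizer model_max_length:", tokenizer.model_max_length)
```

When the two numbers disagree, the model config's `max_position_embeddings` is usually the better guide to what the model was actually trained on.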
Thanks, I noticed that line in the config, but when I tested it the model behaved in a way that seemed to take more than 2k tokens' worth of input into account.