mlabonne committed
Commit 260e9d2
Parent: 66ceb1e

Update README.md

Files changed (1): README.md +2 -0
README.md CHANGED
@@ -21,6 +21,8 @@ This model is a Mixture of Experts (MoE) made with [mergekit](https://github.com
 * [maywell/PiVoT-0.1-Starling-LM-RP](https://huggingface.co/maywell/PiVoT-0.1-Starling-LM-RP)
 * [WizardLM/WizardMath-7B-V1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1)
 
+The recommended context length is 8k.
+
 ## ⚡ Quantized models
 
 Thanks to TheBloke for the quantized models:
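
For reference, a mergekit-moe configuration along the following lines could produce a merge like this one. This is a minimal sketch, not the commit's actual config: the base model, `gate_mode`, `dtype`, and `positive_prompts` are illustrative assumptions, and the hunk above shows only two of the experts.

```yaml
# Hypothetical mergekit-moe config sketch. Only the two expert repos visible
# in the diff hunk come from the source; everything else is an assumption.
base_model: mistralai/Mistral-7B-v0.1   # assumed Mistral-7B base for the 7B experts
gate_mode: hidden                        # route tokens via hidden-state embeddings of the prompts
dtype: bfloat16
experts:
  - source_model: maywell/PiVoT-0.1-Starling-LM-RP
    positive_prompts:
      - "Write an engaging roleplay response in character"
  - source_model: WizardLM/WizardMath-7B-V1.1
    positive_prompts:
      - "Solve the following math problem step by step"
```

A config like this would be run with `mergekit-moe config.yaml ./merged-model`, which assembles the listed experts into a single Mixtral-style MoE checkpoint.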