jartine committed
Commit 9693476
1 Parent(s): 27679d8

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -66,7 +66,7 @@ Command template:
 ```
 
 The maximum context size of this model is 32768 tokens. These llamafiles
-use a default context size of 512 tokens. Whenever you need the maximum
+use a default context size of 4096 tokens. Whenever you need the maximum
 context size to be available with llamafile for any given model, you can
 pass the `-c 0` flag. The default temperature for these llamafiles is
 0.8 because it helps for this model. It can be tuned, e.g. `--temp 0`.
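
For context on the changed paragraph, a minimal invocation sketch: the filename `model.llamafile` is a placeholder (substitute the actual llamafile published in this repository), while `-c 0` and `--temp 0` are the flags the README text itself describes for requesting the model's full 32768-token context and overriding the 0.8 default temperature.

```sh
# Placeholder filename; use the llamafile shipped in this repository.
./model.llamafile -c 0 --temp 0
# -c 0     requests the model's maximum context size (32768 tokens)
# --temp 0 overrides the default temperature of 0.8
```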