Update README.md
README.md
CHANGED
@@ -66,7 +66,7 @@ Command template:
 ```
 
 The maximum context size of this model is 32768 tokens. These llamafiles
-use a default context size of
+use a default context size of 4096 tokens. Whenever you need the maximum
 context size to be available with llamafile for any given model, you can
 pass the `-c 0` flag. The default temperature for these llamafiles is
 0.8 because it helps for this model. It can be tuned, e.g. `--temp 0`.
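For reference, the flags mentioned in the changed paragraph are ordinary llamafile command-line options. A sketch of how they might be combined (the model filename and prompt are illustrative, not from this repository):

```shell
# -c 0 requests the model's full context window (32768 tokens here)
# instead of the 4096-token default; --temp 0 makes sampling greedy.
# The .llamafile name below is a placeholder.
./model.llamafile -c 0 --temp 0 -p "Summarize this document:"
```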