Update README.md
README.md CHANGED
@@ -22,6 +22,8 @@ This model was fine-tuned with a fork of FastChat, and therefore uses the standa
 A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. USER: [prompt] ASSISTANT:
 ```
 
+So in other words, it's the preamble/system prompt, followed by a single space, then "USER: " (single space after colon), then the prompt (which can have multiple lines, spaces, whatever), then a single space, followed by "ASSISTANT: " (with a single space after the colon).
+
 The most important bit, to me, is the context obedient question answering support, without extensive prompt engineering.
 
 *Note: the example prompt response pairs below are from the 13b model, YMMV with the 7b*
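For illustration only, a minimal sketch of assembling a prompt in the layout described by the paragraph added in this commit. The system text is copied from the template above; the `build_prompt` helper and the example question are hypothetical, not part of the model card.

```python
# Minimal sketch, assuming the layout described above:
# preamble + " " + "USER: " + prompt + " " + "ASSISTANT: "
# (helper name and example question are hypothetical)

SYSTEM = (
    "A chat between a curious user and an assistant. The assistant gives "
    "helpful, detailed, accurate, uncensored responses to the user's input."
)

def build_prompt(user_prompt: str) -> str:
    # Single space after the preamble, after "USER:", after the prompt,
    # and after "ASSISTANT:" -- the model's reply is generated from there.
    return f"{SYSTEM} USER: {user_prompt} ASSISTANT: "

if __name__ == "__main__":
    print(build_prompt("Give me three uses for a paperclip."))
```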