legraphista committed
Commit 7e0eb2a
1 Parent(s): 65d0406

Update README.md

Files changed (1)
  1. README.md +3 -11
README.md CHANGED
@@ -36,7 +36,6 @@ IMatrix dataset: [here](https://gist.githubusercontent.com/bartowski1182/eb213dc
  - [All Quants](#all-quants)
  - [Downloading using huggingface-cli](#downloading-using-huggingface-cli)
  - [Inference](#inference)
- - [Simple chat template](#simple-chat-template)
  - [Chat template with system prompt](#chat-template-with-system-prompt)
  - [Llama.cpp](#llama-cpp)
  - [FAQ](#faq)
@@ -110,16 +109,9 @@ huggingface-cli download legraphista/Reflection-Llama-3.1-70B-IMat-GGUF --includ
 
 ## Inference
 
-### Simple chat template
-```
-<|begin_of_text|><|start_header_id|>user<|end_header_id|>
-
-{user_prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
-
-{assistant_response}<|eot_id|><|start_header_id|>user<|end_header_id|>
-
-{next_user_prompt}<|eot_id|>
-```
+> [!IMPORTANT]
+> Make sure to set the system prompt:
+> `You are a world-class AI system, capable of complex reasoning and reflection. Reason through the query inside <thinking> tags, and then provide your final response inside <output> tags. If you detect that you made a mistake in your reasoning at any point, correct yourself inside <reflection> tags.`
 
 ### Chat template with system prompt
 ```
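
For context on the note this commit adds, here is a minimal sketch (not part of the commit) of passing the required system prompt when running one of these GGUF quants through llama-cpp-python. The quant filename, context size, and example user message are placeholders I chose for illustration; `create_chat_completion` applies the chat template stored in the GGUF metadata.

```python
from llama_cpp import Llama

# System prompt quoted from the note added in this commit.
SYSTEM_PROMPT = (
    "You are a world-class AI system, capable of complex reasoning and reflection. "
    "Reason through the query inside <thinking> tags, and then provide your final "
    "response inside <output> tags. If you detect that you made a mistake in your "
    "reasoning at any point, correct yourself inside <reflection> tags."
)

# Placeholder path: use whichever quant file you downloaded from this repo.
llm = Llama(model_path="Reflection-Llama-3.1-70B.Q4_K.gguf", n_ctx=4096)

# create_chat_completion renders the messages with the model's chat template,
# so the system prompt ends up in the Llama 3.1 system header.
result = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "What is larger, 9.9 or 9.11?"},
    ],
    max_tokens=512,
)
print(result["choices"][0]["message"]["content"])
```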