gpjt committed
Commit ba3848b
1 Parent(s): 65b3ec6

Added missing prompt template to sample code.

Files changed (1):
  1. README.md +11 -1
README.md CHANGED
@@ -22,7 +22,17 @@ import time
 import torch
 from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
 
-from prompt import prompt_template
+
+prompt_template = """
+<s>[INST] <<SYS>>
+You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.
+
+If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
+<</SYS>>
+
+{question} [/INST]
+{response}
+"""
 
 
 def ask_question(model, tokenizer, question):
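
The `{question}` and `{response}` placeholders suggest the template is filled with `str.format` before the text is handed to the tokenizer. A minimal sketch of that step, with the system message shortened for brevity; the `format` call and the empty `response` (leaving the model to generate the answer) are assumptions, since the body of `ask_question` is not shown in this diff:

```python
# Abbreviated version of the template added in this commit (assumption:
# the full system message from the diff would normally go between the
# <<SYS>> markers).
prompt_template = """
<s>[INST] <<SYS>>
You are a helpful, respectful and honest assistant.
<</SYS>>

{question} [/INST]
{response}
"""

# Fill in the placeholders; an empty response leaves the model to
# continue generating after [/INST].
prompt = prompt_template.format(
    question="What is the capital of France?",
    response="",
)
print(prompt)
```

This mirrors the Llama 2 chat convention the template follows: the system message sits inside `<<SYS>> ... <</SYS>>`, the user turn inside `[INST] ... [/INST]`, and the assistant's reply follows the closing `[/INST]`.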