fuzzy-mittenz committed (verified)
Commit 54d1366 · 1 Parent(s): 16d9390

Update README.md

Files changed (1):
  1. README.md +36 -0
README.md CHANGED
@@ -28,6 +28,42 @@ tags:
 
 ![dolphin qstar.png](https://cdn-uploads.huggingface.co/production/uploads/6593502ca2607099284523db/_-42RD6RGPB-BZ51evsNc.png)
 
+
+ ## GPT4ALL Chat Template
+ ```
+ {{- '<|im_start|>system\n' }}
+ {% if toolList|length > 0 %}You have access to the following functions:
+ {% for tool in toolList %}
+ Use the function '{{tool.function}}' to: '{{tool.description}}'
+ {% if tool.parameters|length > 0 %}
+ parameters:
+ {% for info in tool.parameters %}
+ {{info.name}}:
+ type: {{info.type}}
+ description: {{info.description}}
+ required: {{info.required}}
+ {% endfor %}
+ {% endif %}
+ # Tool Instructions
+ If you CHOOSE to call this function, reply ONLY with the following format:
+ '{{tool.symbolicFormat}}'
+ Here is an example. If the user says, '{{tool.examplePrompt}}', then you reply
+ '{{tool.exampleCall}}'
+ After the result you might reply with '{{tool.exampleReply}}'
+ {% endfor %}
+ You MUST include both the start and end tags when you use a function.
+
+ You are a helpful, aware AI assistant made by Intelligent Estate who uses these functions to break down, analyze, perform, and verify complex reasoning tasks, and who verifies answers with them where possible.
+ {% endif %}
+ {{- '<|im_end|>\n' }}
+ {% for message in messages %}
+ {{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}
+ {% endfor %}
+ {% if add_generation_prompt %}
+ {{ '<|im_start|>assistant\n' }}
+ {% endif %}
+ ```
+
 This model was converted to GGUF format from [`cognitivecomputations/Dolphin3.0-Qwen2.5-3b`](https://huggingface.co/cognitivecomputations/Dolphin3.0-Qwen2.5-3b) using llama.cpp.
 Refer to the [original model card](https://huggingface.co/cognitivecomputations/Dolphin3.0-Qwen2.5-3b) for more details on the model.
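
The chat template added above uses Jinja-style syntax with GPT4All-specific variables (`toolList`, `messages`, `add_generation_prompt`). As a quick sanity check, the sketch below renders an abbreviated version of it with Python's `jinja2` package. This assumes GPT4All's template engine behaves like standard Jinja for these constructs, and it reproduces only the system header and message loop, not the full tool-calling branch.

```python
# Hedged sketch: render an abbreviated form of the chat template above with
# standard Jinja2. Assumption: GPT4All's renderer behaves like Jinja2 here.
# The tool-calling branch is reduced to its first line for brevity.
from jinja2 import Template

chat_template = Template(
    "{{- '<|im_start|>system\\n' }}"
    "{% if toolList|length > 0 %}You have access to the following functions:{% endif %}"
    "{{- '<|im_end|>\\n' }}"
    "{% for message in messages %}"
    "{{ '<|im_start|>' + message['role'] + '\\n' + message['content'] + '<|im_end|>' + '\\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|im_start|>assistant\\n' }}{% endif %}"
)

prompt = chat_template.render(
    toolList=[],  # no tool definitions, so the tool branch is skipped
    messages=[{"role": "user", "content": "What does GGUF stand for?"}],
    add_generation_prompt=True,
)
print(prompt)
# Expected shape:
# <|im_start|>system
# <|im_end|>
# <|im_start|>user
# What does GGUF stand for?<|im_end|>
# <|im_start|>assistant
```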
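
Since the README states the model was converted to GGUF with llama.cpp, one way to run such a file locally is through the GPT4All Python bindings. The sketch below is an assumption-laden example: the quantized filename is hypothetical and should be replaced with whichever `.gguf` file from this repository you actually downloaded.

```python
# Hedged sketch: load a local GGUF quantization of this model with the GPT4All
# Python bindings (pip install gpt4all). The filename is hypothetical; use the
# actual .gguf file downloaded from this repository.
from gpt4all import GPT4All

model = GPT4All(
    model_name="Dolphin3.0-Qwen2.5-3b-Q4_K_M.gguf",  # hypothetical filename
    model_path=".",        # directory holding the downloaded .gguf
    allow_download=False,  # only use the local file
)

with model.chat_session():
    reply = model.generate("Explain in one sentence what GGUF is.", max_tokens=128)
    print(reply)
```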