lamhieu committed on
Commit 6ee27ed
1 Parent(s): 08005ee

Upload folder using huggingface_hub

README.md CHANGED
@@ -1,5 +1,797 @@
1
- ---
2
- license: other
3
- license_name: .
4
- license_link: LICENSE
5
- ---
1
+ ---
2
+ base_model: ghost-x/ghost-8b-beta
3
+ language:
4
+ - en
5
+ - vi
6
+ - es
7
+ - pt
8
+ - de
9
+ - it
10
+ - fr
11
+ - ko
12
+ - zh
13
+ license: other
14
+ license_name: ghost-open-llms
15
+ license_link: https://ghost-x.org/ghost-open-llms-license
16
+ tags:
17
+ - ghost
18
+ - tools
19
+ - chat
20
+ - transformers
21
+ - unsloth
22
+ - llama
23
+ pipeline_tag: text-generation
24
+ widget:
25
+ - text: Why is the sky blue ?
26
+ ---
27
+
28
+
29
+ <p><img src="./images/logo.jpeg" width="40%" align="center" /></p>
30
+
31
+ A large language model developed with goals that include excellent multilingual support, superior knowledge capabilities, and cost efficiency.
32
+
33
+ ## Introduction
34
+
35
+ **Ghost 8B Beta (Llama 3 - Ghost 8B Beta)** is a large language model developed with goals that include excellent multilingual support, superior knowledge capabilities, and cost-effectiveness. The model comes in two context length versions, 8k and 128k, along with multilingual function tools support by default.
36
+
37
+ The Ghost 8B Beta model outperforms prominent models such as Llama 3.1 8B Instruct and GPT 3.5 Turbo in the lc_winrate score. It also outperforms Claude 3 Opus, Claude 3 Sonnet, GPT-4, and Mistral Large when comparing the winrate score of AlpacaEval 2.0, [\*](https://ghost-x.org/docs/models/ghost-8b-beta/#alpacaeval-20).
38
+
39
+ ### Updates
40
+
41
+ * **16 Aug 2024**: The model has been updated to version 160824, expanding language support from 9 to 16 languages. This version improves math, reasoning, and instruction following compared to the previous version.
42
+
43
+ ### Thoughts
44
+
45
+ We believe it is possible to optimize language models of modest size to achieve strong cross-lingual understanding and the ability to solve complex tasks. Such models are often noted as being cost-effective to deploy and operate at production level, for both large businesses and startups. Doing this well can partly remove the worries about GPU costs that hinder the development of useful AI ideas and products for people.
46
+
47
+ ### Specifications
48
+
49
+ - Name: **Ghost 8B Beta (aka: Llama 3 - Ghost 8B Beta)**.
50
+ - Version: **160824 (aka: 1608)**, previously **disl-0x5 (aka: d0x5)**.
51
+ - Model size: 8 billion parameters.
52
+ - Context length: 8K (8,192 tokens) / 128K (131,072 tokens).
53
+ - Languages: 🇬🇧 English, 🇻🇳 Vietnamese, 🇰🇷 Korean, 🇪🇸 Spanish, 🇵🇹 Portuguese, 🇨🇳 Chinese, 🇫🇷 French, 🇮🇹 Italian, 🇩🇪 German, 🇯🇵 Japanese, 🇷🇺 Russian, 🇵🇱 Polish, 🇳🇱 Dutch, 🇮🇳 Hindi, 🇹🇷 Turkish, 🇮🇩 Indonesian.
54
+ - Main tasks: as a pretrained model, chat, multi-tasking and function tools.
55
+ - License: [Ghost Open LLMs LICENSE](https://ghost-x.org/ghost-open-llms-license), [Llama 3 LICENSE](https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE).
56
+ - Distributions: Standard (BF16), GGUF, AWQ.
57
+ - Developed by: **Ghost X**, [Hieu Lam](https://huggingface.co/lamhieu).
58
+
59
+ #### Links
60
+
61
+ - Official website: [Ghost 8B Beta](https://ghost-x.org/docs/models/ghost-8b-beta/).
62
+ - Online demo:
63
+ - [Playground with Ghost 8B Beta (β, 8k) on Spaces](https://huggingface.co/spaces/lamhieu/ghost-8b-beta-8k).
64
+ - [Playground with Ghost 8B Beta (β, 128k) on Spaces](https://huggingface.co/spaces/lamhieu/ghost-8b-beta-128k).
65
+
66
+ ### Distributions
67
+
68
+ We provide several distributions so you can choose the access option that best suits your needs.
69
+
70
+ | Version | Model card |
71
+ | ------- | ------------------------------------------------------------------- |
72
+ | BF16 | [🤗 HuggingFace](https://huggingface.co/ghost-x/ghost-8b-beta-1608) |
73
+ | GGUF | [🤗 HuggingFace](https://huggingface.co/ghost-x/ghost-8b-beta-1608-gguf) |
74
+ | AWQ | [🤗 HuggingFace](https://huggingface.co/ghost-x/ghost-8b-beta-1608-awq) |
75
+
76
+ ### License
77
+
78
+ The Ghost 8B Beta model is released under the [Ghost Open LLMs LICENSE](https://ghost-x.org/ghost-open-llms-license), [Llama 3 LICENSE](https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE).
79
+
80
+ ### Techniques
81
+
82
+ The model is further trained based on [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B).
83
+
84
+ **Ghost 8B Beta** has undergone continual pre-training and fine-tuning with a recipe given the playful name "Teach the little boy how to cook Saigon Pho".
85
+
86
+ The recipe organizes model training into 3 main stages:
87
+
88
+ - In stages 1 and 2, a method we call the "multilingual buffer" is used for fine-tuning. It gives the model efficient language understanding and knowledge sharing at low training cost and without requiring a large amount of sample data.
89
+ - In stage 3, the model is refined based on human feedback.
90
+
91
+ Note that with this recipe we are able to reproduce the model exactly: all training source code (including forks of and patches to libraries) is archived and reproducible.
92
+
93
+ ## Templates / Concepts
94
+
95
+ The Ghost 8B Beta model ships with a built-in "chat template" when used via "transformers", so everything is quite simple. Here we describe which roles the model supports and how to use them to get the best quality of generated text.
96
+
97
+ The model supports the following roles:
98
+
99
+ - "system": This is the system role that sets information about the model or instructs the model to follow commands. It should be placed at the top and set empty if not in use.
100
+ - "user": This is the role that represents the user's chat content.
101
+ - "assistant": This is the role that represents the model's chat content.
102
+ - "refs": This is the role that represents trusted reference data for the model (RAG usage). It must be placed after "system" and after "tool:schemas" if tools are used.
103
+ - "tool:schemas": This is a role to enter information about the tools that the model is allowed to use during work. It must be placed after "system" and can be omitted when not in use.
104
+ - "tool:execute": This is the role used to pass back the tool execution action produced by the model.
105
+ - "tool:results": This is the role used to provide the results obtained after executing the tool.
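+
+ To illustrate the ordering constraints above, here is a hypothetical `messages` list combining these roles (a sketch only; the tool schema string and reference text are placeholders):
+
+ ```python
+ messages = [
+     {"role": "system", "content": ""},            # always first; left empty if unused
+     {"role": "tool:schemas", "content": "..."},   # JSON string describing the available tools
+     {"role": "refs", "content": "..."},           # trusted reference data, after "system" and "tool:schemas"
+     {"role": "user", "content": "What is the status of flight AZ123?"},
+ ]
+ ```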
106
+
107
+ <details close>
108
+ <summary>See chat configuration template</summary>
109
+ ~
110
+
111
+ ```jinja
112
+ {{ bos_token }}{% for message in messages %}
113
+ {% set role = message['role'] %}
114
+ {% if role == 'tools' %}
115
+ {% set role = 'tool:schemas' %}
116
+ {% elif role == 'execute' %}
117
+ {% set role = 'tool:execute' %}
118
+ {% elif role == 'response' %}
119
+ {% set role = 'tool:results' %}
120
+ {% endif %}
121
+ {% set content = message['content'] | trim + '<|cos|>' %}
122
+ {% if role == 'system' %}
123
+ {% set content = '<|role:begin|>system<|role:end|>
124
+ ' + content %}
125
+ {% elif role == 'user' %}
126
+ {% set content = '<|role:begin|>user<|role:end|>
127
+ ' + content %}
128
+ {% elif role == 'assistant' %}
129
+ {% set content = '<|role:begin|>assistant<|role:end|>
130
+ ' + content %}
131
+ {% elif role == 'refs' %}
132
+ {% set content = '<|role:begin|>references<|role:end|>
133
+ ' + content %}
134
+ {% elif role == 'tool:schemas' %}
135
+ {% set content = '<|role:begin|>tools<|role:end|>
136
+ ' + content %}
137
+ {% elif role == 'tool:execute' %}
138
+ {% set content = '<|role:begin|>assistant<|role:end|>
139
+ <|tool:execute|>' + content %}
140
+ {% elif role == 'tool:results' %}
141
+ {% set content = '<|role:begin|>user<|role:end|>
142
+ <|tool:results|>' + content %}
143
+ {% endif %}
144
+ {{ content }}
145
+ {% if loop.last and add_generation_prompt %}
146
+ {{ '<|role:begin|>assistant<|role:end|>' }}
147
+ {% endif %}
148
+ {% endfor %}
149
+ ```
150
+
151
+ </details>
152
+
153
+ To understand better, let's see [how to use](#usage) details.
154
+
155
+ ## Usage
156
+
157
+ This content will be updated soon.
158
+
159
+ ### Directly
160
+
161
+ There are several ways to use the model directly; choose one of the following to get started.
162
+
163
+ #### Transformers
164
+
165
+ For direct use with `transformers`, you can easily get started with the following steps.
166
+
167
+ - Firstly, you need to install transformers via the command below with `pip`.
168
+
169
+ ```bash
170
+ pip install -U transformers
171
+ ```
172
+
173
+ - Right now, you can start using the model directly.
174
+
175
+ ```python
176
+ import torch
177
+ from transformers import (
178
+ AutoModelForCausalLM,
179
+ AutoTokenizer,
180
+ )
181
+
182
+ base_model = "ghost-x/ghost-8b-beta"
183
+ model = AutoModelForCausalLM.from_pretrained(
184
+ base_model,
185
+ torch_dtype=torch.bfloat16,
186
+ device_map="auto",
187
+ )
188
+ tokenizer = AutoTokenizer.from_pretrained(base_model)
189
+
190
+ messages = [
191
+ {"role": "system", "content": ""},
192
+ {"role": "user", "content": "Why is the sky blue ?"},
193
+ ]
194
+ prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
195
+ inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False)
196
+ for k,v in inputs.items():
197
+ inputs[k] = v.cuda()
198
+ outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, top_k=50, top_p=0.95, temperature=0.4)
199
+ results = tokenizer.batch_decode(outputs)[0]
200
+ print(results)
201
+ ```
202
+
203
+ - Additionally, you can use the model with **4-bit quantization** to reduce the required resources. You can start with the code below.
204
+
205
+ ```python
206
+ import torch
207
+ from transformers import (
208
+ AutoModelForCausalLM,
209
+ AutoTokenizer,
210
+ BitsAndBytesConfig,
211
+ )
212
+
213
+ base_model = "ghost-x/ghost-8b-beta"
214
+ bnb_config = BitsAndBytesConfig(
215
+ load_in_4bit=True,
216
+ bnb_4bit_quant_type="nf4",
217
+ bnb_4bit_compute_dtype=torch.bfloat16,
218
+ bnb_4bit_use_double_quant=False,
219
+ )
220
+ model = AutoModelForCausalLM.from_pretrained(
221
+ base_model,
222
+ quantization_config=bnb_config,
223
+ device_map="auto",
224
+ )
225
+ tokenizer = AutoTokenizer.from_pretrained(base_model)
226
+
227
+ messages = [
228
+ {"role": "system", "content": ""},
229
+ {"role": "user", "content": "Why is the sky blue ?"},
230
+ ]
231
+ prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
232
+ inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False)
233
+ for k,v in inputs.items():
234
+ inputs[k] = v.cuda()
235
+ outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, top_k=50, top_p=0.95, temperature=0.4)
236
+ results = tokenizer.batch_decode(outputs)[0]
237
+ print(results)
238
+
239
+ ```
240
+
241
+ #### Unsloth
242
+
243
+ For direct use with `unsloth`, you can easily get started with the following steps.
244
+
245
+ - Firstly, you need to install unsloth via the command below with `pip`.
246
+
247
+ ```python
248
+ import torch
249
+ major_version, minor_version = torch.cuda.get_device_capability()
250
+ # Must install separately since Colab has torch 2.2.1, which breaks packages
251
+ !pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
252
+ if major_version >= 8:
253
+ # Use this for new GPUs like Ampere, Hopper GPUs (RTX 30xx, RTX 40xx, A100, H100, L40)
254
+ !pip install --no-deps packaging ninja einops flash-attn xformers trl peft accelerate bitsandbytes
255
+ else:
256
+ # Use this for older GPUs (V100, Tesla T4, RTX 20xx)
257
+ !pip install --no-deps xformers trl peft accelerate bitsandbytes
258
+ pass
259
+ ```
260
+
261
+ - Initialize and optimize the model before use.
262
+
263
+ ```python
264
+ from unsloth import FastLanguageModel
265
+ import os, torch, pprint
266
+
267
+ base_model = "ghost-x/ghost-8b-beta"
268
+ model, tokenizer = FastLanguageModel.from_pretrained(
269
+ model_name = base_model,
270
+ max_seq_length = 8192,
271
+ dtype = None,
272
+ load_in_4bit = False, # Change to `True` if you want to use 4bit quantization.
273
+ )
274
+ FastLanguageModel.for_inference(model)
275
+ ```
276
+
277
+ - Right now, you can start using the model directly.
278
+ ```python
279
+ messages = [
280
+ {"role": "system", "content": ""},
281
+ {"role": "user", "content": "Why is the sky blue ?"},
282
+ ]
283
+ prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
284
+ inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False)
285
+ for k,v in inputs.items():
286
+ inputs[k] = v.cuda()
287
+ outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, top_k=50, top_p=0.95, temperature=0.4)
288
+ results = tokenizer.batch_decode(outputs)[0]
289
+ print(results)
290
+ ```
291
+
292
+ ### Instructions
293
+
294
+ Here are specific instructions and explanations for each use case.
295
+
296
+ #### Dialogue
297
+
298
+ This is an example of a multi-turn conversation between the assistant and the user.
299
+
300
+ Let's start with a "messages" list containing the system message and the user's question, as follows:
301
+
302
+ ```python
303
+ messages = [
304
+ {"role": "system", "content": ""},
305
+ {"role": "user", "content": "What is the significance of the Higgs boson in the Standard Model of particle physics?"},
306
+ ]
307
+ ```
308
+
309
+ Then append the assistant's generated answer together with the user's next question:
310
+
311
+ ```python
312
+ messages = [
313
+ {"role": "system", "content": ""},
314
+ {"role": "user", "content": "What is the significance of the Higgs boson in the Standard Model of particle physics?"},
315
+ {"role": "assistant", "content": "..."},
316
+ {"role": "user", "content": "Identify the author of a novel that features a dystopian society where \"Big Brother\" watches over its citizens and the protagonist works for the Ministry of Truth."},
317
+ ]
318
+ ```
319
+
320
+ Done, that's how to create ongoing conversations with chat templates.
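+
+ As a convenience, the helper below is a minimal sketch (assuming `model` and `tokenizer` are already loaded as in the Transformers example above) that generates a reply and appends it back into `messages`, so the conversation can simply continue turn by turn:
+
+ ```python
+ def chat(messages, max_new_tokens=512):
+     # Render the conversation with the built-in chat template and generate a reply.
+     prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
+     inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)
+     outputs = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=True, top_k=50, top_p=0.95, temperature=0.4)
+     # Keep only the newly generated tokens and store the reply as an assistant turn.
+     reply = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
+     messages.append({"role": "assistant", "content": reply})
+     return reply
+
+ print(chat(messages))  # answer to the latest user question
+ messages.append({"role": "user", "content": "..."})  # add the next user question here
+ print(chat(messages))  # answer to the follow-up
+ ```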
321
+
322
+ #### Retrieval-Augmented Generation
323
+
324
+ For "Retrieval-Augmented Generation", the best way to add reference information without changing the system prompt is to use the "refs" role; see the following example.
325
+
326
+ ```python
327
+ import json
+
+ refs = json.dumps(
328
+ {
329
+ "instructions": "These are only general documents used for reference to give the most accurate and honest answers possible. Ignore it if it's irrelevant and don't overuse it.",
330
+ "documents": [
331
+ # The content of the reference documents is here.
332
+ # The model will only rely on reference content to answer if relevant.
333
+ "Paracetamol poisoning, also known as acetaminophen poisoning, is caused by excessive use of the medication paracetamol (acetaminophen). Most people have few or non-specific symptoms in the first 24 hours following overdose. These include feeling tired, abdominal pain, or nausea. This is typically followed by a couple of days without any symptoms, after which yellowish skin, blood clotting problems, and confusion occurs as a result of liver failure. Additional complications may include kidney failure, pancreatitis, low blood sugar, and lactic acidosis. If death does not occur, people tend to recover fully over a couple of weeks. Without treatment, death from toxicity occurs 4 to 18 days later.Paracetamol poisoning can occur accidentally or as an attempt to die by suicide. Risk factors for toxicity include alcoholism, malnutrition, and the taking of certain other hepatotoxic medications. Liver damage results not from paracetamol itself, but from one of its metabolites, N-acetyl-p-benzoquinone imine (NAPQI). NAPQI decreases the livers glutathione and directly damages cells in the liver. Diagnosis is based on the blood level of paracetamol at specific times after the medication was taken. These values are often plotted on the Rumack-Matthew nomogram to determine level of concern.Treatment may include activated charcoal if the person seeks medical help soon after the overdose. Attempting to force the person to vomit is not recommended. If there is a potential for toxicity, the antidote acetylcysteine is recommended. The medication is generally given for at least 24 hours. Psychiatric care may be required following recovery. A liver transplant may be required if damage to the liver becomes severe. The need for transplant is often based on low blood pH, high blood lactate, poor blood clotting, or significant hepatic encephalopathy. With early treatment liver failure is rare. Death occurs in about 0.1% of cases.Paracetamol poisoning was first described in the 1960s. Rates of poisoning vary significantly between regions of the world. In the United States more than 100,000 cases occur a year. In the United Kingdom it is the medication responsible for the greatest number of overdoses. Young children are most commonly affected. In the United States and the United Kingdom, paracetamol is the most common cause of acute liver failure. Signs and symptoms The signs and symptoms of paracetamol toxicity occur in three phases. The first phase begins within hours of overdose, and consists of nausea, vomiting, a pale appearance, and sweating. However, patients often have no specific symptoms or only mild symptoms in the first 24 hours of poisoning. Rarely, after massive overdoses, patients may develop symptoms of metabolic acidosis and coma early in the course of poisoning.The second phase occurs between 24 hours and 72 hours following overdose and consists of signs of increasing liver damage. In general, damage occurs in liver cells as they metabolize the paracetamol. The individual may experience right upper quadrant abdominal pain. The increasing liver damage also changes biochemical markers of liver function; International normalized ratio (INR) and the liver transaminases ALT and AST rise to abnormal levels. Acute kidney failure may also occur during this phase, typically caused by either hepatorenal syndrome or multiple organ dysfunction syndrome. In some cases, acute kidney failure may be the primary clinical manifestation of toxicity. 
In these cases, it has been suggested that the toxic metabolite is produced more in the kidneys than in the liver.The third phase follows at 3 to 5 days, and is marked by complications of massive liver necrosis leading to fulminant liver failure with complications of coagulation defects, low blood sugar, kidney failure, hepatic encephalopathy, brain swelling, sepsis, multiple organ failure, and death. If the third phase is survived, the liver necrosis runs its course, and liver and kidney function typically return to normal in a few weeks. The severity of paracetamol toxicity varies depending on the dose and whether appropriate treatment is received. Cause The toxic dose of paracetamol is highly variable. In general the recommended maximum daily dose for healthy adults is 4 grams. Higher doses lead to increasing risk of toxicity. In adults, single doses above 10 grams or 200 mg/kg of bodyweight, whichever is lower, have a reasonable likelihood of causing toxicity. Toxicity can also occur when multiple smaller doses within 24 hours exceed these levels. Following a dose of 1 gram of paracetamol four times a day for two weeks, patients can expect an increase in alanine transaminase in their liver to typically about three times the normal value. It is unlikely that this dose would lead to liver failure. Studies have shown significant hepatotoxicity is uncommon in patients who have taken greater than normal doses over 3 to 4 days. In adults, a dose of 6 grams a day over the preceding 48 hours could potentially lead to toxicity, while in children acute doses above 200 mg/kg could potentially cause toxicity. Acute paracetamol overdose in children rarely causes illness or death, and it is very uncommon for children to have levels that require treatment, with chronic larger-than-normal doses being the major cause of toxicity in children.Intentional overdosing (self-poisoning, with suicidal intent) is frequently implicated in paracetamol toxicity. In a 2006 review, paracetamol was the most frequently ingested compound in intentional overdosing.In rare individuals, paracetamol toxicity can result from normal use. This may be due to individual ("idiosyncratic") differences in the expression and activity of certain enzymes in one of the metabolic pathways that handle paracetamol (see paracetamols metabolism). Risk factors A number of factors can potentially increase the risk of developing paracetamol toxicity. Chronic excessive alcohol consumption can induce CYP2E1, thus increasing the potential toxicity of paracetamol. In one study of patients with liver injury, 64% reported alcohol intakes of greater than 80 grams a day, while 35% took 60 grams a day or less. Whether chronic alcoholism should be considered a risk factor has been debated by some clinical toxicologists. For chronic alcohol users, acute alcohol ingestion at the time of a paracetamol overdose may have a protective effect. For non-chronic alcohol users, acute alcohol consumption had no protective effect. Fasting is a risk factor, possibly because of depletion of liver glutathione reserves. The concomitant use of the CYP2E1 inducer isoniazid increases the risk of hepatotoxicity, though whether 2E1 induction is related to the hepatotoxicity in this case is unclear. Concomitant use of other drugs that induce CYP enzymes, such as antiepileptics including carbamazepine, phenytoin, and barbiturates, have also been reported as risk factors. Pathophysiology When taken in normal therapeutic doses, paracetamol has been shown to be safe. 
Following a therapeutic dose, it is mostly converted to nontoxic metabolites via Phase II metabolism by conjugation with sulfate and glucuronide, with a small portion being oxidized via the cytochrome P450 enzyme system. Cytochromes P450 2E1 and 3A4 convert approximately 5% of paracetamol to a highly reactive intermediary metabolite, N-acetyl-p-benzoquinone imine (NAPQI). Under normal conditions, NAPQI is detoxified by conjugation with glutathione to form cysteine and mercapturic acid conjugates.In cases of paracetamol overdose, the sulfate and glucuronide pathways become saturated, and more paracetamol is shunted to the cytochrome P450 system to produce NAPQI. As a result, hepatocellular supplies of glutathione become depleted, as the demand for glutathione is higher than its regeneration. NAPQI therefore remains in its toxic form in the liver and reacts with cellular membrane molecules, resulting in widespread hepatocyte damage and death, leading to acute liver necrosis. In animal studies, the livers stores of glutathione must be depleted to less than 70% of normal levels before liver toxicity occurs. Diagnosis A persons history of taking paracetamol is somewhat accurate for the diagnosis. The most effective way to diagnose poisoning is by obtaining a blood paracetamol level. A drug nomogram developed in 1975, called the Rumack-Matthew nomogram, estimates the risk of toxicity based on the serum concentration of paracetamol at a given number of hours after ingestion. To determine the risk of potential hepatotoxicity, the paracetamol level is traced along the nomogram. Use of a timed serum paracetamol level plotted on the nomogram appears to be the best marker indicating the potential for liver injury. A paracetamol level drawn in the first four hours after ingestion may underestimate the amount in the system because paracetamol may still be in the process of being absorbed from the gastrointestinal tract. Therefore, a serum level taken before 4 hours is not recommended.Clinical or biochemical evidence of liver toxicity may develop in one to four days, although, in severe cases, it may be evident in 12 hours. Right-upper-quadrant tenderness may be present and can aid in diagnosis. Laboratory studies may show evidence of liver necrosis with elevated AST, ALT, bilirubin, and prolonged coagulation times, particularly an elevated prothrombin time. After paracetamol overdose, when AST and ALT exceed 1000 IU/L, paracetamol-induced hepatotoxicity can be diagnosed. In some cases, the AST and ALT levels can exceed 10,000 IU/L. Detection in body fluids Paracetamol may be quantified in blood, plasma, or urine as a diagnostic tool in clinical poisoning situations or to aid in the medicolegal investigation of suspicious deaths. The concentration in serum after a typical dose of paracetamol usually peaks below 30 mg/L, which equals 200 μmol/L. Levels of 30–300 mg/L (200–2000 μmol/L) are often observed in overdose patients. Postmortem blood levels have ranged from 50 to 400 mg/L in persons dying due to acute overdosage. Automated colorimetric techniques, gas chromatography and liquid chromatography are currently in use for the laboratory analysis of the drug in physiological specimens. Prevention Limitation of availability Limiting the availability of paracetamol tablets has been attempted in some countries. In the UK, sales of over-the-counter paracetamol are restricted to packs of 32 x 500 mg tablets in pharmacies, and 16 x 500 mg tablets in non-pharmacy outlets. 
Pharmacists may provide up to 100 tablets for those with chronic conditions at the pharmacists discretion. In Ireland, the limits are 24 and 12 tablets, respectively. Subsequent study suggests that the reduced availability in large numbers had a significant effect in reducing poisoning deaths from paracetamol overdose.One suggested method of prevention is to make paracetamol a prescription-only medicine, or to remove it entirely from the market. However, overdose is a relatively minor problem; for example, 0.08% of the UK population (over 50 thousand people) present with paracetamol overdose each year. In contrast, paracetamol is a safe and effective medication that is taken without complications by millions of people. In addition, alternative pain relief medications such as aspirin are more toxic in overdose, whereas non-steroidal anti-inflammatory drugs are associated with more adverse effects following normal use. Combination with other agents One strategy for reducing harm done by acetaminophen overdoses is selling paracetamol pre-combined in tablets either with an emetic or an antidote. Paradote was a tablet sold in the UK which combined 500 mg paracetamol with 100 mg methionine, an amino acid formerly used in the treatment of paracetamol overdose. There have been no studies so far on the effectiveness of paracetamol when given in combination with its most commonly used antidote, acetylcysteine.Calcitriol, the active metabolite of vitamin D3, appears to be a catalyst for glutathione production. Calcitriol was found to increase glutathione levels in rat astrocyte primary cultures on average by 42%, increasing glutathione protein concentrations from 29 nmol/mg to 41 nmol/mg, 24 and 48 hours after administration; it continued to have an influence on glutathione levels 96 hours after administration. It has been proposed that co-administration of calcitriol, via injection, may improve treatment outcomes. Paracetamol replacements Paracetamol ester prodrug with L-pyroglutamic acid (PCA), a biosynthetic precursor of glutathione, has been synthesized to reduce paracetamol hepatotoxicity and improve bioavailability. The toxicological studies of different paracetamol esters show that L-5-oxo-pyrrolidine-2-paracetamol carboxylate reduces toxicity after administration of an overdose of paracetamol to mice. The liver glutathione values in mice induced by intraperitoneal injection of the ester are superimposable with the GSH levels recorded in untreated mice control group. The mice group treated with an equivalent dose of paracetamol showed a significative decrease of glutathione of 35% (p<0.01 vs untreated control group). The oral LD50 was found to be greater than 2000 mg kg-1, whereas the intraperitoneal LD50 was 1900 mg kg-1. These results taken together with the good hydrolysis and bioavailability data show that this ester is a potential candidate as a prodrug of paracetamol. Treatment Gastric decontamination In adults, the initial treatment for paracetamol overdose is gastrointestinal decontamination. Paracetamol absorption from the gastrointestinal tract is complete within two hours under normal circumstances, so decontamination is most helpful if performed within this timeframe. Gastric lavage, better known as stomach pumping, may be considered if the amount ingested is potentially life-threatening and the procedure can be performed within 60 minutes of ingestion. 
Activated charcoal is the most common gastrointestinal decontamination procedure as it adsorbs paracetamol, reducing its gastrointestinal absorption. Administering activated charcoal also poses less risk of aspiration than gastric lavage.It appears that the most benefit from activated charcoal is gained if it is given within 30 minutes to two hours of ingestion. Administering activated charcoal later than 2 hours can be considered in patients that may have delayed gastric emptying due to co-ingested drugs or following ingestion of sustained- or delayed-release paracetamol preparations. Activated charcoal should also be administered if co-ingested drugs warrant decontamination. There was reluctance to give activated charcoal in paracetamol overdose, because of the concern that it may also absorb the oral antidote acetylcysteine. Studies have shown that 39% less acetylcysteine is absorbed into the body when they are administered together. There are conflicting recommendations regarding whether to change the dosing of oral acetylcysteine after the administration of activated charcoal, and even whether the dosing of acetylcysteine needs to be altered at all. Intravenous acetylcysteine has no interaction with activated charcoal. Inducing vomiting with syrup of ipecac has no role in paracetamol overdose because the vomiting it induces delays the effective administration of activated charcoal and oral acetylcysteine. Liver injury is extremely rare after acute accidental ingestion in children under 6 years of age. Children with accidental exposures do not require gastrointestinal decontamination with either gastric lavage, activated charcoal, or syrup of ipecac. Acetylcysteine Acetylcysteine, also called N-acetylcysteine or NAC, works to reduce paracetamol toxicity by replenishing body stores of the antioxidant glutathione. Glutathione reacts with the toxic NAPQI metabolite so that it does not damage cells and can be safely excreted. NAC was usually given following a treatment nomogram (one for patients with risk factors, and one for those without) but the use of the nomogram is no longer recommended as the evidence base to support the use of risk factors was poor and inconsistent and many of the risk factors are imprecise and difficult to determine with sufficient certainty in clinical practice. Cysteamine and methionine have also been used to prevent hepatotoxicity, although studies show that both are associated with more adverse effects than acetylcysteine. Additionally, acetylcysteine has been shown to be a more effective antidote, particularly in patients presenting greater than 8 hours post-ingestion and for those who present with liver failure symptoms.If the person presents less than eight hours after paracetamol overdose, then acetylcysteine significantly reduces the risk of serious hepatotoxicity and guarantees survival. If acetylcysteine is started more than 8 hours after ingestion, there is a sharp decline in its effectiveness because the cascade of toxic events in the liver has already begun, and the risk of acute liver necrosis and death increases dramatically. Although acetylcysteine is most effective if given early, it still has beneficial effects if given as late as 48 hours after ingestion. If the person presents more than eight hours after the paracetamol overdose, then activated charcoal is not useful, and acetylcysteine is started immediately. 
In earlier presentations, charcoal can be given when the patient arrives and acetylcysteine is initiated while waiting for the paracetamol level results to return from the laboratory.In United States practice, intravenous (IV) and oral administration are considered to be equally effective and safe if given within 8 hours of ingestion. However, IV is the only recommended route in Australasian and British practice. Oral acetylcysteine is given as a 140 mg/kg loading dose followed by 70 mg/kg every four hours for 17 more doses, and if the patient vomits within 1 hour of dose, the dose must be repeated. Oral acetylcysteine may be poorly tolerated due to its unpleasant taste, odor, and its tendency to cause nausea and vomiting. If repeated doses of charcoal are indicated because of another ingested drug, then subsequent doses of charcoal and acetylcysteine should be staggered.Intravenous acetylcysteine is given as a continuous infusion over 20 hours for a total dose 300 mg/kg. Recommended administration involves infusion of a 150 mg/kg loading dose over 15 to 60 minutes, followed by a 50 mg/kg infusion over four hours; the last 100 mg/kg are infused over the remaining 16 hours of the protocol. Intravenous acetylcysteine has the advantage of shortening hospital stay, increasing both doctor and patient convenience, and allowing administration of activated charcoal to reduce absorption of both the paracetamol and any co-ingested drugs without concerns about interference with oral acetylcysteine. Intravenous dosing varies with weight, specifically in children. For patients less than 20 kg, the loading dose is 150 mg/kg in 3 mL/kg diluent, administered over 60 minutes; the second dose is 50 mg/kg in 7 mL/kg diluent over 4 hours; and the third and final dose is 100 mg/kg in 14 mL/kg diluent over 16 hours.The most common adverse effect to acetylcysteine treatment is an anaphylactoid reaction, usually manifested by rash, wheeze, or mild hypotension. May cause infertility or death. Adverse reactions are more common in people treated with IV acetylcysteine, occurring in up to 20% of patients. Anaphylactoid reactions are more likely to occur with the first infusion (the loading dose). Rarely, severe life-threatening reactions may occur in predisposed individuals, such as patients with asthma or atopic dermatitis, and may be characterized by respiratory distress, facial swelling, and even death.If an anaphylactoid reaction occurs the acetylcysteine is temporarily halted or slowed and antihistamines and other supportive care is administered. For example, a nebulised beta-agonist like salbutamol may be indicated in the event of significant bronchospasm (or prophylactically in patients with a history of bronchospasm secondary to acetylcysteine). It is also important to closely monitor fluids and electrolytes. Liver transplant In people who develop acute liver failure or who are otherwise expected to die from liver failure, the mainstay of management is liver transplantation. Liver transplants are performed in specialist centers. The most commonly used criteria for liver transplant were developed by physicians at Kings College Hospital in London. Patients are recommended for transplant if they have an arterial blood pH less than 7.3 after fluid resuscitation or if a patient has Grade III or IV encephalopathy, a prothrombin time greater than 100 seconds, and a serum creatinine greater than 300 mmol/L In a 24-hour period. Other forms of liver support have been used including partial liver transplants. 
These techniques have the advantage of supporting the patient while their own liver regenerates. Once liver function returns immunosuppressive drugs are commenced and they have to take immunosuppressive medication for the rest of their lives. Prognosis The mortality rate from paracetamol overdose increases two days after the ingestion, reaches a maximum on day four, and then gradually decreases. Acidosis is the most important single indicator of probable mortality and the need for transplantation. A mortality rate of 95% without transplant was reported in patients who had a documented pH less than 7.30. Other indicators of poor prognosis include chronic kidney disease (stage 3 or worse), hepatic encephalopathy, a markedly elevated prothrombin time, or an elevated blood lactic acid level (lactic acidosis). One study has shown that a factor V level less than 10% of normal indicated a poor prognosis (91% mortality), whereas a ratio of factor VIII to factor V of less than 30 indicated a good prognosis (100% survival). Patients with a poor prognosis are usually identified for likely liver transplantation. Patients that do not die are expected to fully recover and have a normal life expectancy and quality of life. Epidemiology Many over-the-counter and prescription-only medications contain paracetamol. Because of its wide availability paired with comparably high toxicity, (compared to ibuprofen and aspirin) there is a much higher potential for overdose. Paracetamol toxicity is one of the most common causes of poisoning worldwide. In the United States, the United Kingdom, Australia, and New Zealand, paracetamol is the most common cause of drug overdoses. Additionally, in both the United States and the United Kingdom it is the most common cause of acute liver failure.In England and Wales an estimated 41,200 cases of paracetamol poisoning occurred in 1989 to 1990, with a mortality of 0.40%. It is estimated that 150 to 200 deaths and 15 to 20 liver transplants occur as a result of poisoning each year in England and Wales. Paracetamol overdose results in more calls to poison control centers in the US than overdose of any other pharmacological substance, accounting for more than 100,000 calls, as well as 56,000 emergency room visits, 2,600 hospitalizations, and 458 deaths due to acute liver failure per year. A study of cases of acute liver failure between November 2000 and October 2004 by the Centers for Disease Control and Prevention in the USA found that paracetamol was the cause of 41% of all cases in adults, and 25% of cases in children. References External links Gerth, Jeff; T. Christian Miller (September 20, 2013). \"Use Only as Directed\". ProPublica. Retrieved October 12, 2013."
334
+ ],
335
+ },
336
+ indent=2,
337
+ ensure_ascii=False,
338
+ )
339
+
340
+ messages = [
341
+ {"role": "system", "content": ""},
342
+ {"role": "refs", "content": refs},
343
+ {"role": "user", "content": "What are the signs of Paracetamol poisoning?"},
344
+ ]
345
+ ```
346
+
347
+ #### Tools
348
+
349
+ To use function tools, first declare the schemas of the tools to Ghost 8B Beta through the role "tool:schemas". Let's look at the example to make it easier to understand.
350
+
351
+ ```python
352
+ import json
+
+ tools = json.dumps(
353
+ [
354
+ {
355
+ "type": "function",
356
+ "function": {
357
+ "name": "get_flight_status",
358
+ "description": "Get the status of a specific flight",
359
+ "parameters": {
360
+ "type": "object",
361
+ "properties": {
362
+ "flight_number": {
363
+ "type": "string",
364
+ "description": "The flight number"
365
+ }
366
+ }
367
+ }
368
+ }
369
+ }
370
+ ],
371
+ indent=2,
372
+ ensure_ascii=False,
373
+ )
374
+
375
+ messages = [
376
+ {"role": "system", "content": ""},
377
+ {"role": "tool:schemas", "content": tools},
378
+ {"role": "user", "content": "What is the status of flight AZ123?"},
379
+ ]
380
+ ```
381
+
382
+ In the above example, the model will return the execution information for the tool to use, as an object with a structure similar to the content of the "tool:execute" role in the next example. Extract and parse it, call the tool to obtain the results, and pass them back through the "tool:results" role. Note that "tool:execute" must be passed again. The model's answer is then generated from those results.
383
+
384
+ ```python
385
+ messages = [
386
+ {"role": "system", "content": ""},
387
+ {"role": "tool:schemas", "content": tools},
388
+ {"role": "user", "content": "What is the status of flight AZ123?"},
389
+ {
390
+ "role": "tool:execute",
391
+ "content": json.dumps({
392
+ "type": "function",
393
+ "name": "get_flight_status",
394
+ "arguments": { "flight_number": "AZ123" }
395
+ }, indent=2, ensure_ascii=False)
396
+ },
397
+ {
398
+ "role": "tool:results",
399
+ "content": json.dumps({
400
+ "type": "function",
401
+ "name": "get_flight_status",
402
+ "results": {
403
+ "flight_number": "AZ123",
404
+ "departure_airport": "FCO",
405
+ "arrival_airport": "JFK",
406
+ "departure_time": "10:00 AM",
407
+ "arrival_time": "1:00 PM",
408
+ "status": "On time",
409
+ "planned_route": [
410
+ { "latitude": 41.8, "longitude": 12.25 },
411
+ { "latitude": 48.8582, "longitude": 2.2945 },
412
+ { "latitude": 40.6413, "longitude": -73.7781 }
413
+ ]
414
+ }
415
+ }, indent=2, ensure_ascii=False)
416
+ },
417
+ ]
418
+ ```
419
+
420
+ Done, this example simulates how to use function tools. Note that Ghost 8B Beta supports calling multiple tools at once.
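+
+ For completeness, here is a minimal sketch of the execute/results round trip described above. The `model_output` string and the `get_flight_status` implementation are placeholders; adapt the parsing and the tool call to your own generation code and tools:
+
+ ```python
+ import json
+
+ # 1. Generate with the "tool:schemas" messages above; the model's output is expected to be
+ #    a JSON object describing the tool call, e.g. {"type": "function", "name": ..., "arguments": {...}}.
+ call = json.loads(model_output)
+
+ # 2. Execute the matching tool in your own code (placeholder registry and implementation).
+ available_tools = {"get_flight_status": get_flight_status}
+ results = available_tools[call["name"]](**call["arguments"])
+
+ # 3. Append both the executed call and its results, then generate again to get the final answer.
+ messages.append({"role": "tool:execute", "content": json.dumps(call, indent=2, ensure_ascii=False)})
+ messages.append({
+     "role": "tool:results",
+     "content": json.dumps({"type": "function", "name": call["name"], "results": results}, indent=2, ensure_ascii=False),
+ })
+ ```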
421
+
422
+ ### Deployments
423
+
424
+ For deployment, we recommend using vLLM. You can enable the long-context capabilities by following these steps:
425
+
426
+ - You can install vLLM by running the following command.
427
+
428
+ ```bash
429
+ pip install "vllm>=0.4.3"
430
+ ```
431
+
432
+ - Utilize vLLM to deploy your model. For instance, you can set up an OpenAI-compatible server using the command:
433
+
434
+ ```bash
435
+ python -m vllm.entrypoints.openai.api_server --served-model-name ghost-8b-beta --model ghost-x/ghost-8b-beta
436
+ ```
437
+
438
+ - Try it now:
439
+
440
+ ```bash
441
+ curl http://localhost:8000/v1/chat/completions \
442
+ -H "Content-Type: application/json" \
443
+ -d '{
444
+ "model": "ghost-8b-beta",
445
+ "messages": [
446
+ {"role": "system", "content": ""},
447
+ {"role": "user", "content": "Why is the sky blue ?"}
448
+ ]
449
+ }'
450
+ ```
451
+
452
+ Done, just enjoy it.
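+
+ Because the server exposes an OpenAI-compatible API, you can also call it from Python with the official `openai` package instead of curl. A minimal sketch, assuming the server started above is running locally (vLLM ignores the API key unless you configure one):
+
+ ```python
+ from openai import OpenAI
+
+ # Point the client at the local vLLM server.
+ client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
+
+ response = client.chat.completions.create(
+     model="ghost-8b-beta",
+     messages=[
+         {"role": "system", "content": ""},
+         {"role": "user", "content": "Why is the sky blue ?"},
+     ],
+ )
+ print(response.choices[0].message.content)
+ ```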
453
+
454
+ ## Evaluation
455
+
456
+ At an overall level, to judge whether a model is effective and of high quality, it must go through rigorous evaluations that are trusted by the community, such as AlpacaEval 2.0, MT-Bench, MMLU-Pro, GSM8K, and so on.
457
+
458
+ We use the [EleutherAI/lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) library for evaluations to perform benchmarks for common tasks, except for specific tasks such as AlpacaEval 2.0 and MT Bench.
459
+
460
+ Once again, Ghost 8B Beta is a language model of fairly modest size whose goals are multilingual capabilities (9+ of the most popular languages) and function tools support in those languages. Furthermore, it has the ambition to be equal or superior to models many times its size.
461
+
462
+ Note: when running evaluations for the model, set "max_new_tokens" higher so the full answer can be produced, because the model explains its reasoning before drawing conclusions.
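+
+ For reference, a run along these lines can be launched with lm-evaluation-harness; the batch size and token budget below are illustrative assumptions, not the exact settings used for the reported numbers:
+
+ ```bash
+ lm_eval --model hf \
+   --model_args pretrained=ghost-x/ghost-8b-beta,dtype=bfloat16 \
+   --tasks gsm8k_cot_zeroshot \
+   --batch_size 8 \
+   --gen_kwargs max_new_tokens=1024
+ ```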
463
+
464
+ ### AlpacaEval 2.0
465
+
466
+ Overview of the evaluation results from AlpacaEval 2.0:
467
+
468
+ - In the length-controlled win rate score (lc_winrate), the Ghost 8B Beta model outperformed strong models such as Llama 3 8B Instruct, GPT 3.5 Turbo (06/13) and GPT 3.5 Turbo (11/06). In addition, its score is quite close to larger models such as Mixtral 8x7B v0.1 and Gemini Pro.
469
+ - In the win rate score (winrate), the Ghost 8B Beta model outperformed Claude 3 Opus (02/29), Claude 3 Sonnet (02/29), GPT-4, GPT-4 (03/14), and Mistral Large (02/24).
470
+
471
+ Of course, the AlpacaEval 2.0 review highlights the "Length-controlled (LC) win rates" score, so the "win rates" here are for reference only.
472
+
473
+ | model | avg length | lc winrate | winrate | std error |
474
+ | ------------------------ | ---------- | ---------- | --------- | --------- |
475
+ | GPT-4 Preview (11/06) | 2049 | 50.00 | 50.00 | 0.00 |
476
+ | Claude 3 Opus (02/29) | 1388 | 40.51 | 29.11 | 1.39 |
477
+ | Claude 3 Sonnet (02/29) | 1420 | 34.87 | 25.56 | 1.34 |
478
+ | Llama 3 70B Instruct | 1919 | 34.42 | 33.18 | 1.39 |
479
+ | Gemini Pro | 1456 | 24.38 | 18.18 | 1.16 |
480
+ | Mixtral 8x7B v0.1 | 1465 | 23.69 | 18.26 | 1.19 |
481
+ | **Ghost 8B Beta (d0x5)** | **2430** | **23.12** | **29.14** | **1.32** |
482
+ | Llama 3 8B Instruct | 1899 | 22.92 | 22.57 | 1.26 |
483
+ | GPT 3.5 Turbo (06/13) | 1331 | 22.35 | 14.09 | 0.00 |
484
+ | GPT 3.5 Turbo (11/06) | 796 | 19.30 | 9.17 | 0.00 |
485
+ | Mistral 7B Instruct v0.2 | 1676 | 17.11 | 14.72 | 1.08 |
486
+
487
+ #### A quick talk about AlpacaEval 2.0
488
+
489
+ [AlpacaEval: An Automatic Evaluator for Instruction-following Language Models](https://github.com/tatsu-lab/alpaca_eval)
490
+
491
+ ![LC AlpacaEval is the most highly correlated benchmark with Chat Arena.](https://github.com/tatsu-lab/alpaca_eval/blob/main/figures/chat_correlations_no_ae.png?raw=true)
492
+
493
+ AlpacaEval 2.0 with length-controlled win rates (paper) has a Spearman correlation of 0.98 with Chatbot Arena while costing less than $10 in OpenAI credits and running in less than 3 minutes. Our goal is to have a benchmark for chat LLMs that is: fast (< 5 min), cheap (< $10), and highly correlated with humans (0.98). Here's a comparison with other benchmarks: LC AlpacaEval is the benchmark most highly correlated with Chat Arena.
494
+
495
+ ```tex
496
+ @article{dubois2024length,
497
+ title={Length-Controlled AlpacaEval: A Simple Way to Debias Automatic Evaluators},
498
+ author={Dubois, Yann and Galambosi, Bal{\'a}zs and Liang, Percy and Hashimoto, Tatsunori B},
499
+ journal={arXiv preprint arXiv:2404.04475},
500
+ year={2024}
501
+ }
502
+ ```
503
+
504
+ ### MT Bench
505
+
506
+ Based on MT Bench average scores, Ghost 8B Beta scores very close to much larger models such as GPT 3.5 Turbo and Claude v1. In addition, it outperforms larger open models such as Vicuna 33B v1.3, WizardLM 30B and Llama 2 70B Chat.
507
+
508
+ **Average Scores**
509
+
510
+ | model | score |
511
+ | ------------------------ | ------------ |
512
+ | GPT 4 | 8.990625 |
513
+ | GPT 3.5 Turbo | 7.943750 |
514
+ | Claude v1 | 7.900000 |
515
+ | **Ghost 8B Beta (d0x5)** | **7.740506** |
516
+ | Vicuna 33B v1.3 | 7.121875 |
517
+ | WizardLM 30B | 7.009375 |
518
+ | Llama 2 70B chat | 6.856250 |
519
+
520
+ **Plots**
521
+
522
+ MT Bench's plots reveal some more specific details. In the STEM, writing, and roleplay categories, the model is at the same level as GPT 4; notably, its STEM score is essentially on par with GPT 4.
523
+
524
+ ![MT Bench, Plots](./images/mt-bench-plots.png)
525
+
526
+ #### A quick talk about MT Bench
527
+
528
+ [MT Bench](https://github.com/lm-sys/FastChat/blob/main/fastchat/llm_judge/README.md) is a set of challenging, multi-turn, and open-ended questions for evaluating chat assistants. It uses LLM-as-a-judge to evaluate model responses.
529
+
530
+ ```tex
531
+ @misc{zheng2023judging,
532
+ title={Judging LLM-as-a-judge with MT-Bench and Chatbot Arena},
533
+ author={Lianmin Zheng and Wei-Lin Chiang and Ying Sheng and Siyuan Zhuang and Zhanghao Wu and Yonghao Zhuang and Zi Lin and Zhuohan Li and Dacheng Li and Eric. P Xing and Hao Zhang and Joseph E. Gonzalez and Ion Stoica},
534
+ year={2023},
535
+ eprint={2306.05685},
536
+ archivePrefix={arXiv},
537
+ primaryClass={cs.CL}
538
+ }
539
+ ```
540
+
541
+ ### GSM8K
542
+
543
+ GSM8K is a widely trusted and accepted evaluation of a language model's mathematical capabilities. In this evaluation, Ghost 8B Beta was run with "gsm8k_cot_zeroshot", meaning it answers questions without any examples in the prompt; the score achieved was approximately 67%, which is very impressive.
544
+
545
+ | tasks | version | filter | n shot | metric | value | std error |
546
+ | ------------------ | ------: | ---------------- | -----: | ----------- | -----: | --------: |
547
+ | gsm8k_cot_zeroshot | 3 | flexible-extract | 0 | exact_match | 0.6641 | 0.0130 |
548
+
549
+ **Leaderboard**
550
+
551
+ The rankings are referenced from this [article](https://klu.ai/glossary/GSM8K-eval).
552
+
553
+ Based on the leaderboard, we can see that Ghost 8B Beta outperforms proprietary models such as xAI Grok 1, OpenAI GPT 3.5, and Mistral Mixtral 8x7B. In addition, Ghost 8B Beta is roughly equal to Mistral Medium. Furthermore, Ghost 8B Beta, Claude 2, and Claude 3 are among the few models evaluated with the zero-shot method.
554
+
555
+ | model | accuracy | methodology |
556
+ | ------------------------ | --------- | ----------------------------- |
557
+ | Claude 3 | 95% | Zero shot |
558
+ | Gemini Ultra | 94.4% | Majority Vote, 32 Generations |
559
+ | GPT 4 | 92% | SFT & 5-shot CoT |
560
+ | Claude 2 | 88% | Zero shot |
561
+ | Gemini Pro | 86.5% | Majority Vote, 32 Generations |
562
+ | Mistral Large | 81% | 5-shot Learning |
563
+ | PaLM 2 | 80% | 5-shot Learning |
564
+ | Mistral Medium | 66.7% | 5-shot Learning |
565
+ | **Ghost 8B Beta (d0x5)** | **66.4%** | **Zero shot** |
566
+ | xAI Grok 1 | 62.9% | 8-shot Learning |
567
+ | Mistral Mixtral 8x7B | 58.4% | 5-shot Learning |
568
+ | GPT 3.5 | 57.1% | 5-shot Learning |
569
+
570
+ <details close>
571
+ <summary>Snapshot of original results</summary>
572
+ ~
573
+
574
+ ```json
575
+ {
576
+ "results": {
577
+ "gsm8k_cot_zeroshot": {
578
+ "alias": "gsm8k_cot_zeroshot",
579
+ "exact_match,strict-match": 0.576194086429113,
580
+ "exact_match_stderr,strict-match": 0.013611632008810354,
581
+ "exact_match,flexible-extract": 0.6641394996209249,
582
+ "exact_match_stderr,flexible-extract": 0.01300922471426737
583
+ }
584
+ },
585
+ "model_source": "hf",
586
+ "model_name": "lamhieu/ghost-8b-beta-disl-0x5",
587
+ "model_name_sanitized": "lamhieu__ghost-8b-beta-disl-0x5",
588
+ "start_time": 1272225.578433881,
589
+ "end_time": 1276832.731746671,
590
+ "total_evaluation_time_seconds": "4607.153312789975",
591
+ ...
592
+ }
593
+ ```
594
+
595
+ </details>
596
+
597
+ #### A quick talk about GSM8K
598
+
599
+ [Training Verifiers to Solve Math Word Problems](https://arxiv.org/abs/2110.14168)
600
+
601
+ ```tex
602
+ @misc{cobbe2021training,
603
+ title={Training Verifiers to Solve Math Word Problems},
604
+ author={Karl Cobbe and Vineet Kosaraju and Mohammad Bavarian and Jacob Hilton and Reiichiro Nakano and Christopher Hesse and John Schulman},
605
+ year={2021},
606
+ eprint={2110.14168},
607
+ archivePrefix={arXiv},
608
+ primaryClass={cs.LG}
609
+ }
610
+ ```
611
+
612
+ ### GPQA
613
+
614
+ Ghost 8B Beta was evaluated with the task "gpqa_diamond_cot_zeroshot", scoring approximately 27%. To make this easier to interpret, we have compiled a leaderboard giving an overview of the model's capabilities.
615
+
616
+ | tasks | version | filter | n shot | metric | value | std error |
617
+ | ------------------------- | ------: | ---------------- | -----: | ----------- | -----: | --------: |
618
+ | gpqa_diamond_cot_zeroshot | 1 | flexible-extract | 0 | exact_match | 0.2676 | 0.0315 |
619
+
620
+ **Leaderboard**
621
+
622
+ The rankings are referenced from this [article](https://www.anthropic.com/news/claude-3-family).
623
+
624
+ | model | accuracy | methodology |
625
+ | ------------------------ | --------- | ----------------- |
626
+ | Claude 3 Opus | 50.4% | Zero-shot CoT |
627
+ | Claude 3 Sonnet          | 40.4%     | Zero-shot CoT     |
628
+ | GPT-4 | 35.7% | Zero-shot CoT |
629
+ | Claude 3 Haiku | 33.3% | Zero-shot CoT |
630
+ | GPT-3.5 | 28.1% | Zero-shot CoT |
631
+ | **Ghost 8B Beta (d0x5)** | **26.7%** | **Zero-shot CoT** |
632
+
633
+ <details close>
634
+ <summary>Snapshot of original results</summary>
635
+ ~
636
+
637
+ ```json
638
+ {
639
+ "results": {
640
+ "gpqa_diamond_cot_zeroshot": {
641
+ "alias": "gpqa_diamond_cot_zeroshot",
642
+ "exact_match,strict-match": 0.005050505050505051,
643
+ "exact_match_stderr,strict-match": 0.005050505050505052,
644
+ "exact_match,flexible-extract": 0.2676767676767677,
645
+ "exact_match_stderr,flexible-extract": 0.03154449888270285
646
+ }
647
+ },
648
+ "model_source": "hf",
649
+ "model_name": "lamhieu/ghost-8b-beta-disl-0x5",
650
+ "model_name_sanitized": "lamhieu__ghost-8b-beta-disl-0x5",
651
+ "start_time": 1290603.02832825,
652
+ "end_time": 1294954.839677157,
653
+ "total_evaluation_time_seconds": "4351.811348907184"
654
+ ...
655
+ }
656
+ ```
657
+
658
+ </details>
659
+
660
+ #### A quick talk about GPQA
661
+
662
+ [GPQA: A Graduate-Level Google-Proof Q&A Benchmark](https://arxiv.org/abs/2311.12022).
663
+
664
+ ```tex
665
+ @misc{rein2023gpqa,
666
+ title={GPQA: A Graduate-Level Google-Proof Q&A Benchmark},
667
+ author={David Rein and Betty Li Hou and Asa Cooper Stickland and Jackson Petty and Richard Yuanzhe Pang and Julien Dirani and Julian Michael and Samuel R. Bowman},
668
+ year={2023},
669
+ eprint={2311.12022},
670
+ archivePrefix={arXiv},
671
+ primaryClass={cs.AI}
672
+ }
673
+ ```
674
+
675
+ ### MMLU Pro
676
+
677
+ Ghost 8B Beta was evaluated with the task "leaderboard_mmlu_pro", scoring about 30%. To make this easier to interpret, we have compiled a leaderboard giving an overview of the model's capabilities.
678
+
679
+ | tasks | version | filter | n shot | metric | value | std error |
680
+ | -------------------- | ------: | ------ | -----: | ------ | -----: | --------: |
681
+ | leaderboard_mmlu_pro | 0.1 | none | 5 | acc | 0.3042 | 0.0042 |
682
+
683
+ **Leaderboard**
684
+
685
+ The rankings are referenced from this [leaderboard](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard).
686
+
687
+ | model | accuracy | methodology |
688
+ | ------------------------ | ---------- | ----------- |
689
+ | Qwen 2 72B Instruct | 48.92% | 5-shot |
690
+ | Llama 3 70B Instruct | 46.74% | 5-shot |
691
+ | Deepseek LLM 67B Chat | 32.72% | 5-shot |
692
+ | **Ghost 8B Beta (d0x5)** | **30.42%** | **5-shot** |
693
+ | DBRX Instruct 132B | 29.81% | 5-shot |
694
+ | Llama 3 8B Instruct | 29.60% | 5-shot |
695
+ | SOLAR 10.7B v1.0 | 26.67% | 5-shot |
696
+ | C4AI Command-R 35B | 26.33% | 5-shot |
697
+ | Aya 23 35B | 26.18% | 5-shot |
698
+ | Llama 2 70B Chat | 15.92% | 5-shot |
699
+
700
+ <details close>
701
+ <summary>Snapshot of original results</summary>
702
+ ~
703
+
704
+ ```json
705
+ {
706
+ "results": {
707
+ "leaderboard_mmlu_pro": {
708
+ "alias": "leaderboard_mmlu_pro",
709
+ "acc,none": 0.30418882978723405,
710
+ "acc_stderr,none": 0.004194367367612373
711
+ }
712
+ },
713
+ "model_source": "hf",
714
+ "model_name": "lamhieu/ghost-8b-beta-disl-0x5",
715
+ "model_name_sanitized": "lamhieu__ghost-8b-beta-disl-0x5",
716
+ "start_time": 1759569.60272917,
717
+ "end_time": 1761073.963532963,
718
+ "total_evaluation_time_seconds": "1504.3608037931845",
719
+ ...
720
+ }
721
+ ```
722
+
723
+ </details>
724
+
725
+ #### A quick talk about MMLU Pro
726
+
727
+ [MMLU-Pro: A More Robust and Challenging Multi-Task Language Understanding Benchmark](https://arxiv.org/abs/2406.01574).
728
+
729
+ ```tex
730
+ @misc{wang2024mmluprorobustchallengingmultitask,
731
+ title={MMLU-Pro: A More Robust and Challenging Multi-Task Language
732
+ Understanding Benchmark},
733
+ author={Yubo Wang and Xueguang Ma and Ge Zhang and Yuansheng Ni and Abhranil Chandra and Shiguang Guo and Weiming Ren and Aaran Arulraj and Xuan He and Ziyan Jiang and Tianle Li and Max Ku and Kai Wang and Alex Zhuang and Rongqi Fan and Xiang Yue and Wenhu Chen},
734
+ year={2024},
735
+ eprint={2406.01574},
736
+ archivePrefix={arXiv},
737
+ primaryClass={cs.CL},
738
+ url={https://arxiv.org/abs/2406.01574},
739
+ }
740
+ ```
741
+
742
+ ### There's more
743
+
744
+ Additional evaluations will be run soon, and the results will be updated here.
745
+
746
+ ## Notes
747
+
748
+ If the model is interesting or helpful to your work, feel free to buy me a beer and we will raise a glass together. It will motivate me to keep improving the model. 🍻
749
+
750
+ <a href="https://www.buymeacoffee.com/lh0x00" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>
751
+
752
+ ### Thanks
753
+
754
+ The project sends sincere thanks to friends, including...
755
+
756
+ #### Meta
757
+
758
+ <p><img src="https://cdn-avatars.huggingface.co/v1/production/uploads/646cf8084eefb026fb8fd8bc/oCTqufkdTkjyGodsx1vo1.png" width="150px" align="center" /></p>
759
+
760
+ [Meta](https://llama.meta.com), for providing a great foundational model.
761
+
762
+ #### Unsloth
763
+
764
+ <p><img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/made with unsloth.png" width="150px" align="center" /></p>
765
+
766
+ [Unsloth](https://unsloth.ai), a great tool that helps us develop products easily and at a lower cost than expected.
767
+
768
+ #### Kaggle
769
+
770
+ <p><img src="https://cdn-uploads.huggingface.co/production/uploads/600ae38cc92b79f54efd4556/dcbpF6YS6RQhqDd6GZZ2v.png" width="150px" align="center" /></p>
771
+
772
+ [Kaggle](https://kaggle.com), for providing free GPUs to the community for research.
773
+
774
+ ### Contact
775
+
776
+ If you would like to cooperate, contribute, consult, or sponsor, please contact us.
777
+
778
+ Follow **Ghost X** to stay updated with the latest information.
779
+
780
+ - Twitter/X via [@ghostx_ai](https://twitter.com/ghostx_ai).
781
+ - HuggingFace via [@ghost-x](https://huggingface.co/ghost-x).
782
+ - Official website [ghost-x.org](https://ghost-x.org/).
783
+ - Email: [lamhieu.vk [at] gmail dot com](mailto:lamhieu.vk@gmail.com) (author)
784
+
785
+ ### Cites
786
+
787
+ If you find our work helpful, feel free to cite us.
788
+
789
+ ```tex
790
+ @misc{ghost-8b-beta,
791
+ author = {{Ghost X, Hieu Lam}},
792
+ title = {Ghost 8B Beta},
793
+ url = {https://ghost-x.org/docs/models/ghost-8b-beta},
794
+ year = {2024}
795
+ }
796
+ ```
797
+
config.json ADDED
@@ -0,0 +1,29 @@
1
+ {
2
+ "_name_or_path": "lamhieu/ghost-8b-beta-1608",
3
+ "architectures": [
4
+ "LlamaForCausalLM"
5
+ ],
6
+ "attention_bias": false,
7
+ "attention_dropout": 0.0,
8
+ "bos_token_id": 128000,
9
+ "eos_token_id": 128009,
10
+ "hidden_act": "silu",
11
+ "hidden_size": 4096,
12
+ "initializer_range": 0.02,
13
+ "intermediate_size": 14336,
14
+ "max_position_embeddings": 8192,
15
+ "mlp_bias": false,
16
+ "model_type": "llama",
17
+ "num_attention_heads": 32,
18
+ "num_hidden_layers": 32,
19
+ "num_key_value_heads": 8,
20
+ "pretraining_tp": 1,
21
+ "rms_norm_eps": 1e-05,
22
+ "rope_scaling": null,
23
+ "rope_theta": 500000.0,
24
+ "tie_word_embeddings": false,
25
+ "torch_dtype": "bfloat16",
26
+ "transformers_version": "4.42.3",
27
+ "use_cache": true,
28
+ "vocab_size": 128256
29
+ }
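
The configuration above describes a standard 32-layer Llama-style decoder stored in bfloat16. A minimal loading sketch follows; the model id is taken from `_name_or_path` and is an assumption, so substitute the id of the repository you actually download from.

```python
# Minimal sketch: load the model described by config.json with Transformers.
# The model id comes from "_name_or_path" above and may need to be replaced
# with this repository's actual id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lamhieu/ghost-8b-beta-1608"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches "torch_dtype": "bfloat16"
    device_map="auto",
)
```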
generation_config.json ADDED
@@ -0,0 +1,13 @@
1
+ {
2
+ "bos_token_id": 128000,
3
+ "do_sample": true,
4
+ "eos_token_id": [
5
+ 128001,
6
+ 128003,
7
+ 128009
8
+ ],
9
+ "max_length": 4096,
10
+ "temperature": 0.4,
11
+ "top_p": 0.95,
12
+ "transformers_version": "4.42.3"
13
+ }
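
These defaults enable sampling with a fairly low temperature (0.4) and nucleus top_p of 0.95. A minimal generation sketch, assuming `model` and `tokenizer` were loaded as in the previous sketch; if the tokenizer ships a chat template, apply it with `tokenizer.apply_chat_template` for chat-style prompts instead of the raw string used here.

```python
# Minimal sketch: generate with the sampling defaults from generation_config.json.
# Assumes `model` and `tokenizer` are loaded as in the previous sketch.
prompt = "Explain, in one short paragraph, why rainbows form."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.4,
    top_p=0.95,
    max_new_tokens=256,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```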
model-00001-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:af715256e3cd6316684db6e07709c0432cbebb16db2d34d92cb34142a7d096b0
3
+ size 1973455376
model-00002-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:ff8545a95de2b083f7b99f109b7a8d588a3a1f3bbe85f566f6fc94d2e1d0faad
3
+ size 1895895336
model-00003-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:2a0ddd362d8371a5052332b72393d3e35483d195587d470a2312210b1af41f34
3
+ size 1979798040
model-00004-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:19991ed279f3b18db98e308f6e063bb5c54769f92689b58e45121825f85fb29d
3
+ size 1946227368
model-00005-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:ac59632011bf01a72fefe79a7bb4f060f91048df0b06451a753725c22d1644ae
3
+ size 1979798064
model-00006-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:537f0c416775d0f55450507c03a6fcfaf7ebbf281fd0352a28525ef12452e55c
3
+ size 1946227368
model-00007-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:759b85546161266476908dc45ed68cb9c3f27c2b16f0fc5e2bc85439d0058c13
3
+ size 1979798064
model-00008-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:49bc4a5933ffcb76efe63d8d4e08a202ba177c03268354f1ade3d7fe6f6308e5
3
+ size 1308683424
model-00009-of-00009.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:2a7d4ebb650ad5dc8b6b8de3bd7b427d5aa976066a1e2e71e76e4b2006ba56fa
3
+ size 1050673280
model.safetensors.index.json ADDED
@@ -0,0 +1,298 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 16060522496
4
+ },
5
+ "weight_map": {
6
+ "lm_head.weight": "model-00009-of-00009.safetensors",
7
+ "model.embed_tokens.weight": "model-00001-of-00009.safetensors",
8
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00009.safetensors",
9
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00009.safetensors",
10
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00009.safetensors",
11
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00009.safetensors",
12
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00009.safetensors",
13
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00009.safetensors",
14
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00009.safetensors",
15
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00009.safetensors",
16
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00009.safetensors",
17
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00009.safetensors",
18
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00009.safetensors",
19
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00009.safetensors",
20
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00009.safetensors",
21
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00009.safetensors",
22
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00009.safetensors",
23
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00009.safetensors",
24
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00009.safetensors",
25
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00009.safetensors",
26
+ "model.layers.10.input_layernorm.weight": "model-00003-of-00009.safetensors",
27
+ "model.layers.10.mlp.down_proj.weight": "model-00003-of-00009.safetensors",
28
+ "model.layers.10.mlp.gate_proj.weight": "model-00003-of-00009.safetensors",
29
+ "model.layers.10.mlp.up_proj.weight": "model-00003-of-00009.safetensors",
30
+ "model.layers.10.post_attention_layernorm.weight": "model-00003-of-00009.safetensors",
31
+ "model.layers.10.self_attn.k_proj.weight": "model-00003-of-00009.safetensors",
32
+ "model.layers.10.self_attn.o_proj.weight": "model-00003-of-00009.safetensors",
33
+ "model.layers.10.self_attn.q_proj.weight": "model-00003-of-00009.safetensors",
34
+ "model.layers.10.self_attn.v_proj.weight": "model-00003-of-00009.safetensors",
35
+ "model.layers.11.input_layernorm.weight": "model-00004-of-00009.safetensors",
36
+ "model.layers.11.mlp.down_proj.weight": "model-00004-of-00009.safetensors",
37
+ "model.layers.11.mlp.gate_proj.weight": "model-00004-of-00009.safetensors",
38
+ "model.layers.11.mlp.up_proj.weight": "model-00004-of-00009.safetensors",
39
+ "model.layers.11.post_attention_layernorm.weight": "model-00004-of-00009.safetensors",
40
+ "model.layers.11.self_attn.k_proj.weight": "model-00004-of-00009.safetensors",
41
+ "model.layers.11.self_attn.o_proj.weight": "model-00004-of-00009.safetensors",
42
+ "model.layers.11.self_attn.q_proj.weight": "model-00004-of-00009.safetensors",
43
+ "model.layers.11.self_attn.v_proj.weight": "model-00004-of-00009.safetensors",
44
+ "model.layers.12.input_layernorm.weight": "model-00004-of-00009.safetensors",
45
+ "model.layers.12.mlp.down_proj.weight": "model-00004-of-00009.safetensors",
46
+ "model.layers.12.mlp.gate_proj.weight": "model-00004-of-00009.safetensors",
47
+ "model.layers.12.mlp.up_proj.weight": "model-00004-of-00009.safetensors",
48
+ "model.layers.12.post_attention_layernorm.weight": "model-00004-of-00009.safetensors",
49
+ "model.layers.12.self_attn.k_proj.weight": "model-00004-of-00009.safetensors",
50
+ "model.layers.12.self_attn.o_proj.weight": "model-00004-of-00009.safetensors",
51
+ "model.layers.12.self_attn.q_proj.weight": "model-00004-of-00009.safetensors",
52
+ "model.layers.12.self_attn.v_proj.weight": "model-00004-of-00009.safetensors",
53
+ "model.layers.13.input_layernorm.weight": "model-00004-of-00009.safetensors",
54
+ "model.layers.13.mlp.down_proj.weight": "model-00004-of-00009.safetensors",
55
+ "model.layers.13.mlp.gate_proj.weight": "model-00004-of-00009.safetensors",
56
+ "model.layers.13.mlp.up_proj.weight": "model-00004-of-00009.safetensors",
57
+ "model.layers.13.post_attention_layernorm.weight": "model-00004-of-00009.safetensors",
58
+ "model.layers.13.self_attn.k_proj.weight": "model-00004-of-00009.safetensors",
59
+ "model.layers.13.self_attn.o_proj.weight": "model-00004-of-00009.safetensors",
60
+ "model.layers.13.self_attn.q_proj.weight": "model-00004-of-00009.safetensors",
61
+ "model.layers.13.self_attn.v_proj.weight": "model-00004-of-00009.safetensors",
62
+ "model.layers.14.input_layernorm.weight": "model-00004-of-00009.safetensors",
63
+ "model.layers.14.mlp.down_proj.weight": "model-00004-of-00009.safetensors",
64
+ "model.layers.14.mlp.gate_proj.weight": "model-00004-of-00009.safetensors",
65
+ "model.layers.14.mlp.up_proj.weight": "model-00004-of-00009.safetensors",
66
+ "model.layers.14.post_attention_layernorm.weight": "model-00004-of-00009.safetensors",
67
+ "model.layers.14.self_attn.k_proj.weight": "model-00004-of-00009.safetensors",
68
+ "model.layers.14.self_attn.o_proj.weight": "model-00004-of-00009.safetensors",
69
+ "model.layers.14.self_attn.q_proj.weight": "model-00004-of-00009.safetensors",
70
+ "model.layers.14.self_attn.v_proj.weight": "model-00004-of-00009.safetensors",
71
+ "model.layers.15.input_layernorm.weight": "model-00005-of-00009.safetensors",
72
+ "model.layers.15.mlp.down_proj.weight": "model-00005-of-00009.safetensors",
73
+ "model.layers.15.mlp.gate_proj.weight": "model-00004-of-00009.safetensors",
74
+ "model.layers.15.mlp.up_proj.weight": "model-00005-of-00009.safetensors",
75
+ "model.layers.15.post_attention_layernorm.weight": "model-00005-of-00009.safetensors",
76
+ "model.layers.15.self_attn.k_proj.weight": "model-00004-of-00009.safetensors",
77
+ "model.layers.15.self_attn.o_proj.weight": "model-00004-of-00009.safetensors",
78
+ "model.layers.15.self_attn.q_proj.weight": "model-00004-of-00009.safetensors",
79
+ "model.layers.15.self_attn.v_proj.weight": "model-00004-of-00009.safetensors",
80
+ "model.layers.16.input_layernorm.weight": "model-00005-of-00009.safetensors",
81
+ "model.layers.16.mlp.down_proj.weight": "model-00005-of-00009.safetensors",
82
+ "model.layers.16.mlp.gate_proj.weight": "model-00005-of-00009.safetensors",
83
+ "model.layers.16.mlp.up_proj.weight": "model-00005-of-00009.safetensors",
84
+ "model.layers.16.post_attention_layernorm.weight": "model-00005-of-00009.safetensors",
85
+ "model.layers.16.self_attn.k_proj.weight": "model-00005-of-00009.safetensors",
86
+ "model.layers.16.self_attn.o_proj.weight": "model-00005-of-00009.safetensors",
87
+ "model.layers.16.self_attn.q_proj.weight": "model-00005-of-00009.safetensors",
88
+ "model.layers.16.self_attn.v_proj.weight": "model-00005-of-00009.safetensors",
89
+ "model.layers.17.input_layernorm.weight": "model-00005-of-00009.safetensors",
90
+ "model.layers.17.mlp.down_proj.weight": "model-00005-of-00009.safetensors",
91
+ "model.layers.17.mlp.gate_proj.weight": "model-00005-of-00009.safetensors",
92
+ "model.layers.17.mlp.up_proj.weight": "model-00005-of-00009.safetensors",
93
+ "model.layers.17.post_attention_layernorm.weight": "model-00005-of-00009.safetensors",
94
+ "model.layers.17.self_attn.k_proj.weight": "model-00005-of-00009.safetensors",
95
+ "model.layers.17.self_attn.o_proj.weight": "model-00005-of-00009.safetensors",
96
+ "model.layers.17.self_attn.q_proj.weight": "model-00005-of-00009.safetensors",
97
+ "model.layers.17.self_attn.v_proj.weight": "model-00005-of-00009.safetensors",
98
+ "model.layers.18.input_layernorm.weight": "model-00005-of-00009.safetensors",
99
+ "model.layers.18.mlp.down_proj.weight": "model-00005-of-00009.safetensors",
100
+ "model.layers.18.mlp.gate_proj.weight": "model-00005-of-00009.safetensors",
101
+ "model.layers.18.mlp.up_proj.weight": "model-00005-of-00009.safetensors",
102
+ "model.layers.18.post_attention_layernorm.weight": "model-00005-of-00009.safetensors",
103
+ "model.layers.18.self_attn.k_proj.weight": "model-00005-of-00009.safetensors",
104
+ "model.layers.18.self_attn.o_proj.weight": "model-00005-of-00009.safetensors",
105
+ "model.layers.18.self_attn.q_proj.weight": "model-00005-of-00009.safetensors",
106
+ "model.layers.18.self_attn.v_proj.weight": "model-00005-of-00009.safetensors",
107
+ "model.layers.19.input_layernorm.weight": "model-00005-of-00009.safetensors",
108
+ "model.layers.19.mlp.down_proj.weight": "model-00005-of-00009.safetensors",
109
+ "model.layers.19.mlp.gate_proj.weight": "model-00005-of-00009.safetensors",
110
+ "model.layers.19.mlp.up_proj.weight": "model-00005-of-00009.safetensors",
111
+ "model.layers.19.post_attention_layernorm.weight": "model-00005-of-00009.safetensors",
112
+ "model.layers.19.self_attn.k_proj.weight": "model-00005-of-00009.safetensors",
113
+ "model.layers.19.self_attn.o_proj.weight": "model-00005-of-00009.safetensors",
114
+ "model.layers.19.self_attn.q_proj.weight": "model-00005-of-00009.safetensors",
115
+ "model.layers.19.self_attn.v_proj.weight": "model-00005-of-00009.safetensors",
116
+ "model.layers.2.input_layernorm.weight": "model-00002-of-00009.safetensors",
117
+ "model.layers.2.mlp.down_proj.weight": "model-00002-of-00009.safetensors",
118
+ "model.layers.2.mlp.gate_proj.weight": "model-00002-of-00009.safetensors",
119
+ "model.layers.2.mlp.up_proj.weight": "model-00002-of-00009.safetensors",
120
+ "model.layers.2.post_attention_layernorm.weight": "model-00002-of-00009.safetensors",
121
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00009.safetensors",
122
+ "model.layers.2.self_attn.o_proj.weight": "model-00002-of-00009.safetensors",
123
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00009.safetensors",
124
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00009.safetensors",
125
+ "model.layers.20.input_layernorm.weight": "model-00006-of-00009.safetensors",
126
+ "model.layers.20.mlp.down_proj.weight": "model-00006-of-00009.safetensors",
127
+ "model.layers.20.mlp.gate_proj.weight": "model-00006-of-00009.safetensors",
128
+ "model.layers.20.mlp.up_proj.weight": "model-00006-of-00009.safetensors",
129
+ "model.layers.20.post_attention_layernorm.weight": "model-00006-of-00009.safetensors",
130
+ "model.layers.20.self_attn.k_proj.weight": "model-00006-of-00009.safetensors",
131
+ "model.layers.20.self_attn.o_proj.weight": "model-00006-of-00009.safetensors",
132
+ "model.layers.20.self_attn.q_proj.weight": "model-00006-of-00009.safetensors",
133
+ "model.layers.20.self_attn.v_proj.weight": "model-00006-of-00009.safetensors",
134
+ "model.layers.21.input_layernorm.weight": "model-00006-of-00009.safetensors",
135
+ "model.layers.21.mlp.down_proj.weight": "model-00006-of-00009.safetensors",
136
+ "model.layers.21.mlp.gate_proj.weight": "model-00006-of-00009.safetensors",
137
+ "model.layers.21.mlp.up_proj.weight": "model-00006-of-00009.safetensors",
138
+ "model.layers.21.post_attention_layernorm.weight": "model-00006-of-00009.safetensors",
139
+ "model.layers.21.self_attn.k_proj.weight": "model-00006-of-00009.safetensors",
140
+ "model.layers.21.self_attn.o_proj.weight": "model-00006-of-00009.safetensors",
141
+ "model.layers.21.self_attn.q_proj.weight": "model-00006-of-00009.safetensors",
142
+ "model.layers.21.self_attn.v_proj.weight": "model-00006-of-00009.safetensors",
143
+ "model.layers.22.input_layernorm.weight": "model-00006-of-00009.safetensors",
144
+ "model.layers.22.mlp.down_proj.weight": "model-00006-of-00009.safetensors",
145
+ "model.layers.22.mlp.gate_proj.weight": "model-00006-of-00009.safetensors",
146
+ "model.layers.22.mlp.up_proj.weight": "model-00006-of-00009.safetensors",
147
+ "model.layers.22.post_attention_layernorm.weight": "model-00006-of-00009.safetensors",
148
+ "model.layers.22.self_attn.k_proj.weight": "model-00006-of-00009.safetensors",
149
+ "model.layers.22.self_attn.o_proj.weight": "model-00006-of-00009.safetensors",
150
+ "model.layers.22.self_attn.q_proj.weight": "model-00006-of-00009.safetensors",
151
+ "model.layers.22.self_attn.v_proj.weight": "model-00006-of-00009.safetensors",
152
+ "model.layers.23.input_layernorm.weight": "model-00006-of-00009.safetensors",
153
+ "model.layers.23.mlp.down_proj.weight": "model-00006-of-00009.safetensors",
154
+ "model.layers.23.mlp.gate_proj.weight": "model-00006-of-00009.safetensors",
155
+ "model.layers.23.mlp.up_proj.weight": "model-00006-of-00009.safetensors",
156
+ "model.layers.23.post_attention_layernorm.weight": "model-00006-of-00009.safetensors",
157
+ "model.layers.23.self_attn.k_proj.weight": "model-00006-of-00009.safetensors",
158
+ "model.layers.23.self_attn.o_proj.weight": "model-00006-of-00009.safetensors",
159
+ "model.layers.23.self_attn.q_proj.weight": "model-00006-of-00009.safetensors",
160
+ "model.layers.23.self_attn.v_proj.weight": "model-00006-of-00009.safetensors",
161
+ "model.layers.24.input_layernorm.weight": "model-00007-of-00009.safetensors",
162
+ "model.layers.24.mlp.down_proj.weight": "model-00007-of-00009.safetensors",
163
+ "model.layers.24.mlp.gate_proj.weight": "model-00006-of-00009.safetensors",
164
+ "model.layers.24.mlp.up_proj.weight": "model-00007-of-00009.safetensors",
165
+ "model.layers.24.post_attention_layernorm.weight": "model-00007-of-00009.safetensors",
166
+ "model.layers.24.self_attn.k_proj.weight": "model-00006-of-00009.safetensors",
167
+ "model.layers.24.self_attn.o_proj.weight": "model-00006-of-00009.safetensors",
168
+ "model.layers.24.self_attn.q_proj.weight": "model-00006-of-00009.safetensors",
169
+ "model.layers.24.self_attn.v_proj.weight": "model-00006-of-00009.safetensors",
170
+ "model.layers.25.input_layernorm.weight": "model-00007-of-00009.safetensors",
171
+ "model.layers.25.mlp.down_proj.weight": "model-00007-of-00009.safetensors",
172
+ "model.layers.25.mlp.gate_proj.weight": "model-00007-of-00009.safetensors",
173
+ "model.layers.25.mlp.up_proj.weight": "model-00007-of-00009.safetensors",
174
+ "model.layers.25.post_attention_layernorm.weight": "model-00007-of-00009.safetensors",
175
+ "model.layers.25.self_attn.k_proj.weight": "model-00007-of-00009.safetensors",
176
+ "model.layers.25.self_attn.o_proj.weight": "model-00007-of-00009.safetensors",
177
+ "model.layers.25.self_attn.q_proj.weight": "model-00007-of-00009.safetensors",
178
+ "model.layers.25.self_attn.v_proj.weight": "model-00007-of-00009.safetensors",
179
+ "model.layers.26.input_layernorm.weight": "model-00007-of-00009.safetensors",
180
+ "model.layers.26.mlp.down_proj.weight": "model-00007-of-00009.safetensors",
181
+ "model.layers.26.mlp.gate_proj.weight": "model-00007-of-00009.safetensors",
182
+ "model.layers.26.mlp.up_proj.weight": "model-00007-of-00009.safetensors",
183
+ "model.layers.26.post_attention_layernorm.weight": "model-00007-of-00009.safetensors",
184
+ "model.layers.26.self_attn.k_proj.weight": "model-00007-of-00009.safetensors",
185
+ "model.layers.26.self_attn.o_proj.weight": "model-00007-of-00009.safetensors",
186
+ "model.layers.26.self_attn.q_proj.weight": "model-00007-of-00009.safetensors",
187
+ "model.layers.26.self_attn.v_proj.weight": "model-00007-of-00009.safetensors",
188
+ "model.layers.27.input_layernorm.weight": "model-00007-of-00009.safetensors",
189
+ "model.layers.27.mlp.down_proj.weight": "model-00007-of-00009.safetensors",
190
+ "model.layers.27.mlp.gate_proj.weight": "model-00007-of-00009.safetensors",
191
+ "model.layers.27.mlp.up_proj.weight": "model-00007-of-00009.safetensors",
192
+ "model.layers.27.post_attention_layernorm.weight": "model-00007-of-00009.safetensors",
193
+ "model.layers.27.self_attn.k_proj.weight": "model-00007-of-00009.safetensors",
194
+ "model.layers.27.self_attn.o_proj.weight": "model-00007-of-00009.safetensors",
195
+ "model.layers.27.self_attn.q_proj.weight": "model-00007-of-00009.safetensors",
196
+ "model.layers.27.self_attn.v_proj.weight": "model-00007-of-00009.safetensors",
197
+ "model.layers.28.input_layernorm.weight": "model-00007-of-00009.safetensors",
198
+ "model.layers.28.mlp.down_proj.weight": "model-00007-of-00009.safetensors",
199
+ "model.layers.28.mlp.gate_proj.weight": "model-00007-of-00009.safetensors",
200
+ "model.layers.28.mlp.up_proj.weight": "model-00007-of-00009.safetensors",
201
+ "model.layers.28.post_attention_layernorm.weight": "model-00007-of-00009.safetensors",
202
+ "model.layers.28.self_attn.k_proj.weight": "model-00007-of-00009.safetensors",
203
+ "model.layers.28.self_attn.o_proj.weight": "model-00007-of-00009.safetensors",
204
+ "model.layers.28.self_attn.q_proj.weight": "model-00007-of-00009.safetensors",
205
+ "model.layers.28.self_attn.v_proj.weight": "model-00007-of-00009.safetensors",
206
+ "model.layers.29.input_layernorm.weight": "model-00008-of-00009.safetensors",
207
+ "model.layers.29.mlp.down_proj.weight": "model-00008-of-00009.safetensors",
208
+ "model.layers.29.mlp.gate_proj.weight": "model-00008-of-00009.safetensors",
209
+ "model.layers.29.mlp.up_proj.weight": "model-00008-of-00009.safetensors",
210
+ "model.layers.29.post_attention_layernorm.weight": "model-00008-of-00009.safetensors",
211
+ "model.layers.29.self_attn.k_proj.weight": "model-00008-of-00009.safetensors",
212
+ "model.layers.29.self_attn.o_proj.weight": "model-00008-of-00009.safetensors",
213
+ "model.layers.29.self_attn.q_proj.weight": "model-00008-of-00009.safetensors",
214
+ "model.layers.29.self_attn.v_proj.weight": "model-00008-of-00009.safetensors",
215
+ "model.layers.3.input_layernorm.weight": "model-00002-of-00009.safetensors",
216
+ "model.layers.3.mlp.down_proj.weight": "model-00002-of-00009.safetensors",
217
+ "model.layers.3.mlp.gate_proj.weight": "model-00002-of-00009.safetensors",
218
+ "model.layers.3.mlp.up_proj.weight": "model-00002-of-00009.safetensors",
219
+ "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00009.safetensors",
220
+ "model.layers.3.self_attn.k_proj.weight": "model-00002-of-00009.safetensors",
221
+ "model.layers.3.self_attn.o_proj.weight": "model-00002-of-00009.safetensors",
222
+ "model.layers.3.self_attn.q_proj.weight": "model-00002-of-00009.safetensors",
223
+ "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00009.safetensors",
224
+ "model.layers.30.input_layernorm.weight": "model-00008-of-00009.safetensors",
225
+ "model.layers.30.mlp.down_proj.weight": "model-00008-of-00009.safetensors",
226
+ "model.layers.30.mlp.gate_proj.weight": "model-00008-of-00009.safetensors",
227
+ "model.layers.30.mlp.up_proj.weight": "model-00008-of-00009.safetensors",
228
+ "model.layers.30.post_attention_layernorm.weight": "model-00008-of-00009.safetensors",
229
+ "model.layers.30.self_attn.k_proj.weight": "model-00008-of-00009.safetensors",
230
+ "model.layers.30.self_attn.o_proj.weight": "model-00008-of-00009.safetensors",
231
+ "model.layers.30.self_attn.q_proj.weight": "model-00008-of-00009.safetensors",
232
+ "model.layers.30.self_attn.v_proj.weight": "model-00008-of-00009.safetensors",
233
+ "model.layers.31.input_layernorm.weight": "model-00008-of-00009.safetensors",
234
+ "model.layers.31.mlp.down_proj.weight": "model-00008-of-00009.safetensors",
235
+ "model.layers.31.mlp.gate_proj.weight": "model-00008-of-00009.safetensors",
236
+ "model.layers.31.mlp.up_proj.weight": "model-00008-of-00009.safetensors",
237
+ "model.layers.31.post_attention_layernorm.weight": "model-00008-of-00009.safetensors",
238
+ "model.layers.31.self_attn.k_proj.weight": "model-00008-of-00009.safetensors",
239
+ "model.layers.31.self_attn.o_proj.weight": "model-00008-of-00009.safetensors",
240
+ "model.layers.31.self_attn.q_proj.weight": "model-00008-of-00009.safetensors",
241
+ "model.layers.31.self_attn.v_proj.weight": "model-00008-of-00009.safetensors",
242
+ "model.layers.4.input_layernorm.weight": "model-00002-of-00009.safetensors",
243
+ "model.layers.4.mlp.down_proj.weight": "model-00002-of-00009.safetensors",
244
+ "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00009.safetensors",
245
+ "model.layers.4.mlp.up_proj.weight": "model-00002-of-00009.safetensors",
246
+ "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00009.safetensors",
247
+ "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00009.safetensors",
248
+ "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00009.safetensors",
249
+ "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00009.safetensors",
250
+ "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00009.safetensors",
251
+ "model.layers.5.input_layernorm.weight": "model-00002-of-00009.safetensors",
252
+ "model.layers.5.mlp.down_proj.weight": "model-00002-of-00009.safetensors",
253
+ "model.layers.5.mlp.gate_proj.weight": "model-00002-of-00009.safetensors",
254
+ "model.layers.5.mlp.up_proj.weight": "model-00002-of-00009.safetensors",
255
+ "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00009.safetensors",
256
+ "model.layers.5.self_attn.k_proj.weight": "model-00002-of-00009.safetensors",
257
+ "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00009.safetensors",
258
+ "model.layers.5.self_attn.q_proj.weight": "model-00002-of-00009.safetensors",
259
+ "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00009.safetensors",
260
+ "model.layers.6.input_layernorm.weight": "model-00003-of-00009.safetensors",
261
+ "model.layers.6.mlp.down_proj.weight": "model-00003-of-00009.safetensors",
262
+ "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00009.safetensors",
263
+ "model.layers.6.mlp.up_proj.weight": "model-00003-of-00009.safetensors",
264
+ "model.layers.6.post_attention_layernorm.weight": "model-00003-of-00009.safetensors",
265
+ "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00009.safetensors",
266
+ "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00009.safetensors",
267
+ "model.layers.6.self_attn.q_proj.weight": "model-00002-of-00009.safetensors",
268
+ "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00009.safetensors",
269
+ "model.layers.7.input_layernorm.weight": "model-00003-of-00009.safetensors",
270
+ "model.layers.7.mlp.down_proj.weight": "model-00003-of-00009.safetensors",
271
+ "model.layers.7.mlp.gate_proj.weight": "model-00003-of-00009.safetensors",
272
+ "model.layers.7.mlp.up_proj.weight": "model-00003-of-00009.safetensors",
273
+ "model.layers.7.post_attention_layernorm.weight": "model-00003-of-00009.safetensors",
274
+ "model.layers.7.self_attn.k_proj.weight": "model-00003-of-00009.safetensors",
275
+ "model.layers.7.self_attn.o_proj.weight": "model-00003-of-00009.safetensors",
276
+ "model.layers.7.self_attn.q_proj.weight": "model-00003-of-00009.safetensors",
277
+ "model.layers.7.self_attn.v_proj.weight": "model-00003-of-00009.safetensors",
278
+ "model.layers.8.input_layernorm.weight": "model-00003-of-00009.safetensors",
279
+ "model.layers.8.mlp.down_proj.weight": "model-00003-of-00009.safetensors",
280
+ "model.layers.8.mlp.gate_proj.weight": "model-00003-of-00009.safetensors",
281
+ "model.layers.8.mlp.up_proj.weight": "model-00003-of-00009.safetensors",
282
+ "model.layers.8.post_attention_layernorm.weight": "model-00003-of-00009.safetensors",
283
+ "model.layers.8.self_attn.k_proj.weight": "model-00003-of-00009.safetensors",
284
+ "model.layers.8.self_attn.o_proj.weight": "model-00003-of-00009.safetensors",
285
+ "model.layers.8.self_attn.q_proj.weight": "model-00003-of-00009.safetensors",
286
+ "model.layers.8.self_attn.v_proj.weight": "model-00003-of-00009.safetensors",
287
+ "model.layers.9.input_layernorm.weight": "model-00003-of-00009.safetensors",
288
+ "model.layers.9.mlp.down_proj.weight": "model-00003-of-00009.safetensors",
289
+ "model.layers.9.mlp.gate_proj.weight": "model-00003-of-00009.safetensors",
290
+ "model.layers.9.mlp.up_proj.weight": "model-00003-of-00009.safetensors",
291
+ "model.layers.9.post_attention_layernorm.weight": "model-00003-of-00009.safetensors",
292
+ "model.layers.9.self_attn.k_proj.weight": "model-00003-of-00009.safetensors",
293
+ "model.layers.9.self_attn.o_proj.weight": "model-00003-of-00009.safetensors",
294
+ "model.layers.9.self_attn.q_proj.weight": "model-00003-of-00009.safetensors",
295
+ "model.layers.9.self_attn.v_proj.weight": "model-00003-of-00009.safetensors",
296
+ "model.norm.weight": "model-00008-of-00009.safetensors"
297
+ }
298
+ }
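
The index above tells loaders which of the nine shards holds each tensor; `metadata.total_size` is the summed tensor size (about 16.06 GB, roughly 8.03B bf16 parameters). A small, stdlib-only sketch to inspect the layout, assuming the file has been downloaded locally:

```python
# Minimal sketch: inspect the sharded-checkpoint index to see how tensors are
# spread across the nine .safetensors files (stdlib only, file assumed local).
import json
from collections import Counter

with open("model.safetensors.index.json") as f:
    index = json.load(f)

total = index["metadata"]["total_size"]
print(f"declared tensor bytes: {total} (~{total / 2:.0f} bf16 parameters)")

per_shard = Counter(index["weight_map"].values())
for shard, count in sorted(per_shard.items()):
    print(f"{shard}: {count} tensors")
```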
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|bos|>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|eos|>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "<|pad|>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "unk_token": {
24
+ "content": "<|unk|>",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ }
30
+ }
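
A quick way to check that these special tokens are picked up by the tokenizer (the model id is the same assumption as in the earlier sketches):

```python
# Minimal sketch: confirm the special tokens defined above are wired into the
# tokenizer. The model id is an assumption; replace it with this repo's id.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("lamhieu/ghost-8b-beta-1608")
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.pad_token, tokenizer.unk_token)
# Expected, per special_tokens_map.json: <|bos|> <|eos|> <|pad|> <|unk|>
```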
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,2066 @@
1
+ {
2
+ "add_bos_token": true,
3
+ "add_eos_token": false,
4
+ "added_tokens_decoder": {
5
+ "128000": {
6
+ "content": "<|bos|>",
7
+ "lstrip": false,
8
+ "normalized": false,
9
+ "rstrip": false,
10
+ "single_word": false,
11
+ "special": true
12
+ },
13
+ "128001": {
14
+ "content": "<|eos|>",
15
+ "lstrip": false,
16
+ "normalized": false,
17
+ "rstrip": false,
18
+ "single_word": false,
19
+ "special": true
20
+ },
21
+ "128002": {
22
+ "content": "<|unk|>",
23
+ "lstrip": false,
24
+ "normalized": false,
25
+ "rstrip": false,
26
+ "single_word": false,
27
+ "special": true
28
+ },
29
+ "128003": {
30
+ "content": "<|pad|>",
31
+ "lstrip": false,
32
+ "normalized": false,
33
+ "rstrip": false,
34
+ "single_word": false,
35
+ "special": true
36
+ },
37
+ "128004": {
38
+ "content": "<|reserved_special_token_2|>",
39
+ "lstrip": false,
40
+ "normalized": false,
41
+ "rstrip": false,
42
+ "single_word": false,
43
+ "special": true
44
+ },
45
+ "128005": {
46
+ "content": "<|reserved_special_token_3|>",
47
+ "lstrip": false,
48
+ "normalized": false,
49
+ "rstrip": false,
50
+ "single_word": false,
51
+ "special": true
52
+ },
53
+ "128006": {
54
+ "content": "<|role:begin|>",
55
+ "lstrip": false,
56
+ "normalized": false,
57
+ "rstrip": false,
58
+ "single_word": false,
59
+ "special": true
60
+ },
61
+ "128007": {
62
+ "content": "<|role:end|>",
63
+ "lstrip": false,
64
+ "normalized": false,
65
+ "rstrip": false,
66
+ "single_word": false,
67
+ "special": true
68
+ },
69
+ "128008": {
70
+ "content": "<|reserved_special_token_4|>",
71
+ "lstrip": false,
72
+ "normalized": false,
73
+ "rstrip": false,
74
+ "single_word": false,
75
+ "special": true
76
+ },
77
+ "128009": {
78
+ "content": "<|cos|>",
79
+ "lstrip": false,
80
+ "normalized": false,
81
+ "rstrip": false,
82
+ "single_word": false,
83
+ "special": true
84
+ },
85
+ "128010": {
86
+ "content": "<|tool:execute|>",
87
+ "lstrip": false,
88
+ "normalized": false,
89
+ "rstrip": false,
90
+ "single_word": false,
91
+ "special": true
92
+ },
93
+ "128011": {
94
+ "content": "<|tool:results|>",
95
+ "lstrip": false,
96
+ "normalized": false,
97
+ "rstrip": false,
98
+ "single_word": false,
99
+ "special": true
100
+ },
101
+ "128012": {
102
+ "content": "<|reserved_special_token_7|>",
103
+ "lstrip": false,
104
+ "normalized": false,
105
+ "rstrip": false,
106
+ "single_word": false,
107
+ "special": true
108
+ },
109
+ "128013": {
110
+ "content": "<|reserved_special_token_8|>",
111
+ "lstrip": false,
112
+ "normalized": false,
113
+ "rstrip": false,
114
+ "single_word": false,
115
+ "special": true
116
+ },
117
+ "128014": {
118
+ "content": "<|reserved_special_token_9|>",
119
+ "lstrip": false,
120
+ "normalized": false,
121
+ "rstrip": false,
122
+ "single_word": false,
123
+ "special": true
124
+ },
125
+ "128015": {
126
+ "content": "<|attachment:begin|>",
127
+ "lstrip": false,
128
+ "normalized": false,
129
+ "rstrip": false,
130
+ "single_word": false,
131
+ "special": true
132
+ },
133
+ "128016": {
134
+ "content": "<|attachment:end|>",
135
+ "lstrip": false,
136
+ "normalized": false,
137
+ "rstrip": false,
138
+ "single_word": false,
139
+ "special": true
140
+ },
141
+ "128017": {
142
+ "content": "<|attachment:image|>",
143
+ "lstrip": false,
144
+ "normalized": false,
145
+ "rstrip": false,
146
+ "single_word": false,
147
+ "special": true
148
+ },
149
+ "128018": {
150
+ "content": "<|attachment:audio|>",
151
+ "lstrip": false,
152
+ "normalized": false,
153
+ "rstrip": false,
154
+ "single_word": false,
155
+ "special": true
156
+ },
157
+ "128019": {
158
+ "content": "<|attachment:video|>",
159
+ "lstrip": false,
160
+ "normalized": false,
161
+ "rstrip": false,
162
+ "single_word": false,
163
+ "special": true
164
+ },
165
+ "128020": {
166
+ "content": "<|reserved_special_token_15|>",
167
+ "lstrip": false,
168
+ "normalized": false,
169
+ "rstrip": false,
170
+ "single_word": false,
171
+ "special": true
172
+ },
173
+ "128021": {
174
+ "content": "<|reserved_special_token_16|>",
175
+ "lstrip": false,
176
+ "normalized": false,
177
+ "rstrip": false,
178
+ "single_word": false,
179
+ "special": true
180
+ },
181
+ "128022": {
182
+ "content": "<|reserved_special_token_17|>",
183
+ "lstrip": false,
184
+ "normalized": false,
185
+ "rstrip": false,
186
+ "single_word": false,
187
+ "special": true
188
+ },
189
+ "128023": {
190
+ "content": "<|reserved_special_token_18|>",
191
+ "lstrip": false,
192
+ "normalized": false,
193
+ "rstrip": false,
194
+ "single_word": false,
195
+ "special": true
196
+ },
197
+ "128024": {
198
+ "content": "<|reserved_special_token_19|>",
199
+ "lstrip": false,
200
+ "normalized": false,
201
+ "rstrip": false,
202
+ "single_word": false,
203
+ "special": true
204
+ },
205
+ "128025": {
206
+ "content": "<|reply:begin|>",
207
+ "lstrip": false,
208
+ "normalized": false,
209
+ "rstrip": false,
210
+ "single_word": false,
211
+ "special": true
212
+ },
213
+ "128026": {
214
+ "content": "<|reply:end|>",
215
+ "lstrip": false,
216
+ "normalized": false,
217
+ "rstrip": false,
218
+ "single_word": false,
219
+ "special": true
220
+ },
221
+ "128027": {
222
+ "content": "<|reserved_special_token_22|>",
223
+ "lstrip": false,
224
+ "normalized": false,
225
+ "rstrip": false,
226
+ "single_word": false,
227
+ "special": true
228
+ },
229
+ "128028": {
230
+ "content": "<|reserved_special_token_23|>",
231
+ "lstrip": false,
232
+ "normalized": false,
233
+ "rstrip": false,
234
+ "single_word": false,
235
+ "special": true
236
+ },
237
+ "128029": {
238
+ "content": "<|reserved_special_token_24|>",
239
+ "lstrip": false,
240
+ "normalized": false,
241
+ "rstrip": false,
242
+ "single_word": false,
243
+ "special": true
244
+ },
245
+ "128030": {
246
+ "content": "<|reserved_special_token_25|>",
247
+ "lstrip": false,
248
+ "normalized": false,
249
+ "rstrip": false,
250
+ "single_word": false,
251
+ "special": true
252
+ },
253
+ "128031": {
254
+ "content": "<|reserved_special_token_26|>",
255
+ "lstrip": false,
256
+ "normalized": false,
257
+ "rstrip": false,
258
+ "single_word": false,
259
+ "special": true
260
+ },
261
+ "128032": {
262
+ "content": "<|reserved_special_token_27|>",
263
+ "lstrip": false,
264
+ "normalized": false,
265
+ "rstrip": false,
266
+ "single_word": false,
267
+ "special": true
268
+ },
269
+ "128033": {
270
+ "content": "<|reserved_special_token_28|>",
271
+ "lstrip": false,
272
+ "normalized": false,
273
+ "rstrip": false,
274
+ "single_word": false,
275
+ "special": true
276
+ },
277
+ "128034": {
278
+ "content": "<|reserved_special_token_29|>",
279
+ "lstrip": false,
280
+ "normalized": false,
281
+ "rstrip": false,
282
+ "single_word": false,
283
+ "special": true
284
+ },
285
+ "128035": {
286
+ "content": "<|reserved_special_token_30|>",
287
+ "lstrip": false,
288
+ "normalized": false,
289
+ "rstrip": false,
290
+ "single_word": false,
291
+ "special": true
292
+ },
293
+ "128036": {
294
+ "content": "<|reserved_special_token_31|>",
295
+ "lstrip": false,
296
+ "normalized": false,
297
+ "rstrip": false,
298
+ "single_word": false,
299
+ "special": true
300
+ },
301
+ "128037": {
302
+ "content": "<|reserved_special_token_32|>",
303
+ "lstrip": false,
304
+ "normalized": false,
305
+ "rstrip": false,
306
+ "single_word": false,
307
+ "special": true
308
+ },
309
+ "128038": {
310
+ "content": "<|reserved_special_token_33|>",
311
+ "lstrip": false,
312
+ "normalized": false,
313
+ "rstrip": false,
314
+ "single_word": false,
315
+ "special": true
316
+ },
317
+ "128039": {
318
+ "content": "<|reserved_special_token_34|>",
319
+ "lstrip": false,
320
+ "normalized": false,
321
+ "rstrip": false,
322
+ "single_word": false,
323
+ "special": true
324
+ },
325
+ "128040": {
326
+ "content": "<|reserved_special_token_35|>",
327
+ "lstrip": false,
328
+ "normalized": false,
329
+ "rstrip": false,
330
+ "single_word": false,
331
+ "special": true
332
+ },
333
+ "128041": {
334
+ "content": "<|reserved_special_token_36|>",
335
+ "lstrip": false,
336
+ "normalized": false,
337
+ "rstrip": false,
338
+ "single_word": false,
339
+ "special": true
340
+ },
341
+ "128042": {
342
+ "content": "<|reserved_special_token_37|>",
343
+ "lstrip": false,
344
+ "normalized": false,
345
+ "rstrip": false,
346
+ "single_word": false,
347
+ "special": true
348
+ },
349
+ "128043": {
350
+ "content": "<|reserved_special_token_38|>",
351
+ "lstrip": false,
352
+ "normalized": false,
353
+ "rstrip": false,
354
+ "single_word": false,
355
+ "special": true
356
+ },
357
+ "128044": {
358
+ "content": "<|reserved_special_token_39|>",
359
+ "lstrip": false,
360
+ "normalized": false,
361
+ "rstrip": false,
362
+ "single_word": false,
363
+ "special": true
364
+ },
365
+ "128045": {
366
+ "content": "<|reserved_special_token_40|>",
367
+ "lstrip": false,
368
+ "normalized": false,
369
+ "rstrip": false,
370
+ "single_word": false,
371
+ "special": true
372
+ },
373
+ "128046": {
374
+ "content": "<|reserved_special_token_41|>",
375
+ "lstrip": false,
376
+ "normalized": false,
377
+ "rstrip": false,
378
+ "single_word": false,
379
+ "special": true
380
+ },
381
+ "128047": {
382
+ "content": "<|reserved_special_token_42|>",
383
+ "lstrip": false,
384
+ "normalized": false,
385
+ "rstrip": false,
386
+ "single_word": false,
387
+ "special": true
388
+ },
389
+ "128048": {
390
+ "content": "<|reserved_special_token_43|>",
391
+ "lstrip": false,
392
+ "normalized": false,
393
+ "rstrip": false,
394
+ "single_word": false,
395
+ "special": true
396
+ },
397
+ "128049": {
398
+ "content": "<|reserved_special_token_44|>",
399
+ "lstrip": false,
400
+ "normalized": false,
401
+ "rstrip": false,
402
+ "single_word": false,
403
+ "special": true
404
+ },
405
+ "128050": {
406
+ "content": "<|reserved_special_token_45|>",
407
+ "lstrip": false,
408
+ "normalized": false,
409
+ "rstrip": false,
410
+ "single_word": false,
411
+ "special": true
412
+ },
413
+ "128051": {
414
+ "content": "<|reserved_special_token_46|>",
415
+ "lstrip": false,
416
+ "normalized": false,
417
+ "rstrip": false,
418
+ "single_word": false,
419
+ "special": true
420
+ },
421
+ "128052": {
422
+ "content": "<|reserved_special_token_47|>",
423
+ "lstrip": false,
424
+ "normalized": false,
425
+ "rstrip": false,
426
+ "single_word": false,
427
+ "special": true
428
+ },
429
+ "128053": {
430
+ "content": "<|reserved_special_token_48|>",
431
+ "lstrip": false,
432
+ "normalized": false,
433
+ "rstrip": false,
434
+ "single_word": false,
435
+ "special": true
436
+ },
437
+ "128054": {
438
+ "content": "<|reserved_special_token_49|>",
439
+ "lstrip": false,
440
+ "normalized": false,
441
+ "rstrip": false,
442
+ "single_word": false,
443
+ "special": true
444
+ },
445
+ "128055": {
446
+ "content": "<|reserved_special_token_50|>",
447
+ "lstrip": false,
448
+ "normalized": false,
449
+ "rstrip": false,
450
+ "single_word": false,
451
+ "special": true
452
+ },
453
+ "128056": {
454
+ "content": "<|reserved_special_token_51|>",
455
+ "lstrip": false,
456
+ "normalized": false,
457
+ "rstrip": false,
458
+ "single_word": false,
459
+ "special": true
460
+ },
461
+ "128057": {
462
+ "content": "<|reserved_special_token_52|>",
463
+ "lstrip": false,
464
+ "normalized": false,
465
+ "rstrip": false,
466
+ "single_word": false,
467
+ "special": true
468
+ },
469
+ "128058": {
470
+ "content": "<|reserved_special_token_53|>",
471
+ "lstrip": false,
472
+ "normalized": false,
473
+ "rstrip": false,
474
+ "single_word": false,
475
+ "special": true
476
+ },
477
+ "128059": {
478
+ "content": "<|reserved_special_token_54|>",
479
+ "lstrip": false,
480
+ "normalized": false,
481
+ "rstrip": false,
482
+ "single_word": false,
483
+ "special": true
484
+ },
485
+ "128060": {
486
+ "content": "<|reserved_special_token_55|>",
487
+ "lstrip": false,
488
+ "normalized": false,
489
+ "rstrip": false,
490
+ "single_word": false,
491
+ "special": true
492
+ },
493
+ "128061": {
494
+ "content": "<|reserved_special_token_56|>",
495
+ "lstrip": false,
496
+ "normalized": false,
497
+ "rstrip": false,
498
+ "single_word": false,
499
+ "special": true
500
+ },
501
+ "128062": {
502
+ "content": "<|reserved_special_token_57|>",
503
+ "lstrip": false,
504
+ "normalized": false,
505
+ "rstrip": false,
506
+ "single_word": false,
507
+ "special": true
508
+ },
509
+ "128063": {
510
+ "content": "<|reserved_special_token_58|>",
511
+ "lstrip": false,
512
+ "normalized": false,
513
+ "rstrip": false,
514
+ "single_word": false,
515
+ "special": true
516
+ },
517
+ "128064": {
518
+ "content": "<|reserved_special_token_59|>",
519
+ "lstrip": false,
520
+ "normalized": false,
521
+ "rstrip": false,
522
+ "single_word": false,
523
+ "special": true
524
+ },
525
+ "128065": {
526
+ "content": "<|reserved_special_token_60|>",
527
+ "lstrip": false,
528
+ "normalized": false,
529
+ "rstrip": false,
530
+ "single_word": false,
531
+ "special": true
532
+ },
533
+ "128066": {
534
+ "content": "<|reserved_special_token_61|>",
535
+ "lstrip": false,
536
+ "normalized": false,
537
+ "rstrip": false,
538
+ "single_word": false,
539
+ "special": true
540
+ },
541
+ "128067": {
542
+ "content": "<|reserved_special_token_62|>",
543
+ "lstrip": false,
544
+ "normalized": false,
545
+ "rstrip": false,
546
+ "single_word": false,
547
+ "special": true
548
+ },
549
+ "128068": {
550
+ "content": "<|reserved_special_token_63|>",
551
+ "lstrip": false,
552
+ "normalized": false,
553
+ "rstrip": false,
554
+ "single_word": false,
555
+ "special": true
556
+ },
557
+ "128069": {
558
+ "content": "<|reserved_special_token_64|>",
559
+ "lstrip": false,
560
+ "normalized": false,
561
+ "rstrip": false,
562
+ "single_word": false,
563
+ "special": true
564
+ },
565
+ "128070": {
566
+ "content": "<|reserved_special_token_65|>",
567
+ "lstrip": false,
568
+ "normalized": false,
569
+ "rstrip": false,
570
+ "single_word": false,
571
+ "special": true
572
+ },
573
+ "128071": {
574
+ "content": "<|reserved_special_token_66|>",
575
+ "lstrip": false,
576
+ "normalized": false,
577
+ "rstrip": false,
578
+ "single_word": false,
579
+ "special": true
580
+ },
581
+ "128072": {
582
+ "content": "<|reserved_special_token_67|>",
583
+ "lstrip": false,
584
+ "normalized": false,
585
+ "rstrip": false,
586
+ "single_word": false,
587
+ "special": true
588
+ },
589
+ "128073": {
590
+ "content": "<|reserved_special_token_68|>",
591
+ "lstrip": false,
592
+ "normalized": false,
593
+ "rstrip": false,
594
+ "single_word": false,
595
+ "special": true
596
+ },
597
+ "128074": {
598
+ "content": "<|reserved_special_token_69|>",
599
+ "lstrip": false,
600
+ "normalized": false,
601
+ "rstrip": false,
602
+ "single_word": false,
603
+ "special": true
604
+ },
605
+ "128075": {
606
+ "content": "<|reserved_special_token_70|>",
607
+ "lstrip": false,
608
+ "normalized": false,
609
+ "rstrip": false,
610
+ "single_word": false,
611
+ "special": true
612
+ },
613
+ "128076": {
614
+ "content": "<|reserved_special_token_71|>",
615
+ "lstrip": false,
616
+ "normalized": false,
617
+ "rstrip": false,
618
+ "single_word": false,
619
+ "special": true
620
+ },
621
+ "128077": {
622
+ "content": "<|reserved_special_token_72|>",
623
+ "lstrip": false,
624
+ "normalized": false,
625
+ "rstrip": false,
626
+ "single_word": false,
627
+ "special": true
628
+ },
629
+ "128078": {
630
+ "content": "<|reserved_special_token_73|>",
631
+ "lstrip": false,
632
+ "normalized": false,
633
+ "rstrip": false,
634
+ "single_word": false,
635
+ "special": true
636
+ },
637
+ "128079": {
638
+ "content": "<|reserved_special_token_74|>",
639
+ "lstrip": false,
640
+ "normalized": false,
641
+ "rstrip": false,
642
+ "single_word": false,
643
+ "special": true
644
+ },
645
+ "128080": {
646
+ "content": "<|reserved_special_token_75|>",
647
+ "lstrip": false,
648
+ "normalized": false,
649
+ "rstrip": false,
650
+ "single_word": false,
651
+ "special": true
652
+ },
653
+ "128081": {
654
+ "content": "<|reserved_special_token_76|>",
655
+ "lstrip": false,
656
+ "normalized": false,
657
+ "rstrip": false,
658
+ "single_word": false,
659
+ "special": true
660
+ },
661
+ "128082": {
662
+ "content": "<|reserved_special_token_77|>",
663
+ "lstrip": false,
664
+ "normalized": false,
665
+ "rstrip": false,
666
+ "single_word": false,
667
+ "special": true
668
+ },
669
+ "128083": {
670
+ "content": "<|reserved_special_token_78|>",
671
+ "lstrip": false,
672
+ "normalized": false,
673
+ "rstrip": false,
674
+ "single_word": false,
675
+ "special": true
676
+ },
677
+ "128084": {
678
+ "content": "<|reserved_special_token_79|>",
679
+ "lstrip": false,
680
+ "normalized": false,
681
+ "rstrip": false,
682
+ "single_word": false,
683
+ "special": true
684
+ },
685
+ "128085": {
686
+ "content": "<|reserved_special_token_80|>",
687
+ "lstrip": false,
688
+ "normalized": false,
689
+ "rstrip": false,
690
+ "single_word": false,
691
+ "special": true
692
+ },
693
+ "128086": {
694
+ "content": "<|reserved_special_token_81|>",
695
+ "lstrip": false,
696
+ "normalized": false,
697
+ "rstrip": false,
698
+ "single_word": false,
699
+ "special": true
700
+ },
701
+ "128087": {
702
+ "content": "<|reserved_special_token_82|>",
703
+ "lstrip": false,
704
+ "normalized": false,
705
+ "rstrip": false,
706
+ "single_word": false,
707
+ "special": true
708
+ },
709
+ "128088": {
710
+ "content": "<|reserved_special_token_83|>",
711
+ "lstrip": false,
712
+ "normalized": false,
713
+ "rstrip": false,
714
+ "single_word": false,
715
+ "special": true
716
+ },
717
+ "128089": {
718
+ "content": "<|reserved_special_token_84|>",
719
+ "lstrip": false,
720
+ "normalized": false,
721
+ "rstrip": false,
722
+ "single_word": false,
723
+ "special": true
724
+ },
725
+ "128090": {
726
+ "content": "<|reserved_special_token_85|>",
727
+ "lstrip": false,
728
+ "normalized": false,
729
+ "rstrip": false,
730
+ "single_word": false,
731
+ "special": true
732
+ },
733
+ "128091": {
734
+ "content": "<|reserved_special_token_86|>",
735
+ "lstrip": false,
736
+ "normalized": false,
737
+ "rstrip": false,
738
+ "single_word": false,
739
+ "special": true
740
+ },
741
+ "128092": {
742
+ "content": "<|reserved_special_token_87|>",
743
+ "lstrip": false,
744
+ "normalized": false,
745
+ "rstrip": false,
746
+ "single_word": false,
747
+ "special": true
748
+ },
749
+ "128093": {
750
+ "content": "<|reserved_special_token_88|>",
751
+ "lstrip": false,
752
+ "normalized": false,
753
+ "rstrip": false,
754
+ "single_word": false,
755
+ "special": true
756
+ },
757
+ "128094": {
758
+ "content": "<|reserved_special_token_89|>",
759
+ "lstrip": false,
760
+ "normalized": false,
761
+ "rstrip": false,
762
+ "single_word": false,
763
+ "special": true
764
+ },
765
+ "128095": {
766
+ "content": "<|reserved_special_token_90|>",
767
+ "lstrip": false,
768
+ "normalized": false,
769
+ "rstrip": false,
770
+ "single_word": false,
771
+ "special": true
772
+ },
773
+ "128096": {
774
+ "content": "<|reserved_special_token_91|>",
775
+ "lstrip": false,
776
+ "normalized": false,
777
+ "rstrip": false,
778
+ "single_word": false,
779
+ "special": true
780
+ },
781
+ "128097": {
782
+ "content": "<|reserved_special_token_92|>",
783
+ "lstrip": false,
784
+ "normalized": false,
785
+ "rstrip": false,
786
+ "single_word": false,
787
+ "special": true
788
+ },
789
+ "128098": {
790
+ "content": "<|reserved_special_token_93|>",
791
+ "lstrip": false,
792
+ "normalized": false,
793
+ "rstrip": false,
794
+ "single_word": false,
795
+ "special": true
796
+ },
797
+ "128099": {
798
+ "content": "<|reserved_special_token_94|>",
799
+ "lstrip": false,
800
+ "normalized": false,
801
+ "rstrip": false,
802
+ "single_word": false,
803
+ "special": true
804
+ },
805
+ "128100": {
806
+ "content": "<|reserved_special_token_95|>",
807
+ "lstrip": false,
808
+ "normalized": false,
809
+ "rstrip": false,
810
+ "single_word": false,
811
+ "special": true
812
+ },
813
+ "128101": {
814
+ "content": "<|reserved_special_token_96|>",
815
+ "lstrip": false,
816
+ "normalized": false,
817
+ "rstrip": false,
818
+ "single_word": false,
819
+ "special": true
820
+ },
821
+ "128102": {
822
+ "content": "<|reserved_special_token_97|>",
823
+ "lstrip": false,
824
+ "normalized": false,
825
+ "rstrip": false,
826
+ "single_word": false,
827
+ "special": true
828
+ },
829
+ "128103": {
830
+ "content": "<|reserved_special_token_98|>",
831
+ "lstrip": false,
832
+ "normalized": false,
833
+ "rstrip": false,
834
+ "single_word": false,
835
+ "special": true
836
+ },
837
+ "128104": {
838
+ "content": "<|reserved_special_token_99|>",
839
+ "lstrip": false,
840
+ "normalized": false,
841
+ "rstrip": false,
842
+ "single_word": false,
843
+ "special": true
844
+ },
845
+ "128105": {
846
+ "content": "<|reserved_special_token_100|>",
847
+ "lstrip": false,
848
+ "normalized": false,
849
+ "rstrip": false,
850
+ "single_word": false,
851
+ "special": true
852
+ },
853
+ "128106": {
854
+ "content": "<|reserved_special_token_101|>",
855
+ "lstrip": false,
856
+ "normalized": false,
857
+ "rstrip": false,
858
+ "single_word": false,
859
+ "special": true
860
+ },
861
+ "128107": {
862
+ "content": "<|reserved_special_token_102|>",
863
+ "lstrip": false,
864
+ "normalized": false,
865
+ "rstrip": false,
866
+ "single_word": false,
867
+ "special": true
868
+ },
869
+ "128108": {
870
+ "content": "<|reserved_special_token_103|>",
871
+ "lstrip": false,
872
+ "normalized": false,
873
+ "rstrip": false,
874
+ "single_word": false,
875
+ "special": true
876
+ },
877
+ "128109": {
878
+ "content": "<|reserved_special_token_104|>",
879
+ "lstrip": false,
880
+ "normalized": false,
881
+ "rstrip": false,
882
+ "single_word": false,
883
+ "special": true
884
+ },
885
+ "128110": {
886
+ "content": "<|reserved_special_token_105|>",
887
+ "lstrip": false,
888
+ "normalized": false,
889
+ "rstrip": false,
890
+ "single_word": false,
891
+ "special": true
892
+ },
893
+ "128111": {
894
+ "content": "<|reserved_special_token_106|>",
895
+ "lstrip": false,
896
+ "normalized": false,
897
+ "rstrip": false,
898
+ "single_word": false,
899
+ "special": true
900
+ },
901
+ "128112": {
902
+ "content": "<|reserved_special_token_107|>",
903
+ "lstrip": false,
904
+ "normalized": false,
905
+ "rstrip": false,
906
+ "single_word": false,
907
+ "special": true
908
+ },
909
+ "128113": {
910
+ "content": "<|reserved_special_token_108|>",
911
+ "lstrip": false,
912
+ "normalized": false,
913
+ "rstrip": false,
914
+ "single_word": false,
915
+ "special": true
916
+ },
917
+ "128114": {
918
+ "content": "<|reserved_special_token_109|>",
919
+ "lstrip": false,
920
+ "normalized": false,
921
+ "rstrip": false,
922
+ "single_word": false,
923
+ "special": true
924
+ },
925
+ "128115": {
926
+ "content": "<|reserved_special_token_110|>",
927
+ "lstrip": false,
928
+ "normalized": false,
929
+ "rstrip": false,
930
+ "single_word": false,
931
+ "special": true
932
+ },
933
+ "128116": {
934
+ "content": "<|reserved_special_token_111|>",
935
+ "lstrip": false,
936
+ "normalized": false,
937
+ "rstrip": false,
938
+ "single_word": false,
939
+ "special": true
940
+ },
941
+ "128117": {
942
+ "content": "<|reserved_special_token_112|>",
943
+ "lstrip": false,
944
+ "normalized": false,
945
+ "rstrip": false,
946
+ "single_word": false,
947
+ "special": true
948
+ },
949
+ "128118": {
950
+ "content": "<|reserved_special_token_113|>",
951
+ "lstrip": false,
952
+ "normalized": false,
953
+ "rstrip": false,
954
+ "single_word": false,
955
+ "special": true
956
+ },
957
+ "128119": {
958
+ "content": "<|reserved_special_token_114|>",
959
+ "lstrip": false,
960
+ "normalized": false,
961
+ "rstrip": false,
962
+ "single_word": false,
963
+ "special": true
964
+ },
965
+ "128120": {
966
+ "content": "<|reserved_special_token_115|>",
967
+ "lstrip": false,
968
+ "normalized": false,
969
+ "rstrip": false,
970
+ "single_word": false,
971
+ "special": true
972
+ },
973
+ "128121": {
974
+ "content": "<|reserved_special_token_116|>",
975
+ "lstrip": false,
976
+ "normalized": false,
977
+ "rstrip": false,
978
+ "single_word": false,
979
+ "special": true
980
+ },
981
+ "128122": {
982
+ "content": "<|reserved_special_token_117|>",
983
+ "lstrip": false,
984
+ "normalized": false,
985
+ "rstrip": false,
986
+ "single_word": false,
987
+ "special": true
988
+ },
989
+ "128123": {
990
+ "content": "<|reserved_special_token_118|>",
991
+ "lstrip": false,
992
+ "normalized": false,
993
+ "rstrip": false,
994
+ "single_word": false,
995
+ "special": true
996
+ },
997
+ "128124": {
998
+ "content": "<|reserved_special_token_119|>",
999
+ "lstrip": false,
1000
+ "normalized": false,
1001
+ "rstrip": false,
1002
+ "single_word": false,
1003
+ "special": true
1004
+ },
1005
+ "128125": {
1006
+ "content": "<|reserved_special_token_120|>",
1007
+ "lstrip": false,
1008
+ "normalized": false,
1009
+ "rstrip": false,
1010
+ "single_word": false,
1011
+ "special": true
1012
+ },
1013
+ "128126": {
1014
+ "content": "<|reserved_special_token_121|>",
1015
+ "lstrip": false,
1016
+ "normalized": false,
1017
+ "rstrip": false,
1018
+ "single_word": false,
1019
+ "special": true
1020
+ },
1021
+ "128127": {
1022
+ "content": "<|reserved_special_token_122|>",
1023
+ "lstrip": false,
1024
+ "normalized": false,
1025
+ "rstrip": false,
1026
+ "single_word": false,
1027
+ "special": true
1028
+ },
1029
+ "128128": {
1030
+ "content": "<|reserved_special_token_123|>",
1031
+ "lstrip": false,
1032
+ "normalized": false,
1033
+ "rstrip": false,
1034
+ "single_word": false,
1035
+ "special": true
1036
+ },
1037
+ "128129": {
1038
+ "content": "<|reserved_special_token_124|>",
1039
+ "lstrip": false,
1040
+ "normalized": false,
1041
+ "rstrip": false,
1042
+ "single_word": false,
1043
+ "special": true
1044
+ },
1045
+ "128130": {
1046
+ "content": "<|reserved_special_token_125|>",
1047
+ "lstrip": false,
1048
+ "normalized": false,
1049
+ "rstrip": false,
1050
+ "single_word": false,
1051
+ "special": true
1052
+ },
1053
+ "128131": {
1054
+ "content": "<|reserved_special_token_126|>",
1055
+ "lstrip": false,
1056
+ "normalized": false,
1057
+ "rstrip": false,
1058
+ "single_word": false,
1059
+ "special": true
1060
+ },
1061
+ "128132": {
1062
+ "content": "<|reserved_special_token_127|>",
1063
+ "lstrip": false,
1064
+ "normalized": false,
1065
+ "rstrip": false,
1066
+ "single_word": false,
1067
+ "special": true
1068
+ },
1069
+ "128133": {
1070
+ "content": "<|reserved_special_token_128|>",
1071
+ "lstrip": false,
1072
+ "normalized": false,
1073
+ "rstrip": false,
1074
+ "single_word": false,
1075
+ "special": true
1076
+ },
1077
+ "128134": {
1078
+ "content": "<|reserved_special_token_129|>",
1079
+ "lstrip": false,
1080
+ "normalized": false,
1081
+ "rstrip": false,
1082
+ "single_word": false,
1083
+ "special": true
1084
+ },
1085
+ "128135": {
1086
+ "content": "<|reserved_special_token_130|>",
1087
+ "lstrip": false,
1088
+ "normalized": false,
1089
+ "rstrip": false,
1090
+ "single_word": false,
1091
+ "special": true
1092
+ },
1093
+ "128136": {
1094
+ "content": "<|reserved_special_token_131|>",
1095
+ "lstrip": false,
1096
+ "normalized": false,
1097
+ "rstrip": false,
1098
+ "single_word": false,
1099
+ "special": true
1100
+ },
1101
+ "128137": {
1102
+ "content": "<|reserved_special_token_132|>",
1103
+ "lstrip": false,
1104
+ "normalized": false,
1105
+ "rstrip": false,
1106
+ "single_word": false,
1107
+ "special": true
1108
+ },
1109
+ "128138": {
1110
+ "content": "<|reserved_special_token_133|>",
1111
+ "lstrip": false,
1112
+ "normalized": false,
1113
+ "rstrip": false,
1114
+ "single_word": false,
1115
+ "special": true
1116
+ },
1117
+ "128139": {
1118
+ "content": "<|reserved_special_token_134|>",
1119
+ "lstrip": false,
1120
+ "normalized": false,
1121
+ "rstrip": false,
1122
+ "single_word": false,
1123
+ "special": true
1124
+ },
1125
+ "128140": {
1126
+ "content": "<|reserved_special_token_135|>",
1127
+ "lstrip": false,
1128
+ "normalized": false,
1129
+ "rstrip": false,
1130
+ "single_word": false,
1131
+ "special": true
1132
+ },
1133
+ "128141": {
1134
+ "content": "<|reserved_special_token_136|>",
1135
+ "lstrip": false,
1136
+ "normalized": false,
1137
+ "rstrip": false,
1138
+ "single_word": false,
1139
+ "special": true
1140
+ },
1141
+ "128142": {
1142
+ "content": "<|reserved_special_token_137|>",
1143
+ "lstrip": false,
1144
+ "normalized": false,
1145
+ "rstrip": false,
1146
+ "single_word": false,
1147
+ "special": true
1148
+ },
1149
+ "128143": {
1150
+ "content": "<|reserved_special_token_138|>",
1151
+ "lstrip": false,
1152
+ "normalized": false,
1153
+ "rstrip": false,
1154
+ "single_word": false,
1155
+ "special": true
1156
+ },
1157
+ "128144": {
1158
+ "content": "<|reserved_special_token_139|>",
1159
+ "lstrip": false,
1160
+ "normalized": false,
1161
+ "rstrip": false,
1162
+ "single_word": false,
1163
+ "special": true
1164
+ },
1165
+ "128145": {
1166
+ "content": "<|reserved_special_token_140|>",
1167
+ "lstrip": false,
1168
+ "normalized": false,
1169
+ "rstrip": false,
1170
+ "single_word": false,
1171
+ "special": true
1172
+ },
1173
+ "128146": {
1174
+ "content": "<|reserved_special_token_141|>",
1175
+ "lstrip": false,
1176
+ "normalized": false,
1177
+ "rstrip": false,
1178
+ "single_word": false,
1179
+ "special": true
1180
+ },
1181
+ "128147": {
1182
+ "content": "<|reserved_special_token_142|>",
1183
+ "lstrip": false,
1184
+ "normalized": false,
1185
+ "rstrip": false,
1186
+ "single_word": false,
1187
+ "special": true
1188
+ },
1189
+ "128148": {
1190
+ "content": "<|reserved_special_token_143|>",
1191
+ "lstrip": false,
1192
+ "normalized": false,
1193
+ "rstrip": false,
1194
+ "single_word": false,
1195
+ "special": true
1196
+ },
1197
+ "128149": {
1198
+ "content": "<|reserved_special_token_144|>",
1199
+ "lstrip": false,
1200
+ "normalized": false,
1201
+ "rstrip": false,
1202
+ "single_word": false,
1203
+ "special": true
1204
+ },
1205
+ "128150": {
1206
+ "content": "<|reserved_special_token_145|>",
1207
+ "lstrip": false,
1208
+ "normalized": false,
1209
+ "rstrip": false,
1210
+ "single_word": false,
1211
+ "special": true
1212
+ },
1213
+ "128151": {
1214
+ "content": "<|reserved_special_token_146|>",
1215
+ "lstrip": false,
1216
+ "normalized": false,
1217
+ "rstrip": false,
1218
+ "single_word": false,
1219
+ "special": true
1220
+ },
1221
+ "128152": {
1222
+ "content": "<|reserved_special_token_147|>",
1223
+ "lstrip": false,
1224
+ "normalized": false,
1225
+ "rstrip": false,
1226
+ "single_word": false,
1227
+ "special": true
1228
+ },
1229
+ "128153": {
1230
+ "content": "<|reserved_special_token_148|>",
1231
+ "lstrip": false,
1232
+ "normalized": false,
1233
+ "rstrip": false,
1234
+ "single_word": false,
1235
+ "special": true
1236
+ },
1237
+ "128154": {
1238
+ "content": "<|reserved_special_token_149|>",
1239
+ "lstrip": false,
1240
+ "normalized": false,
1241
+ "rstrip": false,
1242
+ "single_word": false,
1243
+ "special": true
1244
+ },
1245
+ "128155": {
1246
+ "content": "<|reserved_special_token_150|>",
1247
+ "lstrip": false,
1248
+ "normalized": false,
1249
+ "rstrip": false,
1250
+ "single_word": false,
1251
+ "special": true
1252
+ },
1253
+ "128156": {
1254
+ "content": "<|reserved_special_token_151|>",
1255
+ "lstrip": false,
1256
+ "normalized": false,
1257
+ "rstrip": false,
1258
+ "single_word": false,
1259
+ "special": true
1260
+ },
1261
+ "128157": {
1262
+ "content": "<|reserved_special_token_152|>",
1263
+ "lstrip": false,
1264
+ "normalized": false,
1265
+ "rstrip": false,
1266
+ "single_word": false,
1267
+ "special": true
1268
+ },
1269
+ "128158": {
1270
+ "content": "<|reserved_special_token_153|>",
1271
+ "lstrip": false,
1272
+ "normalized": false,
1273
+ "rstrip": false,
1274
+ "single_word": false,
1275
+ "special": true
1276
+ },
1277
+ "128159": {
1278
+ "content": "<|reserved_special_token_154|>",
1279
+ "lstrip": false,
1280
+ "normalized": false,
1281
+ "rstrip": false,
1282
+ "single_word": false,
1283
+ "special": true
1284
+ },
1285
+ "128160": {
1286
+ "content": "<|reserved_special_token_155|>",
1287
+ "lstrip": false,
1288
+ "normalized": false,
1289
+ "rstrip": false,
1290
+ "single_word": false,
1291
+ "special": true
1292
+ },
1293
+ "128161": {
1294
+ "content": "<|reserved_special_token_156|>",
1295
+ "lstrip": false,
1296
+ "normalized": false,
1297
+ "rstrip": false,
1298
+ "single_word": false,
1299
+ "special": true
1300
+ },
1301
+ "128162": {
1302
+ "content": "<|reserved_special_token_157|>",
1303
+ "lstrip": false,
1304
+ "normalized": false,
1305
+ "rstrip": false,
1306
+ "single_word": false,
1307
+ "special": true
1308
+ },
1309
+ "128163": {
1310
+ "content": "<|reserved_special_token_158|>",
1311
+ "lstrip": false,
1312
+ "normalized": false,
1313
+ "rstrip": false,
1314
+ "single_word": false,
1315
+ "special": true
1316
+ },
1317
+ "128164": {
1318
+ "content": "<|reserved_special_token_159|>",
1319
+ "lstrip": false,
1320
+ "normalized": false,
1321
+ "rstrip": false,
1322
+ "single_word": false,
1323
+ "special": true
1324
+ },
1325
+ "128165": {
1326
+ "content": "<|reserved_special_token_160|>",
1327
+ "lstrip": false,
1328
+ "normalized": false,
1329
+ "rstrip": false,
1330
+ "single_word": false,
1331
+ "special": true
1332
+ },
1333
+ "128166": {
1334
+ "content": "<|reserved_special_token_161|>",
1335
+ "lstrip": false,
1336
+ "normalized": false,
1337
+ "rstrip": false,
1338
+ "single_word": false,
1339
+ "special": true
1340
+ },
1341
+ "128167": {
1342
+ "content": "<|reserved_special_token_162|>",
1343
+ "lstrip": false,
1344
+ "normalized": false,
1345
+ "rstrip": false,
1346
+ "single_word": false,
1347
+ "special": true
1348
+ },
1349
+ "128168": {
1350
+ "content": "<|reserved_special_token_163|>",
1351
+ "lstrip": false,
1352
+ "normalized": false,
1353
+ "rstrip": false,
1354
+ "single_word": false,
1355
+ "special": true
1356
+ },
1357
+ "128169": {
1358
+ "content": "<|reserved_special_token_164|>",
1359
+ "lstrip": false,
1360
+ "normalized": false,
1361
+ "rstrip": false,
1362
+ "single_word": false,
1363
+ "special": true
1364
+ },
1365
+ "128170": {
1366
+ "content": "<|reserved_special_token_165|>",
1367
+ "lstrip": false,
1368
+ "normalized": false,
1369
+ "rstrip": false,
1370
+ "single_word": false,
1371
+ "special": true
1372
+ },
1373
+ "128171": {
1374
+ "content": "<|reserved_special_token_166|>",
1375
+ "lstrip": false,
1376
+ "normalized": false,
1377
+ "rstrip": false,
1378
+ "single_word": false,
1379
+ "special": true
1380
+ },
1381
+ "128172": {
1382
+ "content": "<|reserved_special_token_167|>",
1383
+ "lstrip": false,
1384
+ "normalized": false,
1385
+ "rstrip": false,
1386
+ "single_word": false,
1387
+ "special": true
1388
+ },
1389
+ "128173": {
1390
+ "content": "<|reserved_special_token_168|>",
1391
+ "lstrip": false,
1392
+ "normalized": false,
1393
+ "rstrip": false,
1394
+ "single_word": false,
1395
+ "special": true
1396
+ },
1397
+ "128174": {
1398
+ "content": "<|reserved_special_token_169|>",
1399
+ "lstrip": false,
1400
+ "normalized": false,
1401
+ "rstrip": false,
1402
+ "single_word": false,
1403
+ "special": true
1404
+ },
1405
+ "128175": {
1406
+ "content": "<|reserved_special_token_170|>",
1407
+ "lstrip": false,
1408
+ "normalized": false,
1409
+ "rstrip": false,
1410
+ "single_word": false,
1411
+ "special": true
1412
+ },
1413
+ "128176": {
1414
+ "content": "<|reserved_special_token_171|>",
1415
+ "lstrip": false,
1416
+ "normalized": false,
1417
+ "rstrip": false,
1418
+ "single_word": false,
1419
+ "special": true
1420
+ },
1421
+ "128177": {
1422
+ "content": "<|reserved_special_token_172|>",
1423
+ "lstrip": false,
1424
+ "normalized": false,
1425
+ "rstrip": false,
1426
+ "single_word": false,
1427
+ "special": true
1428
+ },
1429
+ "128178": {
1430
+ "content": "<|reserved_special_token_173|>",
1431
+ "lstrip": false,
1432
+ "normalized": false,
1433
+ "rstrip": false,
1434
+ "single_word": false,
1435
+ "special": true
1436
+ },
1437
+ "128179": {
1438
+ "content": "<|reserved_special_token_174|>",
1439
+ "lstrip": false,
1440
+ "normalized": false,
1441
+ "rstrip": false,
1442
+ "single_word": false,
1443
+ "special": true
1444
+ },
1445
+ "128180": {
1446
+ "content": "<|reserved_special_token_175|>",
1447
+ "lstrip": false,
1448
+ "normalized": false,
1449
+ "rstrip": false,
1450
+ "single_word": false,
1451
+ "special": true
1452
+ },
1453
+ "128181": {
1454
+ "content": "<|reserved_special_token_176|>",
1455
+ "lstrip": false,
1456
+ "normalized": false,
1457
+ "rstrip": false,
1458
+ "single_word": false,
1459
+ "special": true
1460
+ },
1461
+ "128182": {
1462
+ "content": "<|reserved_special_token_177|>",
1463
+ "lstrip": false,
1464
+ "normalized": false,
1465
+ "rstrip": false,
1466
+ "single_word": false,
1467
+ "special": true
1468
+ },
1469
+ "128183": {
1470
+ "content": "<|reserved_special_token_178|>",
1471
+ "lstrip": false,
1472
+ "normalized": false,
1473
+ "rstrip": false,
1474
+ "single_word": false,
1475
+ "special": true
1476
+ },
1477
+ "128184": {
1478
+ "content": "<|reserved_special_token_179|>",
1479
+ "lstrip": false,
1480
+ "normalized": false,
1481
+ "rstrip": false,
1482
+ "single_word": false,
1483
+ "special": true
1484
+ },
1485
+ "128185": {
1486
+ "content": "<|reserved_special_token_180|>",
1487
+ "lstrip": false,
1488
+ "normalized": false,
1489
+ "rstrip": false,
1490
+ "single_word": false,
1491
+ "special": true
1492
+ },
1493
+ "128186": {
1494
+ "content": "<|reserved_special_token_181|>",
1495
+ "lstrip": false,
1496
+ "normalized": false,
1497
+ "rstrip": false,
1498
+ "single_word": false,
1499
+ "special": true
1500
+ },
1501
+ "128187": {
1502
+ "content": "<|reserved_special_token_182|>",
1503
+ "lstrip": false,
1504
+ "normalized": false,
1505
+ "rstrip": false,
1506
+ "single_word": false,
1507
+ "special": true
1508
+ },
1509
+ "128188": {
1510
+ "content": "<|reserved_special_token_183|>",
1511
+ "lstrip": false,
1512
+ "normalized": false,
1513
+ "rstrip": false,
1514
+ "single_word": false,
1515
+ "special": true
1516
+ },
1517
+ "128189": {
1518
+ "content": "<|reserved_special_token_184|>",
1519
+ "lstrip": false,
1520
+ "normalized": false,
1521
+ "rstrip": false,
1522
+ "single_word": false,
1523
+ "special": true
1524
+ },
1525
+ "128190": {
1526
+ "content": "<|reserved_special_token_185|>",
1527
+ "lstrip": false,
1528
+ "normalized": false,
1529
+ "rstrip": false,
1530
+ "single_word": false,
1531
+ "special": true
1532
+ },
1533
+ "128191": {
1534
+ "content": "<|reserved_special_token_186|>",
1535
+ "lstrip": false,
1536
+ "normalized": false,
1537
+ "rstrip": false,
1538
+ "single_word": false,
1539
+ "special": true
1540
+ },
1541
+ "128192": {
1542
+ "content": "<|reserved_special_token_187|>",
1543
+ "lstrip": false,
1544
+ "normalized": false,
1545
+ "rstrip": false,
1546
+ "single_word": false,
1547
+ "special": true
1548
+ },
1549
+ "128193": {
1550
+ "content": "<|reserved_special_token_188|>",
1551
+ "lstrip": false,
1552
+ "normalized": false,
1553
+ "rstrip": false,
1554
+ "single_word": false,
1555
+ "special": true
1556
+ },
1557
+ "128194": {
1558
+ "content": "<|reserved_special_token_189|>",
1559
+ "lstrip": false,
1560
+ "normalized": false,
1561
+ "rstrip": false,
1562
+ "single_word": false,
1563
+ "special": true
1564
+ },
1565
+ "128195": {
1566
+ "content": "<|reserved_special_token_190|>",
1567
+ "lstrip": false,
1568
+ "normalized": false,
1569
+ "rstrip": false,
1570
+ "single_word": false,
1571
+ "special": true
1572
+ },
1573
+ "128196": {
1574
+ "content": "<|reserved_special_token_191|>",
1575
+ "lstrip": false,
1576
+ "normalized": false,
1577
+ "rstrip": false,
1578
+ "single_word": false,
1579
+ "special": true
1580
+ },
1581
+ "128197": {
1582
+ "content": "<|reserved_special_token_192|>",
1583
+ "lstrip": false,
1584
+ "normalized": false,
1585
+ "rstrip": false,
1586
+ "single_word": false,
1587
+ "special": true
1588
+ },
1589
+ "128198": {
1590
+ "content": "<|reserved_special_token_193|>",
1591
+ "lstrip": false,
1592
+ "normalized": false,
1593
+ "rstrip": false,
1594
+ "single_word": false,
1595
+ "special": true
1596
+ },
1597
+ "128199": {
1598
+ "content": "<|reserved_special_token_194|>",
1599
+ "lstrip": false,
1600
+ "normalized": false,
1601
+ "rstrip": false,
1602
+ "single_word": false,
1603
+ "special": true
1604
+ },
1605
+ "128200": {
1606
+ "content": "<|reserved_special_token_195|>",
1607
+ "lstrip": false,
1608
+ "normalized": false,
1609
+ "rstrip": false,
1610
+ "single_word": false,
1611
+ "special": true
1612
+ },
1613
+ "128201": {
1614
+ "content": "<|reserved_special_token_196|>",
1615
+ "lstrip": false,
1616
+ "normalized": false,
1617
+ "rstrip": false,
1618
+ "single_word": false,
1619
+ "special": true
1620
+ },
1621
+ "128202": {
1622
+ "content": "<|reserved_special_token_197|>",
1623
+ "lstrip": false,
1624
+ "normalized": false,
1625
+ "rstrip": false,
1626
+ "single_word": false,
1627
+ "special": true
1628
+ },
1629
+ "128203": {
1630
+ "content": "<|reserved_special_token_198|>",
1631
+ "lstrip": false,
1632
+ "normalized": false,
1633
+ "rstrip": false,
1634
+ "single_word": false,
1635
+ "special": true
1636
+ },
1637
+ "128204": {
1638
+ "content": "<|reserved_special_token_199|>",
1639
+ "lstrip": false,
1640
+ "normalized": false,
1641
+ "rstrip": false,
1642
+ "single_word": false,
1643
+ "special": true
1644
+ },
1645
+ "128205": {
1646
+ "content": "<|reserved_special_token_200|>",
1647
+ "lstrip": false,
1648
+ "normalized": false,
1649
+ "rstrip": false,
1650
+ "single_word": false,
1651
+ "special": true
1652
+ },
1653
+ "128206": {
1654
+ "content": "<|reserved_special_token_201|>",
1655
+ "lstrip": false,
1656
+ "normalized": false,
1657
+ "rstrip": false,
1658
+ "single_word": false,
1659
+ "special": true
1660
+ },
1661
+ "128207": {
1662
+ "content": "<|reserved_special_token_202|>",
1663
+ "lstrip": false,
1664
+ "normalized": false,
1665
+ "rstrip": false,
1666
+ "single_word": false,
1667
+ "special": true
1668
+ },
1669
+ "128208": {
1670
+ "content": "<|reserved_special_token_203|>",
1671
+ "lstrip": false,
1672
+ "normalized": false,
1673
+ "rstrip": false,
1674
+ "single_word": false,
1675
+ "special": true
1676
+ },
1677
+ "128209": {
1678
+ "content": "<|reserved_special_token_204|>",
1679
+ "lstrip": false,
1680
+ "normalized": false,
1681
+ "rstrip": false,
1682
+ "single_word": false,
1683
+ "special": true
1684
+ },
1685
+ "128210": {
1686
+ "content": "<|reserved_special_token_205|>",
1687
+ "lstrip": false,
1688
+ "normalized": false,
1689
+ "rstrip": false,
1690
+ "single_word": false,
1691
+ "special": true
1692
+ },
1693
+ "128211": {
1694
+ "content": "<|reserved_special_token_206|>",
1695
+ "lstrip": false,
1696
+ "normalized": false,
1697
+ "rstrip": false,
1698
+ "single_word": false,
1699
+ "special": true
1700
+ },
1701
+ "128212": {
1702
+ "content": "<|reserved_special_token_207|>",
1703
+ "lstrip": false,
1704
+ "normalized": false,
1705
+ "rstrip": false,
1706
+ "single_word": false,
1707
+ "special": true
1708
+ },
1709
+ "128213": {
1710
+ "content": "<|reserved_special_token_208|>",
1711
+ "lstrip": false,
1712
+ "normalized": false,
1713
+ "rstrip": false,
1714
+ "single_word": false,
1715
+ "special": true
1716
+ },
1717
+ "128214": {
1718
+ "content": "<|reserved_special_token_209|>",
1719
+ "lstrip": false,
1720
+ "normalized": false,
1721
+ "rstrip": false,
1722
+ "single_word": false,
1723
+ "special": true
1724
+ },
1725
+ "128215": {
1726
+ "content": "<|reserved_special_token_210|>",
1727
+ "lstrip": false,
1728
+ "normalized": false,
1729
+ "rstrip": false,
1730
+ "single_word": false,
1731
+ "special": true
1732
+ },
1733
+ "128216": {
1734
+ "content": "<|reserved_special_token_211|>",
1735
+ "lstrip": false,
1736
+ "normalized": false,
1737
+ "rstrip": false,
1738
+ "single_word": false,
1739
+ "special": true
1740
+ },
1741
+ "128217": {
1742
+ "content": "<|reserved_special_token_212|>",
1743
+ "lstrip": false,
1744
+ "normalized": false,
1745
+ "rstrip": false,
1746
+ "single_word": false,
1747
+ "special": true
1748
+ },
1749
+ "128218": {
1750
+ "content": "<|reserved_special_token_213|>",
1751
+ "lstrip": false,
1752
+ "normalized": false,
1753
+ "rstrip": false,
1754
+ "single_word": false,
1755
+ "special": true
1756
+ },
1757
+ "128219": {
1758
+ "content": "<|reserved_special_token_214|>",
1759
+ "lstrip": false,
1760
+ "normalized": false,
1761
+ "rstrip": false,
1762
+ "single_word": false,
1763
+ "special": true
1764
+ },
1765
+ "128220": {
1766
+ "content": "<|reserved_special_token_215|>",
1767
+ "lstrip": false,
1768
+ "normalized": false,
1769
+ "rstrip": false,
1770
+ "single_word": false,
1771
+ "special": true
1772
+ },
1773
+ "128221": {
1774
+ "content": "<|reserved_special_token_216|>",
1775
+ "lstrip": false,
1776
+ "normalized": false,
1777
+ "rstrip": false,
1778
+ "single_word": false,
1779
+ "special": true
1780
+ },
1781
+ "128222": {
1782
+ "content": "<|reserved_special_token_217|>",
1783
+ "lstrip": false,
1784
+ "normalized": false,
1785
+ "rstrip": false,
1786
+ "single_word": false,
1787
+ "special": true
1788
+ },
1789
+ "128223": {
1790
+ "content": "<|reserved_special_token_218|>",
1791
+ "lstrip": false,
1792
+ "normalized": false,
1793
+ "rstrip": false,
1794
+ "single_word": false,
1795
+ "special": true
1796
+ },
1797
+ "128224": {
1798
+ "content": "<|reserved_special_token_219|>",
1799
+ "lstrip": false,
1800
+ "normalized": false,
1801
+ "rstrip": false,
1802
+ "single_word": false,
1803
+ "special": true
1804
+ },
1805
+ "128225": {
1806
+ "content": "<|reserved_special_token_220|>",
1807
+ "lstrip": false,
1808
+ "normalized": false,
1809
+ "rstrip": false,
1810
+ "single_word": false,
1811
+ "special": true
1812
+ },
1813
+ "128226": {
1814
+ "content": "<|reserved_special_token_221|>",
1815
+ "lstrip": false,
1816
+ "normalized": false,
1817
+ "rstrip": false,
1818
+ "single_word": false,
1819
+ "special": true
1820
+ },
1821
+ "128227": {
1822
+ "content": "<|reserved_special_token_222|>",
1823
+ "lstrip": false,
1824
+ "normalized": false,
1825
+ "rstrip": false,
1826
+ "single_word": false,
1827
+ "special": true
1828
+ },
1829
+ "128228": {
1830
+ "content": "<|reserved_special_token_223|>",
1831
+ "lstrip": false,
1832
+ "normalized": false,
1833
+ "rstrip": false,
1834
+ "single_word": false,
1835
+ "special": true
1836
+ },
1837
+ "128229": {
1838
+ "content": "<|reserved_special_token_224|>",
1839
+ "lstrip": false,
1840
+ "normalized": false,
1841
+ "rstrip": false,
1842
+ "single_word": false,
1843
+ "special": true
1844
+ },
1845
+ "128230": {
1846
+ "content": "<|reserved_special_token_225|>",
1847
+ "lstrip": false,
1848
+ "normalized": false,
1849
+ "rstrip": false,
1850
+ "single_word": false,
1851
+ "special": true
1852
+ },
1853
+ "128231": {
1854
+ "content": "<|reserved_special_token_226|>",
1855
+ "lstrip": false,
1856
+ "normalized": false,
1857
+ "rstrip": false,
1858
+ "single_word": false,
1859
+ "special": true
1860
+ },
1861
+ "128232": {
1862
+ "content": "<|reserved_special_token_227|>",
1863
+ "lstrip": false,
1864
+ "normalized": false,
1865
+ "rstrip": false,
1866
+ "single_word": false,
1867
+ "special": true
1868
+ },
1869
+ "128233": {
1870
+ "content": "<|reserved_special_token_228|>",
1871
+ "lstrip": false,
1872
+ "normalized": false,
1873
+ "rstrip": false,
1874
+ "single_word": false,
1875
+ "special": true
1876
+ },
1877
+ "128234": {
1878
+ "content": "<|reserved_special_token_229|>",
1879
+ "lstrip": false,
1880
+ "normalized": false,
1881
+ "rstrip": false,
1882
+ "single_word": false,
1883
+ "special": true
1884
+ },
1885
+ "128235": {
1886
+ "content": "<|reserved_special_token_230|>",
1887
+ "lstrip": false,
1888
+ "normalized": false,
1889
+ "rstrip": false,
1890
+ "single_word": false,
1891
+ "special": true
1892
+ },
1893
+ "128236": {
1894
+ "content": "<|reserved_special_token_231|>",
1895
+ "lstrip": false,
1896
+ "normalized": false,
1897
+ "rstrip": false,
1898
+ "single_word": false,
1899
+ "special": true
1900
+ },
1901
+ "128237": {
1902
+ "content": "<|reserved_special_token_232|>",
1903
+ "lstrip": false,
1904
+ "normalized": false,
1905
+ "rstrip": false,
1906
+ "single_word": false,
1907
+ "special": true
1908
+ },
1909
+ "128238": {
1910
+ "content": "<|reserved_special_token_233|>",
1911
+ "lstrip": false,
1912
+ "normalized": false,
1913
+ "rstrip": false,
1914
+ "single_word": false,
1915
+ "special": true
1916
+ },
1917
+ "128239": {
1918
+ "content": "<|reserved_special_token_234|>",
1919
+ "lstrip": false,
1920
+ "normalized": false,
1921
+ "rstrip": false,
1922
+ "single_word": false,
1923
+ "special": true
1924
+ },
1925
+ "128240": {
1926
+ "content": "<|reserved_special_token_235|>",
1927
+ "lstrip": false,
1928
+ "normalized": false,
1929
+ "rstrip": false,
1930
+ "single_word": false,
1931
+ "special": true
1932
+ },
1933
+ "128241": {
1934
+ "content": "<|reserved_special_token_236|>",
1935
+ "lstrip": false,
1936
+ "normalized": false,
1937
+ "rstrip": false,
1938
+ "single_word": false,
1939
+ "special": true
1940
+ },
1941
+ "128242": {
1942
+ "content": "<|reserved_special_token_237|>",
1943
+ "lstrip": false,
1944
+ "normalized": false,
1945
+ "rstrip": false,
1946
+ "single_word": false,
1947
+ "special": true
1948
+ },
1949
+ "128243": {
1950
+ "content": "<|reserved_special_token_238|>",
1951
+ "lstrip": false,
1952
+ "normalized": false,
1953
+ "rstrip": false,
1954
+ "single_word": false,
1955
+ "special": true
1956
+ },
1957
+ "128244": {
1958
+ "content": "<|reserved_special_token_239|>",
1959
+ "lstrip": false,
1960
+ "normalized": false,
1961
+ "rstrip": false,
1962
+ "single_word": false,
1963
+ "special": true
1964
+ },
1965
+ "128245": {
1966
+ "content": "<|reserved_special_token_240|>",
1967
+ "lstrip": false,
1968
+ "normalized": false,
1969
+ "rstrip": false,
1970
+ "single_word": false,
1971
+ "special": true
1972
+ },
1973
+ "128246": {
1974
+ "content": "<|reserved_special_token_241|>",
1975
+ "lstrip": false,
1976
+ "normalized": false,
1977
+ "rstrip": false,
1978
+ "single_word": false,
1979
+ "special": true
1980
+ },
1981
+ "128247": {
1982
+ "content": "<|reserved_special_token_242|>",
1983
+ "lstrip": false,
1984
+ "normalized": false,
1985
+ "rstrip": false,
1986
+ "single_word": false,
1987
+ "special": true
1988
+ },
1989
+ "128248": {
1990
+ "content": "<|reserved_special_token_243|>",
1991
+ "lstrip": false,
1992
+ "normalized": false,
1993
+ "rstrip": false,
1994
+ "single_word": false,
1995
+ "special": true
1996
+ },
1997
+ "128249": {
1998
+ "content": "<|reserved_special_token_244|>",
1999
+ "lstrip": false,
2000
+ "normalized": false,
2001
+ "rstrip": false,
2002
+ "single_word": false,
2003
+ "special": true
2004
+ },
2005
+ "128250": {
2006
+ "content": "<|reserved_special_token_245|>",
2007
+ "lstrip": false,
2008
+ "normalized": false,
2009
+ "rstrip": false,
2010
+ "single_word": false,
2011
+ "special": true
2012
+ },
2013
+ "128251": {
2014
+ "content": "<|reserved_special_token_246|>",
2015
+ "lstrip": false,
2016
+ "normalized": false,
2017
+ "rstrip": false,
2018
+ "single_word": false,
2019
+ "special": true
2020
+ },
2021
+ "128252": {
2022
+ "content": "<|reserved_special_token_247|>",
2023
+ "lstrip": false,
2024
+ "normalized": false,
2025
+ "rstrip": false,
2026
+ "single_word": false,
2027
+ "special": true
2028
+ },
2029
+ "128253": {
2030
+ "content": "<|reserved_special_token_248|>",
2031
+ "lstrip": false,
2032
+ "normalized": false,
2033
+ "rstrip": false,
2034
+ "single_word": false,
2035
+ "special": true
2036
+ },
2037
+ "128254": {
2038
+ "content": "<|reserved_special_token_249|>",
2039
+ "lstrip": false,
2040
+ "normalized": false,
2041
+ "rstrip": false,
2042
+ "single_word": false,
2043
+ "special": true
2044
+ },
2045
+ "128255": {
2046
+ "content": "<|reserved_special_token_250|>",
2047
+ "lstrip": false,
2048
+ "normalized": false,
2049
+ "rstrip": false,
2050
+ "single_word": false,
2051
+ "special": true
2052
+ }
+ },
+ "bos_token": "<|bos|>",
+ "chat_template": "{{ bos_token }}{% for message in messages %}\n{% set role = message['role'] %}\n{% if role == 'tools' %}\n{% set role = 'tool:schemas' %}\n{% elif role == 'execute' %}\n{% set role = 'tool:execute' %}\n{% elif role == 'response' %}\n{% set role = 'tool:results' %}\n{% endif %}\n{% set content = message['content'] | trim + '<|cos|>' %}\n{% if role == 'system' %}\n{% set content = '<|role:begin|>system<|role:end|>\n' + content %}\n{% elif role == 'user' %}\n{% set content = '<|role:begin|>user<|role:end|>\n' + content %}\n{% elif role == 'assistant' %}\n{% set content = '<|role:begin|>assistant<|role:end|>\n' + content %}\n{% elif role == 'tool:schemas' %}\n{% set content = '<|role:begin|>tools<|role:end|>\n' + content %}\n{% elif role == 'tool:execute' %}\n{% set content = '<|role:begin|>assistant<|role:end|>\n<|tool:execute|>' + content %}\n{% elif role == 'tool:results' %}\n{% set content = '<|role:begin|>user<|role:end|>\n<|tool:results|>' + content %}\n{% endif %}\n{{ content }}\n{% if loop.last and add_generation_prompt %}\n{{ '<|role:begin|>assistant<|role:end|>' }}\n{% endif %}\n{% endfor %}",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|eos|>",
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": "<|pad|>",
+ "tokenizer_class": "PreTrainedTokenizerFast",
+ "unk_token": "<|unk|>"
+ }
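The `chat_template` above is where this tokenizer config does its real work: it wraps every turn in `<|role:begin|>…<|role:end|>` headers, terminates each message with `<|cos|>`, and remaps the tool-calling roles (`tools` → `tool:schemas`, `execute` → `tool:execute`, `response` → `tool:results`) with the `<|tool:execute|>` / `<|tool:results|>` prefixes. The sketch below is a minimal illustration of how this config could be exercised through the standard `transformers` API; the repository id `ghost-x/ghost-8b-beta` and the `get_weather` tool schema are assumptions made for the example, not part of this file.

```python
# Minimal sketch of using this tokenizer_config.json via transformers.
# Assumptions (not taken from this file): the checkpoint id "ghost-x/ghost-8b-beta"
# and the get_weather tool schema are illustrative only.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ghost-x/ghost-8b-beta")

# Special tokens declared in this config are exposed on the loaded tokenizer.
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.pad_token, tokenizer.unk_token)
# Expected per this config: <|bos|> <|eos|> <|pad|> <|unk|>

# Ids 128088-128255 map to the reserved placeholder tokens listed in added_tokens_decoder.
print(tokenizer.convert_ids_to_tokens(128088))  # <|reserved_special_token_83|>

# Plain chat: system/user/assistant turns are wrapped in <|role:begin|>...<|role:end|>
# blocks and each message is terminated with <|cos|>, as the Jinja template specifies.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Why is the sky blue?"},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)

# Tool use: "tools" carries the schemas, "execute" a tool call, "response" the tool result.
tool_messages = [
    {"role": "tools", "content": '[{"name": "get_weather", "parameters": {"city": "string"}}]'},
    {"role": "user", "content": "What is the weather in Hanoi?"},
    {"role": "execute", "content": '{"name": "get_weather", "arguments": {"city": "Hanoi"}}'},
    {"role": "response", "content": '{"temperature_c": 31}'},
]
print(tokenizer.apply_chat_template(tool_messages, tokenize=False, add_generation_prompt=True))
```

With `add_generation_prompt=True` the template appends a bare `<|role:begin|>assistant<|role:end|>` header, so the rendered string ends exactly where the model is expected to continue. Since every turn is closed with `<|cos|>`, generation code will typically treat `<|cos|>` (alongside the configured `<|eos|>`) as a stop marker; confirm this against the model card before relying on it.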