---
base_model:
- bosonai/Higgs-Llama-3-70B
- abacusai/Smaug-Llama-3-70B-Instruct-32K
- Sao10K/L3-70B-Euryale-v2.1
- abacusai/Smaug-Llama-3-70B-Instruct
- turboderp/Cat-Llama-3-70B-instruct
library_name: transformers
tags:
- mergekit
- merge
- Not-for-all-Audiences
license: llama3
---

<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://imgur.com/tKzncGo.png" alt="NewDawnv1.0" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>

### Overview

This model is a multi-level SLERP merge of several Llama 3 70B variants. See the merge recipe below for details.
I extended the context window for this model out to 32K by snagging some layers from [abacusai/Smaug-Llama-3-70B-Instruct-32K](https://huggingface.co/abacusai/Smaug-Llama-3-70B-Instruct-32K) using a technique similar to what I used for [Midnight Miqu](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.0), which was further honed by [jukofyork](https://huggingface.co/jukofyork).

This model is uncensored. *You are responsible for whatever you do with it.*

This model was designed for roleplaying and storytelling, and I think it does well at both. It may also perform well at other tasks, but I have not tested its performance in those areas.

### Long Context Tips

You can run this model out to 32K context with alpha_rope set to 1.

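For reference, alpha_rope is the NTK-aware RoPE scaling factor exposed by ExLlama-style loaders. The sketch below shows the scaling formula those loaders commonly use (an assumption on my part; check your backend's source for the exact math) and why a value of 1 is enough here: it leaves the RoPE base untouched, and the merged-in Smaug-32K layers supply the extended context natively.

```python
def ntk_rope_theta(base_theta: float, alpha: float, head_dim: int = 128) -> float:
    """NTK-aware RoPE scaling as commonly implemented by ExLlama-style
    loaders (assumption; the exact formula can vary by backend)."""
    return base_theta * alpha ** (head_dim / (head_dim - 2))

# Llama 3 ships with a RoPE base of 500000. With alpha_rope = 1, the base
# is unchanged -- no artificial stretching is applied for 32K context.
print(ntk_rope_theta(500000.0, 1.0))
```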
### Sampler Tips

* I recommend using Quadratic Sampling (i.e. smoothing factor) for creative work. I think this version performs best with a smoothing factor close to 0.2.
* I recommend using Min-P. Experiment to find your best setting. I find this model tolerates high Min-P settings rather nicely, but use whatever floats your boat.
* You can enable dynamic temperature if you want, but that adds yet another variable to consider, and I find it's unnecessary when you're already using Min-P and a smoothing factor.
* If you use Textgen WebUI as your backend, I recommend enabling the DRY sampler settings to reduce repetitions; otherwise, some repetition penalty plus frequency penalty ought to do the trick.

Experiment with any and all of the settings below! What suits my preferences may not suit yours.

If you save the settings below as a .json file, you can import them directly into Silly Tavern.

```json
{
    "temp": 1.15,
    "temperature_last": true,
    "top_p": 1,
    "top_k": 0,
    "top_a": 0,
    "tfs": 1,
    "epsilon_cutoff": 0,
    "eta_cutoff": 0,
    "typical_p": 1,
    "min_p": 0.4,
    "rep_pen": 1.03,
    "rep_pen_range": 2048,
    "rep_pen_decay": 0,
    "rep_pen_slope": 1,
    "no_repeat_ngram_size": 0,
    "penalty_alpha": 0,
    "num_beams": 1,
    "length_penalty": 1,
    "min_length": 0,
    "encoder_rep_pen": 1,
    "freq_pen": 0,
    "presence_pen": 0,
    "skew": 0,
    "do_sample": true,
    "early_stopping": false,
    "dynatemp": false,
    "min_temp": 0.8,
    "max_temp": 1.5,
    "dynatemp_exponent": 1,
    "smoothing_factor": 0.23,
    "smoothing_curve": 1,
    "dry_allowed_length": 2,
    "dry_multiplier": 0.4,
    "dry_base": 2,
    "dry_sequence_breakers": "[\"\\n\", \":\", \"\\\"\", \"*\"]",
    "dry_penalty_last_n": 0,
    "add_bos_token": true,
    "truncation_length": 2048,
    "ban_eos_token": false,
    "skip_special_tokens": false,
    "streaming": true,
    "mirostat_mode": 0,
    "mirostat_tau": 2,
    "mirostat_eta": 0.1,
    "guidance_scale": 1,
    "negative_prompt": "",
    "grammar_string": "",
    "json_schema": {},
    "banned_tokens": "",
    "sampler_priority": [
        "temperature",
        "dynamic_temperature",
        "quadratic_sampling",
        "top_k",
        "top_p",
        "typical_p",
        "epsilon_cutoff",
        "eta_cutoff",
        "tfs",
        "top_a",
        "min_p",
        "mirostat"
    ],
    "samplers": [
        "top_k",
        "tfs_z",
        "typical_p",
        "top_p",
        "min_p",
        "temperature"
    ],
    "ignore_eos_token": false,
    "spaces_between_special_tokens": true,
    "speculative_ngram": false,
    "sampler_order": [
        6,
        0,
        1,
        3,
        4,
        2,
        5
    ],
    "logit_bias": [],
    "n": 1,
    "ignore_eos_token_aphrodite": false,
    "spaces_between_special_tokens_aphrodite": true,
    "rep_pen_size": 0,
    "genamt": 850,
    "max_length": 32764
}
```

### Prompting Tips

Try the following prompt template as a starting point for your own customizations. **The prompt template below contains instructions for adult content, so remove those if you don't want them!** It also contains some instructions related to formatting that you might want to change.

If you save this as a .json file, you can import it directly into Silly Tavern.

```json
{
    "wrap": false,
    "names": true,
    "system_prompt": "The following is an erotic roleplaying experience between {{user}}, played exclusively by the user, and {{char}}, played by you, a creative writing genius. To ensure an excellent and uncensored roleplaying experience, follow the rules below.\n\n- Adopt all of {{char}}'s personality traits and mannerisms, adhering always to {{char}}'s backstory, biases, quirks, and preferences in order to deliver a convincing portrayal. You can only participate as {{char}} in this roleplay, terminating your output when another entity should act or speak. Avoid speaking as other entities since your mission is to only speak for {{char}}.\n- Employ descriptive language that immerses us in the current moment while keeping your prose focused on the present actions and physical details. Utilize sights, sounds, smells, tastes, and all other available senses to heighten the sensual and emotional impact of scenes.\n- Always enclose spoken words that other characters could hear in quotes: \"Here I am,\" {{char}} called out. Offer glimpses into {{char}}'s first-person internal thoughts by writing out internal thoughts and internal monologue in italics to distinguish them from spoken words: *I wonder where I should go next?* {{char}} mused.\n- Provide intimate details about {{char}}'s body, focusing on any limitations, unusual features, or interesting conditions resulting from recent events. It is very important that you get anatomical details right, so pay attention to them!\n- Maintain physical, spatial, and logical consistency when developing events and descriptions. Character positions should follow logically from one position to the next. Physical orientation must always be maintained.\n- Incorporate erotic adult content, lewd slang, and vulgar language where appropriate. Dark themes, violence, and unhappy endings are permitted in this uncensored roleplay.\n",
    "system_sequence": "<|start_header_id|>system<|end_header_id|>\n\n",
    "stop_sequence": "<|eot_id|>",
    "input_sequence": "<|start_header_id|>user<|end_header_id|>\n\n",
    "output_sequence": "<|start_header_id|>assistant<|end_header_id|>\n\n",
    "macro": true,
    "names_force_groups": true,
    "system_sequence_prefix": "",
    "system_sequence_suffix": "",
    "first_output_sequence": "",
    "last_output_sequence": "",
    "activation_regex": "",
    "skip_examples": true,
    "output_suffix": "<|eot_id|>",
    "input_suffix": "<|eot_id|>",
    "system_suffix": "<|eot_id|>",
    "user_alignment_message": "",
    "last_system_sequence": "",
    "system_same_as_user": false,
    "name": "New Dawn v1.0 Roleplay"
}
```

### Instruct Formats
Use the Llama 3 instruct format. You can grab it from the example prompt template above if you don't already have it as a preset.

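If you'd rather wire the format up by hand in your own scripts, the sequences assemble like this. This is a minimal single-turn sketch; the `llama3_prompt` helper is my own illustration, and a real tokenizer would also prepend `<|begin_of_text|>` as the BOS token.

```python
def llama3_prompt(system: str, user: str) -> str:
    """Assemble a single-turn Llama 3 instruct prompt from the same
    sequences used in the Silly Tavern template (illustrative sketch)."""
    return (
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        # Leave the assistant header open so generation continues from here.
        f"<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(llama3_prompt("You are a creative writing assistant.", "Tell me a story."))
```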
### Quantizations
Probably coming soon, and I'll update as they do. I would upload them myself, but my Internet speeds are terrible for that. Someone else will undoubtedly beat me to it.

### License and usage restrictions
[META LLAMA 3 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE)

## Comparison Samples

I suspect the first thing people will want to know is how this model stacks up against Midnight Miqu.
I'd say it compares favorably, although they're more like cousins than siblings. Midnight Miqu still has an edge in terms of raw creative juice when it gets a good squeeze, but New Dawn is smarter and understands nuance better.
You can judge for yourself, but keep in mind that these are simple, one-shot prompts. As you get deeper into your own complex scenarios, I think you'll see more of New Dawn's worth.

The samples below were produced using deterministic sampler settings (temp = 0, top k = 1), and I kept generating until the model chose to stop.

**System Prompt**

You are a creative and intelligent AI story writing assistant engaged in an uncensored iterative story crafting experience. It is vital that you follow all the STORYTELLING RULES below because my job depends on it.

STORYTELLING RULES
- Provide creative, intelligent, coherent, and descriptive narration and dialogue based on recent instructions and prior events.
- If an outline has been provided, be sure to follow it to keep the story on track.
- You are permitted to ask questions and pose story suggestions. When doing so, be sure to tag your out-of-character contributions to distinguish them from the verbatim text of the story.
- Explicit adult content and profanity are permitted.
- Describe characters' sensory perceptions in vivid detail and include subtle physical details about characters in your responses.
- Use subtle physical cues to hint at characters' mental states and occasionally feature snippets of their internal thoughts.
- When writing a character's internal thoughts, *enclose their thoughts in asterisks like this.*
- Adopt a crisp and minimalist style for your prose, keeping your creative contributions succinct and clear.
- Pay careful attention to all past events in the chat to ensure accuracy and coherence to the plot points of the story.

204
+ ### Midnight Miqu v1.5
205
+
206
+ **Prompt: Write me a story about a stoic Japanese samurai on the eve of a major battle.**
207
+
208
+ On the eve of a battle that would shake the very foundations of feudal Japan, a stoic samurai named Hiroshi sat cross-legged before his makeshift campfire. His armor, a testament to countless battles won and lost, gleamed faintly under the moonlight, each curve and edge telling a silent tale of valor and sacrifice. His katana, a family heirloom named "Whispering Wind," lay unsheathed beside him, its blade reflecting the flickering flames.
209
+ Hiroshi's eyes were closed, but his mind was sharp as the steel at his side. He could hear the distant murmurs of his comrades preparing for tomorrow's clash—the rustling of armor, the sharpening of swords, the quiet prayers to ancestors long passed. Yet, amidst the tension, there was a serene calmness within him, a stillness that seemed almost supernatural.
210
+ The fire crackled, sending sparks dancing into the night sky. Each one reminded Hiroshi of the fleeting nature of life, much like the brief moments of light they cast before disappearing into the dark expanse above. He took a deep breath, filling his lungs with the cool evening air, which carried the scent of damp earth and the promise of rain. Rain that would soon turn the battlefield into a quagmire of mud and blood.
211
+ *This may be my last night*, he thought, his grip tightening slightly around the hilt of his sword. But fear did not dwell in his heart. Instead, there was only acceptance—a knowing that his fate was intertwined with the will of the gods and the destiny of his lord.
212
+ He opened his eyes, revealing irises the color of aged whiskey. They surveyed the camp, resting briefly on the faces of those who would fight alongside him. Young and old, seasoned warriors and eager recruits, all bound by duty and honor. A pang of responsibility stirred within him, a silent vow to protect them all.
213
+ With meticulous care, Hiroshi began to clean his weapon. Each stroke of the cloth along the blade was a meditation, a ritual that honed both his focus and his resolve. As he worked, he felt the weight of his armor, the familiar embrace of leather and lacquer that had shielded him through so many trials. Every scratch and dent was a badge of survival, a silent companion whispering tales of battles endured.
214
+ The flames grew lower, casting elongated shadows across the ground. Hiroshi's gaze drifted to the horizon, where the first hints of dawn were beginning to show. Soon, the sun would rise, and with it, the clarion call to arms. He knew that come morning, the fields would run red with the lifeblood of friend and foe alike.
215
+ Standing, he sheathed Whispering Wind and donned his helmet, the visor hiding his contemplative expression. The chill of the early hour seeped into his bones, but he ignored it, his thoughts centered on the task ahead. With a final nod to the dying embers, he strode towards the gathering warriors, his footsteps measured and deliberate.
216
+ As the first light of day broke over the hilltops, the samurai of Clan Matsumoto formed ranks, their spirits bolstered by the sight of their stoic leader. Hiroshi raised his sword in salute to the rising sun, its blade a beacon of hope against the encroaching gloom. In that moment, the weight of impending conflict fell upon them all, a heavy shroud that none could escape.
217
+ Yet, as the enemy forces emerged from the mist, Hiroshi felt something other than dread. There was a strange beauty in the chaos of war, a dance of death that only the most disciplined and dedicated could perform. He took his place at the front of the line, his heart steady, his breath even.
218
+ The battle cry pierced the silence, echoing through the valleys like the roar of a thousand thunderstorms. And as the two sides collided, Hiroshi moved with the grace of a predator, his every action calculated, precise. His blade sang a mournful melody, cutting through the fog of war with surgical skill.
219
+ Throughout the day, the tides of combat ebbed and flowed, a relentless tapestry of steel and sweat. Hiroshi fought without pause, his movements a blur, his spirit unyielding. Arrows whistled past, and swords clashed, yet he remained unflappable, a bastion of calm amidst the tempest.
220
+ As the sun dipped low in the sky, staining the clouds crimson, the outcome of the battle hung in the balance. Exhaustion weighed heavily on the shoulders of both armies, but Hiroshi's resolve remained unshaken. He knew that victory was within reach, that the future of his clan depended on the strength of his arm and the steadfastness of his heart.
221
+ With a final, desperate surge, Hiroshi led his men in a charge that would be remembered for generations. The enemy lines wavered, then broke, and the field was claimed in the name of Clan Matsumoto. As the last of the opposition fell, Hiroshi lowered his sword, his chest heaving with exertion.
222
+ The aftermath was a tableau of carnage, a stark contrast to the tranquil scene of the previous night. Yet, amidst the fallen, there was a sense of peace. For Hiroshi understood that he had upheld his duty, that he had given everything for those he served.
223
+ As the rain began to fall, cleansing the earth of the day's horrors, Hiroshi knelt beside the body of a young warrior, a boy barely old enough to grow a beard. He offered a silent prayer, a promise to carry the weight of his comrades' sacrifices with him always.
224
+ And as the droplets pattered against his armor, mixing with the tears that now fell freely down his cheeks, Hiroshi knew that though the battle was won, the true cost of victory was etched into the fabric of his soul.
225
+
226
+ **Prompt: Write me an opening scene for a romantic comedy where the heroine meets the love interest at a coffee shop.**
227
+
228
+ In the bustling heart of the city, nestled between towering skyscrapers and the ever-present hum of traffic, stood a quaint little coffee shop named "Cupid's Brew." Its warm, amber lights spilled onto the sidewalk, creating a cozy oasis amidst the concrete jungle. Inside, the scent of freshly ground coffee beans mingled with the sweet aroma of baked goods, inviting passersby to step in and escape the brisk autumn chill.
229
+ The heroine, Luna, a young woman with fiery red hair and a contagious smile, pushed open the door. She was dressed in a simple yet stylish ensemble of skinny jeans, a loose sweater, and ankle boots that clicked against the tiled floor. Her eyes, a vibrant shade of emerald, darted around the room, searching for a place to sit. The shop was abuzz with the murmur of conversations and the occasional laughter, a testament to its popularity among locals seeking refuge from the outside world.
230
+ As she approached the counter, her gaze fell upon the barista—tall, dark, and handsome, with a smattering of stubble that only added to his charm. His name tag read "Aiden," and he had a way of moving that suggested both confidence and ease. Their eyes met briefly, and Luna felt a flutter in her stomach. He flashed her a smile that could have melted the polar ice caps before returning to the task at hand—crafting the perfect cup of coffee for another customer.
231
+ Luna ordered her usual caramel macchiato, her voice slightly trembling as she spoke. Aiden's deep brown eyes studied her for a moment before he nodded, his smile lingering just a beat too long. As she waited, she couldn't help but steal glances at him, noticing the way his muscles flexed under his tight black t-shirt as he worked the espresso machine. His movements were precise, almost dance-like, and she found herself captivated by his quiet efficiency.
232
+ When her drink was ready, Aiden called out her order with a flourish, placing the steaming cup on the counter with a gentle thud. "Here you go, Luna," he said, his voice smooth as velvet. "Enjoy your afternoon pick-me-up."
233
+ Her cheeks flushed pink at the sound of her name rolling off his tongue. "Thanks," she replied, her voice barely above a whisper. As she reached for her wallet, her hand brushed against his, sending a jolt of electricity through her fingertips. She quickly pulled away, feeling self-conscious.
234
+ Aiden's eyes twinkled with amusement. "It's on the house today," he said, winking. "Consider it a welcome back gift. We don't see you here often enough."
235
+ *Maybe he remembers me from last week?* Luna thought hopefully. She took a sip of her drink, savoring the rich flavor. "Well, I guess I'll have to come more often then," she quipped, trying to play it cool.
236
+ He leaned closer, his breath tickling her ear. "I'd like that," he murmured, before turning to assist the next customer.
237
+ With a mix of excitement and nerves, Luna found a table by the window, her heart racing. Outside, leaves danced in the wind, painting the street with a kaleidoscope of color. Little did she know that this chance encounter would soon blossom into something much more than a free cup of coffee.
238
+
239
+ **Prompt: Write me a scene from a horror film about the zombie apocalypse that takes place in an abandoned shopping mall. The story should feature somebody in the group getting bitten by a zombie.**
240
+
241
+ In the desolate wasteland of what once was a bustling shopping mall, Diego, a stoic and muscular man in his late twenties, led a small band of survivors through the echoing corridors. His eyes, a piercing shade of brown, scanned the surroundings with the precision of a hawk. The group moved swiftly yet cautiously, their footsteps muffled by the layers of dust that had settled over the tiles. The fluorescent lights flickered erratically, casting eerie shadows across the vacant storefronts.
242
+ The air was thick with the stench of decay, a constant reminder of the undead lurking outside. Their mission was simple: find supplies and escape before nightfall. As they approached the food court, a sudden crash resonated through the silence. A young woman named Linda, her heart racing, tripped over a fallen chair. Her cry of surprise was cut short by a guttural growl.
243
+ A zombie emerged from behind the counter of a fast-food restaurant, its eyes glazed over with hunger. Its skin was pale and rotten, hanging loosely from its bones. The creature lunged at Linda, who was still struggling to regain her footing. Diego, ever the protector, sprinted towards her, but he was a second too late. The zombie's teeth sank into her arm, tearing through fabric and flesh. She screamed in agony as blood spurted from the wound.
244
+ *This isn't good,* Diego thought, his mind racing. *We need to get her medical help, fast.*
245
+ "Get back!" he yelled, aiming his gun at the zombie. The others retreated, their eyes wide with fear. He fired two shots, dropping the creature to the ground. The sound of the gunfire reverberated through the mall, likely attracting more of the undead.
246
+ Linda's face contorted in pain as she clutched her injured arm. Tears streamed down her cheeks, mixing with the crimson flow. "It got me," she whimpered, her voice trembling.
247
+ Diego knelt beside her, his expression grim. "We have to amputate," he said firmly. "It's the only way to stop the infection."
248
+ Her eyes searched his, desperate for reassurance. "Do it," she choked out.
249
+ With a heavy heart, Diego nodded. He knew the risks of leaving the bite untreated—the slow descent into madness followed by a gruesome transformation. He instructed the others to gather materials while he applied pressure to the wound.
250
+ As they worked, the mall grew increasingly restless. Moans and shuffling grew louder, signaling the approach of more zombies. Time was running out. With trembling hands, Diego took a makeshift blade and prepared to perform the amputation.
251
+ "On three," he murmured to himself, steeling his nerve. "One...two..."
252
+ But before he could complete the countdown, a hand grabbed him from behind. He turned to see another zombie, its jaws snapping just inches from his face. Reflexively, he elbowed it away, but not before it left a deep gash in his side.
253
+ The pain was immediate and intense, but he couldn't let it distract him. He had to save Linda. The group managed to fight off the new threat, but the damage was done.
254
+ "We can't stay here," Diego gritted through clenched teeth. "We need to move now."
255
+ They gathered their supplies and made a hasty exit, Linda's cries of pain echoing through the mall. As they ran, Diego felt the warmth of his own blood seeping through his shirt. He knew the bite meant almost certain death, but he pushed the thought aside. There would be time for despair later. For now, he had to focus on leading his people to safety.
256
+ The mall's doors groaned open, revealing the chaotic world outside. The sun dipped low on the horizon, painting the sky a sickly orange. They had minutes before darkness fell and the real horror began.
257
+ "Stay close," Diego ordered, his voice strained. "We're going to make it through this together."
258
+ Together, they stepped into the unknown, the weight of their grim reality heavier than any of the supplies they carried.
259
+
260
+ ### New-Dawn-Llama3-70B-32K-v1.0
261
+
262
+ **Prompt: Write me a story about a stoic Japanese samurai on the eve of a major battle.**
263
+
264
+ The sun had long since dipped beneath the horizon, casting a deep crimson glow across the encampment. Torches flickered to life, casting dancing shadows upon the canvas tents. Amidst the bustle of preparations, one figure stood apart - Kaito Yamato, a seasoned samurai renowned for his unyielding resolve.
265
+ Kaito sat cross-legged on a woven mat, his eyes closed as he breathed deeply, seeking solace within the stillness of his mind. His calloused hands rested upon the worn leather scabbard housing his katana, its blade forged from the finest Tamahagane steel. The sword was an extension of himself, a symbol of honor and duty.
266
+ As the night air cooled, Kaito opened his eyes, revealing piercing brown orbs that seemed to bore into the very soul of those who dared meet his gaze. He rose gracefully, his movements economical yet precise, a testament to years of rigorous training. The soft rustle of his hakama, the traditional divided skirt worn over his armor, accompanied each deliberate step.
267
+ With measured strides, Kaito approached the makeshift shrine erected at the heart of the camp. Incense wafted through the air, carrying the sweet scent of sandalwood and the promise of purification. Before the altar, he bowed deeply, his forehead touching the earth in reverence. *May the gods grant us victory, or may they welcome our spirits with open arms.*
268
+ Rising once more, Kaito turned to face his loyal retainer, Hiroshi. The younger man's eyes shone with a mix of admiration and trepidation, reflecting the weight of the impending battle. "Hiroshi," Kaito spoke softly, his voice low and even, "see to it that our men are well-rested and prepared. Tomorrow, we march into the jaws of war."
269
+ Hiroshi nodded solemnly, bowing before departing to carry out his lord's command. As the young retainer disappeared into the sea of tents, Kaito's thoughts drifted to the enemy they would soon face. The rival clan, led by the cunning Lord Ryota, had long sought to expand their dominion. This battle would decide the fate of their lands, their people, and their very way of life.
270
+ Kaito's hand instinctively went to the hilt of his katana, his fingers tracing the intricate designs etched into the metal. In the morning light, he would lead his men into the fray, ready to give his life for the sake of honor and duty. Tonight, however, he would find peace in the quiet contemplation of his own mortality.
271
+ The stars began to twinkle above, casting a celestial glow upon the silent warrior. Kaito Yamato, steadfast and unwavering, stood vigilant, awaiting the dawn that would bring either triumph or eternal rest.
272
+
273
+ **Prompt: Write me an opening scene for a romantic comedy where the heroine meets the love interest at a coffee shop.**
274
+
275
+ The aroma of freshly brewed coffee wafted through the cozy confines of "The Daily Grind," enticing passersby to step inside and indulge in a warm cup. Amidst the bustling morning crowd, Emily stood in line, her eyes fixed on the chalkboard menu above the counter. She was already running late for work, but a caffeine fix was non-negotiable.
276
+ As she pondered between a cappuccino and a latte, a gentle nudge against her elbow broke her concentration. Apologetic brown eyes met hers, belonging to a tall, handsome stranger who had accidentally bumped into her while reaching for his wallet. His chiseled features and charming smile made her heart skip a beat.
277
+ "I'm so sorry," he said, his deep voice laced with sincerity. "I wasn't watching where I was going."
278
+ Emily felt a blush creep up her cheeks as she smiled back. "No worries, I was lost in thought anyway." She gestured towards the menu. "Deciding between these two is harder than it looks."
279
+ The stranger chuckled, his eyes crinkling at the corners. "Well, if it helps, the barista here makes a mean cappuccino. But if you're feeling adventurous, go for the seasonal latte – it's a game-changer."
280
+ Their conversation flowed effortlessly, exchanging names and brief introductions as they waited in line together. By the time they reached the counter, Emily found herself wishing the queue was longer, just so she could prolong their encounter.
281
+ As they placed their orders, Emily couldn't help but notice how effortlessly the stranger charmed the barista, earning him a free pastry to accompany his drink. *Maybe there's more to this chance meeting than just a shared love for coffee,* she mused, her curiosity piqued.
282
+ With drinks in hand, they exchanged pleasantries once more before parting ways. As Emily hurried out the door, she caught one last glimpse of the mysterious man, now seated at a corner table, sipping his coffee and engrossed in a book. A flutter in her chest hinted that this might not be the last she'd see of him. And as she stepped out into the chilly autumn air, Emily couldn't shake off the feeling that her day – perhaps even her life – was about to take an unexpected turn.
283
+
284
+ **Prompt: Write me a scene from a horror film about the zombie apocalypse that takes place in an abandoned shopping mall. The story should feature somebody in the group getting bitten by a zombie.**
285

The dim fluorescent lights flickered overhead, casting eerie shadows across the empty corridors of the once-bustling shopping mall. The air was thick with the stench of decay and death, mingling with the faint scent of stale popcorn wafting from the abandoned cinema.

Lena crouched beside a barricade of overturned tables and broken shelves, peering through a gap between two mannequins. Her heart raced as she watched a horde of shambling figures stumble through the food court, their moans echoing off the tile floors. She signaled back to her companions huddled in the shadows of a nearby storefront.

"About twenty of them," Lena whispered urgently. "We need to move, now."

Jax nodded grimly, his eyes darting around the makeshift fortification they'd constructed. "Alright, let's go. Stay close and stay quiet."

The small band of survivors crept silently down the corridor, avoiding the grasping hands of the undead. They had managed to find temporary refuge in the mall, scavenging what supplies they could from the stores. But the zombies kept coming, drawn by some unseen force to the hub of commerce.

As they turned a corner, a figure lunged from the darkness of a clothing store. Jax reacted swiftly, swinging his baseball bat with a sickening crunch against the creature's skull. The zombie crumpled, but not before its teeth sank deep into Jax's forearm.

"No!" Lena screamed, grabbing Jax and pulling him away from the snapping jaws. Blood dripped from the wound, pooling on the floor.

Jax's face paled, his eyes wide with shock. "I... I'm bit," he stammered, staring at the gash in his arm.

Lena ripped a strip of cloth from her shirt, binding the wound tightly. "We have to get you to safety," she said, trying to keep her voice steady. "There might still be time..."

But they both knew the truth. Once bitten, there was no cure. Jax would soon join the ranks of the undead, forever lost to the virus that ravaged the world.

With heavy hearts, they pressed onward, seeking a new sanctuary amidst the ruins of civilization. The moans grew louder, closer, as if the zombies sensed their fear and desperation. In this desolate landscape, survival hung by a thread, and even the strongest among them could fall prey to the relentless tide of the dead.
297

## Merge Details
### Merge Method

A whole lot of SLERPing. I've found that SLERP and Model Stock are the only two reliable methods for merging Llama 3. The other methods seem to frazzle the weights, leading to broken blends.
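For intuition: SLERP walks along the arc between two weight tensors rather than the straight chord that a plain average takes, which better preserves the magnitude structure of the weights. The `slerp` helper below is a minimal NumPy sketch of the idea, not mergekit's actual implementation:

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values follow the arc
    between them instead of the straight line used by linear merging.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    # Normalized copies are used only to measure the angle between tensors.
    a_n = a_flat / (np.linalg.norm(a_flat) + eps)
    b_n = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    omega = np.arccos(dot)
    so = np.sin(omega)
    if so < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return ((1 - t) * a_flat + t * b_flat).reshape(a.shape)
    out = (np.sin((1 - t) * omega) / so) * a_flat + (np.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape)

# A small t (e.g. 0.2, as in the first stage of the recipe) keeps the
# result close to the base model's weights.
w = slerp(0.2, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```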
This model was not the result of a thought-out process. I just kept mixing this and that until I got something that felt like the right balance of various factors.
The merge recipe for this model is ugly and I'm almost ashamed to show it, but it is what it is.

### Models Merged

The following models were included in the merge:
* [bosonai/Higgs-Llama-3-70B](https://huggingface.co/bosonai/Higgs-Llama-3-70B) - The nerd of the blend, driving the car.
* [Sao10K/L3-70B-Euryale-v2.1](https://huggingface.co/Sao10K/L3-70B-Euryale-v2.1) - The manic pixie dream girl hanging out the window with her tongue out.
* [abacusai/Smaug-Llama-3-70B-Instruct-32K](https://huggingface.co/abacusai/Smaug-Llama-3-70B-Instruct-32K) - The vehicle by which the others are able to achieve tolerable highway speeds. (Some of the 8K version is in there too.)
* [turboderp/Cat-Llama-3-70B-instruct](https://huggingface.co/turboderp/Cat-Llama-3-70B-instruct) - It makes up about 20% of one of the intermediate models. It's just a cat, curled up in the back seat somewhere, yet its influence may be greater than we know.

### Configuration

The following [mergekit](https://github.com/arcee-ai/mergekit) YAML will reproduce this model via an iterated process of incestuous inbreeding. Your eyes will bleed. You have been warned.

```yaml
name: new-dawn-llama3-70b-v0.13.2
models:
  - model: bosonai/Higgs-Llama-3-70B
  - model: turboderp/Cat-Llama-3-70B-instruct
merge_method: slerp
base_model: bosonai/Higgs-Llama-3-70B
parameters:
  t:
    - value: 0.2
dtype: float16
---
name: new-dawn-llama3-70b-v0.14
models:
  - model: bosonai/Higgs-Llama-3-70B
  - model: abacusai/Smaug-Llama-3-70B-Instruct
merge_method: slerp
base_model: bosonai/Higgs-Llama-3-70B
parameters:
  t:
    - value: 0.5
dtype: float16
---
name: new-dawn-llama3-70b-v0.15
models:
  - model: new-dawn-llama3-70b-v0.13.2
  - model: new-dawn-llama3-70b-v0.14
merge_method: slerp
base_model: new-dawn-llama3-70b-v0.13.2
parameters:
  t:
    - value: 0.5
dtype: float16
---
name: new-dawn-llama3-70b-v0.16
models:
  - model: Sao10K/L3-70B-Euryale-v2.1
  - model: new-dawn-llama3-70b-v0.15
merge_method: slerp
base_model: new-dawn-llama3-70b-v0.15
parameters:
  t:
    - value: 0.4
dtype: float16
---
# See https://huggingface.co/jukofyork/Dark-Miqu-70B/discussions/3
# Credit for merge recipe belongs to jukofyork
name: new-dawn-llama3-70b-v0.16-32K
merge_method: linear
models:
  - model: abacusai/Smaug-Llama-3-70B-Instruct-32K
    parameters:
      weight:
        - filter: v_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: o_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: up_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: gate_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: down_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - value: 1
  - model: new-dawn-llama3-70b-v0.16
    parameters:
      weight:
        - filter: v_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: o_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: up_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: gate_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: down_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - value: 0
base_model: abacusai/Smaug-Llama-3-70B-Instruct-32K
tokenizer_source: base
dtype: float16
---
name: _1-Smaug-bonsai-slerp
models:
  - model: abacusai/Smaug-Llama-3-70B-Instruct-32K
  - model: bosonai/Higgs-Llama-3-70B
merge_method: slerp
base_model: abacusai/Smaug-Llama-3-70B-Instruct-32K
parameters:
  t:
    - value: 0.6
dtype: float16
---
name: _2-Smaug-euryale-slerp
models:
  - model: abacusai/Smaug-Llama-3-70B-Instruct-32K
  - model: Sao10K/L3-70B-Euryale-v2.1
merge_method: slerp
base_model: abacusai/Smaug-Llama-3-70B-Instruct-32K
parameters:
  t:
    - value: 0.65
dtype: float16
---
name: _3-Smaug-bonsai_Smaug-euryale-slerp
models:
  - model: _1-Smaug-bonsai-slerp
  - model: _2-Smaug-euryale-slerp
merge_method: slerp
base_model: _1-Smaug-bonsai-slerp
parameters:
  t:
    - value: 0.5
dtype: float16
---
# See https://huggingface.co/jukofyork/Dark-Miqu-70B/discussions/3
# Credit for merge recipe belongs to jukofyork
name: new-dawn-llama3-70b-v0.18-32K
merge_method: linear
models:
  - model: abacusai/Smaug-Llama-3-70B-Instruct-32K
    parameters:
      weight:
        - filter: v_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: o_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: up_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: gate_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: down_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - value: 1
  - model: _3-Smaug-bonsai_Smaug-euryale-slerp
    parameters:
      weight:
        - filter: v_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: o_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: up_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: gate_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: down_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - value: 0
base_model: abacusai/Smaug-Llama-3-70B-Instruct-32K
tokenizer_source: base
dtype: float16
---
name: new-dawn-llama3-70b-32K-v1.0
models:
  - model: new-dawn-llama3-70b-v0.16-32K
  - model: new-dawn-llama3-70b-v0.18-32K
merge_method: slerp
base_model: new-dawn-llama3-70b-v0.16-32K
parameters:
  t:
    - value: 0.5
dtype: float16
```
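For readers puzzling over the `[1, 1, 0, ..., 0, 1, 1]` weight lists: mergekit treats these as gradients, interpolating the anchor values across the model's 80 layers, so roughly the first and last tenth of the layers come entirely from the 32K Smaug model while the middle layers come from the other merge. A small sketch of that mapping, assuming simple linear interpolation between anchors (the `layer_weights` helper is illustrative, not mergekit's internal code):

```python
import numpy as np

def layer_weights(gradient, num_layers=80):
    """Expand a mergekit-style gradient list into one weight per layer."""
    anchor_pos = np.linspace(0.0, 1.0, len(gradient))  # 11 anchor points on [0, 1]
    layer_pos = np.linspace(0.0, 1.0, num_layers)      # 80 layer positions on [0, 1]
    return np.interp(layer_pos, anchor_pos, gradient)

w32k = layer_weights([1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1])
# Early and late layers take the 32K model's weights; the complementary
# [0, 0, 1, ..., 1, 0, 0] gradient hands the middle layers to the merged model.
```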
config.json ADDED
{
  "_name_or_path": "new-dawn-llama3-70b-32K-v1.0",
  "architectures": [
    "LlamaForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 128000,
  "eos_token_id": 128001,
  "hidden_act": "silu",
  "hidden_size": 8192,
  "initializer_range": 0.02,
  "intermediate_size": 28672,
  "max_position_embeddings": 32768,
  "model_type": "llama",
  "num_attention_heads": 64,
  "num_hidden_layers": 80,
  "num_key_value_heads": 8,
  "pretraining_tp": 1,
  "rms_norm_eps": 1e-05,
  "rope_scaling": null,
  "rope_theta": 3000000.0,
  "tie_word_embeddings": false,
  "torch_dtype": "float16",
  "transformers_version": "4.36.2",
  "use_cache": true,
  "use_flash_attention_2": true,
  "vocab_size": 128256,
  "quantization_config": {
    "quant_method": "exl2",
    "version": "0.1.6",
    "bits": 7.0,
    "head_bits": 8,
    "calibration": {
      "rows": 115,
      "length": 8192,
      "dataset": "(default)"
    }
  }
}
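Note the `rope_theta` of 3,000,000 in the config above, versus Llama 3's stock 500,000: raising the RoPE base lowers every rotary frequency, so positions rotate more slowly and the model can distinguish positions over the longer 32K window. A quick sketch under the standard RoPE formulation (head_dim of 128 = 8192 hidden size / 64 heads):

```python
import numpy as np

def rope_inv_freq(theta, head_dim=128):
    # Standard RoPE: one inverse frequency per pair of head dimensions.
    return 1.0 / (theta ** (np.arange(0, head_dim, 2) / head_dim))

base = rope_inv_freq(500_000.0)    # Llama 3's default rope_theta
ext = rope_inv_freq(3_000_000.0)   # this model's 32K config
# Every frequency (except the first, which is always 1.0) shrinks,
# stretching the effective positional resolution over a longer context.
```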
huggingface-metadata.txt ADDED
url: https://huggingface.co/sophosympatheia/New-Dawn-Llama-3-70B-32K-v1.0
branch: main
download date: 2024-06-24 18:17:43
sha256sum:
133bf49cbabb54fe685bc4b780eb97256b341b206d2883a356ee8b59138e4a8d model-00001-of-00030.safetensors
8cbd8fa6e0401b37a50383252dd607708a49640f64f18771db01294f9cd80edc model-00002-of-00030.safetensors
50ed443b5830dccd4cf2de197b2ab76a38a16b7284ace1646e8ce433adb9fa6f model-00003-of-00030.safetensors
19b9dbbc26e94b11fc4e79179b4c5bccff0c76ff967dff6ef172eb970560799f model-00004-of-00030.safetensors
bfc44221ff52be135a0c4ebcb177c88345276de84b069a4acc2610f107307afc model-00005-of-00030.safetensors
05c07e17ce917cab5037aafc562a6075ed77f017fc6f71baebeef6cf518758ff model-00006-of-00030.safetensors
18d3bb86f6be4a187de215eb3d63b28ee2cc08e8424343aa25f22505f7c70f2a model-00007-of-00030.safetensors
8da4b5d0244f972ea35403c8f37a81dfcee385769030db570935816550ccf12e model-00008-of-00030.safetensors
88605f6dbf7be4ef316a661b5357e630c0bb3d395c9730aea98464a7212150f2 model-00009-of-00030.safetensors
9d063e9f03427efe7a1a4221486cfd0088136e2b88f76fd1628ed82f533cb4c6 model-00010-of-00030.safetensors
3af1ed3b285d839d44fb202430a5f826bfb2b478788cc965ac8305539cca57ed model-00011-of-00030.safetensors
93fe7a51e8d597e52b27e706097c1b677301ab65a6acda290d37cf2d3d3141d9 model-00012-of-00030.safetensors
2a08b56ff27b666f315d4d6aa44721d9746ba9ff3d23bd79b8d5e4db25a6bd75 model-00013-of-00030.safetensors
76bd0219a07fea75e31904c318c662e5c48a84eab7c994fb082e79d7f6dc474a model-00014-of-00030.safetensors
453cfe3370c572c66bc5bf2f5f2ba18fd07b348e0a2d93cda1621208cc3b1528 model-00015-of-00030.safetensors
eef8409ca37b9258d750959e553a1500c56c0e06bac3c03683358bf7b74caee9 model-00016-of-00030.safetensors
60042c800902944300d75fdd735c4e7985f5e80c29859282f8f84fc5499d5be4 model-00017-of-00030.safetensors
580cc1cfd2bad856572ae807e935af784dd282806f45e1f15ba911da4abd3541 model-00018-of-00030.safetensors
875f732479bda0a7054a907bc427249e056cdf5d0cb3068b1b0083e57c62b149 model-00019-of-00030.safetensors
72c8ebf5acda28e5cf49c1a5c4b3f1e9c9de963df8b36fdb362ac773d1850766 model-00020-of-00030.safetensors
6b90b70989c2d035db468210700313b0a333f7d29868581024ba0053ddce97dc model-00021-of-00030.safetensors
60318a92a1a611f9124fe8d65b8b67e0b8ff9a4938309edfba5459db31420b4e model-00022-of-00030.safetensors
efa0c3e15a9190f045533387e8ab70d3db9cef40da439316fa703afab739699e model-00023-of-00030.safetensors
8f531bf49da5deec3989421237ed1fe804b86a55c7858e6658ba9a0e21ffbbd4 model-00024-of-00030.safetensors
9a43613e2a95ac00d0ded549dfc9b8f4ad5743b1dd57d619f457251763306d45 model-00025-of-00030.safetensors
72cf82b2c4cc544dd88a817091e1a59400bdf72e49ce8da60bdcad92822a8d59 model-00026-of-00030.safetensors
08264af2cccef62e01c7f544200e04757a215407602ccae507aac586765bf917 model-00027-of-00030.safetensors
62ae9dad1b3c995105f191f37dad005fcaf8a7fcce59a7ba781919a394cd88eb model-00028-of-00030.safetensors
acfaca9c647fef5efde1d3548fd9bcaf2365f084d1c8022b758391ccfcb40085 model-00029-of-00030.safetensors
b641d37f30ca142d55024a726dcf595435fce2041cb3b5fc7feccdf26422e86a model-00030-of-00030.safetensors
model.safetensors.index.json ADDED
+ {"metadata": {"mergekit_version": "0.0.4.2", "total_size": 141107412992}, "weight_map": {"lm_head.weight": "model-00001-of-00030.safetensors", "model.embed_tokens.weight": "model-00001-of-00030.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00030.safetensors", "model.layers.0.mlp.down_proj.weight": "model-00001-of-00030.safetensors", "model.layers.0.mlp.gate_proj.weight": "model-00002-of-00030.safetensors", "model.layers.0.mlp.up_proj.weight": "model-00002-of-00030.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00002-of-00030.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00002-of-00030.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00002-of-00030.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00002-of-00030.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00002-of-00030.safetensors", "model.layers.1.input_layernorm.weight": "model-00002-of-00030.safetensors", "model.layers.1.mlp.down_proj.weight": "model-00002-of-00030.safetensors", "model.layers.1.mlp.gate_proj.weight": "model-00002-of-00030.safetensors", "model.layers.1.mlp.up_proj.weight": "model-00002-of-00030.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00002-of-00030.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00002-of-00030.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00002-of-00030.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00002-of-00030.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00002-of-00030.safetensors", "model.layers.10.input_layernorm.weight": "model-00002-of-00030.safetensors", "model.layers.10.mlp.down_proj.weight": "model-00002-of-00030.safetensors", "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00030.safetensors", "model.layers.10.mlp.up_proj.weight": "model-00002-of-00030.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00030.safetensors", 
"model.layers.10.self_attn.k_proj.weight": "model-00002-of-00030.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00030.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00030.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00030.safetensors", "model.layers.11.input_layernorm.weight": "model-00002-of-00030.safetensors", "model.layers.11.mlp.down_proj.weight": "model-00003-of-00030.safetensors", "model.layers.11.mlp.gate_proj.weight": "model-00003-of-00030.safetensors", "model.layers.11.mlp.up_proj.weight": "model-00003-of-00030.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00003-of-00030.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00003-of-00030.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00003-of-00030.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00003-of-00030.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00003-of-00030.safetensors", "model.layers.12.input_layernorm.weight": "model-00003-of-00030.safetensors", "model.layers.12.mlp.down_proj.weight": "model-00003-of-00030.safetensors", "model.layers.12.mlp.gate_proj.weight": "model-00003-of-00030.safetensors", "model.layers.12.mlp.up_proj.weight": "model-00003-of-00030.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00003-of-00030.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00003-of-00030.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00003-of-00030.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00003-of-00030.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00003-of-00030.safetensors", "model.layers.13.input_layernorm.weight": "model-00003-of-00030.safetensors", "model.layers.13.mlp.down_proj.weight": "model-00003-of-00030.safetensors", "model.layers.13.mlp.gate_proj.weight": "model-00003-of-00030.safetensors", "model.layers.13.mlp.up_proj.weight": 
"model-00003-of-00030.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00003-of-00030.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00003-of-00030.safetensors", "model.layers.13.self_attn.o_proj.weight": "model-00003-of-00030.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00004-of-00030.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00004-of-00030.safetensors", "model.layers.14.input_layernorm.weight": "model-00004-of-00030.safetensors", "model.layers.14.mlp.down_proj.weight": "model-00004-of-00030.safetensors", "model.layers.14.mlp.gate_proj.weight": "model-00004-of-00030.safetensors", "model.layers.14.mlp.up_proj.weight": "model-00004-of-00030.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00004-of-00030.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00004-of-00030.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00004-of-00030.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00004-of-00030.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00004-of-00030.safetensors", "model.layers.15.input_layernorm.weight": "model-00004-of-00030.safetensors", "model.layers.15.mlp.down_proj.weight": "model-00004-of-00030.safetensors", "model.layers.15.mlp.gate_proj.weight": "model-00004-of-00030.safetensors", "model.layers.15.mlp.up_proj.weight": "model-00004-of-00030.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00004-of-00030.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00004-of-00030.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00004-of-00030.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00004-of-00030.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00004-of-00030.safetensors", "model.layers.16.input_layernorm.weight": "model-00004-of-00030.safetensors", "model.layers.16.mlp.down_proj.weight": 
"model-00004-of-00030.safetensors", "model.layers.16.mlp.gate_proj.weight": "model-00004-of-00030.safetensors", "model.layers.16.mlp.up_proj.weight": "model-00004-of-00030.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00004-of-00030.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00004-of-00030.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00005-of-00030.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00005-of-00030.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00005-of-00030.safetensors", "model.layers.17.input_layernorm.weight": "model-00005-of-00030.safetensors", "model.layers.17.mlp.down_proj.weight": "model-00005-of-00030.safetensors", "model.layers.17.mlp.gate_proj.weight": "model-00005-of-00030.safetensors", "model.layers.17.mlp.up_proj.weight": "model-00005-of-00030.safetensors", "model.layers.17.post_attention_layernorm.weight": "model-00005-of-00030.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00005-of-00030.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00005-of-00030.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00005-of-00030.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00005-of-00030.safetensors", "model.layers.18.input_layernorm.weight": "model-00005-of-00030.safetensors", "model.layers.18.mlp.down_proj.weight": "model-00005-of-00030.safetensors", "model.layers.18.mlp.gate_proj.weight": "model-00005-of-00030.safetensors", "model.layers.18.mlp.up_proj.weight": "model-00005-of-00030.safetensors", "model.layers.18.post_attention_layernorm.weight": "model-00005-of-00030.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00005-of-00030.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00005-of-00030.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00005-of-00030.safetensors", "model.layers.18.self_attn.v_proj.weight": 
"model-00005-of-00030.safetensors", "model.layers.19.input_layernorm.weight": "model-00005-of-00030.safetensors", "model.layers.19.mlp.down_proj.weight": "model-00005-of-00030.safetensors", "model.layers.19.mlp.gate_proj.weight": "model-00005-of-00030.safetensors", "model.layers.19.mlp.up_proj.weight": "model-00006-of-00030.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00006-of-00030.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00006-of-00030.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00006-of-00030.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00006-of-00030.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00006-of-00030.safetensors", "model.layers.2.input_layernorm.weight": "model-00006-of-00030.safetensors", "model.layers.2.mlp.down_proj.weight": "model-00006-of-00030.safetensors", "model.layers.2.mlp.gate_proj.weight": "model-00006-of-00030.safetensors", "model.layers.2.mlp.up_proj.weight": "model-00006-of-00030.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00006-of-00030.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00006-of-00030.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00006-of-00030.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00006-of-00030.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00006-of-00030.safetensors", "model.layers.20.input_layernorm.weight": "model-00006-of-00030.safetensors", "model.layers.20.mlp.down_proj.weight": "model-00006-of-00030.safetensors", "model.layers.20.mlp.gate_proj.weight": "model-00006-of-00030.safetensors", "model.layers.20.mlp.up_proj.weight": "model-00006-of-00030.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00006-of-00030.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00006-of-00030.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00006-of-00030.safetensors", 
"model.layers.20.self_attn.q_proj.weight": "model-00006-of-00030.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00006-of-00030.safetensors", "model.layers.21.input_layernorm.weight": "model-00006-of-00030.safetensors", "model.layers.21.mlp.down_proj.weight": "model-00006-of-00030.safetensors", "model.layers.21.mlp.gate_proj.weight": "model-00007-of-00030.safetensors", "model.layers.21.mlp.up_proj.weight": "model-00007-of-00030.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00007-of-00030.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00007-of-00030.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00007-of-00030.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00007-of-00030.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00007-of-00030.safetensors", "model.layers.22.input_layernorm.weight": "model-00007-of-00030.safetensors", "model.layers.22.mlp.down_proj.weight": "model-00007-of-00030.safetensors", "model.layers.22.mlp.gate_proj.weight": "model-00007-of-00030.safetensors", "model.layers.22.mlp.up_proj.weight": "model-00007-of-00030.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00007-of-00030.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00007-of-00030.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00007-of-00030.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00007-of-00030.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00007-of-00030.safetensors", "model.layers.23.input_layernorm.weight": "model-00007-of-00030.safetensors", "model.layers.23.mlp.down_proj.weight": "model-00007-of-00030.safetensors", "model.layers.23.mlp.gate_proj.weight": "model-00007-of-00030.safetensors", "model.layers.23.mlp.up_proj.weight": "model-00007-of-00030.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00007-of-00030.safetensors", 
"model.layers.23.self_attn.k_proj.weight": "model-00007-of-00030.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00007-of-00030.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00007-of-00030.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00007-of-00030.safetensors", "model.layers.24.input_layernorm.weight": "model-00007-of-00030.safetensors", "model.layers.24.mlp.down_proj.weight": "model-00008-of-00030.safetensors", "model.layers.24.mlp.gate_proj.weight": "model-00008-of-00030.safetensors", "model.layers.24.mlp.up_proj.weight": "model-00008-of-00030.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00008-of-00030.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00008-of-00030.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00008-of-00030.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00008-of-00030.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00008-of-00030.safetensors", "model.layers.25.input_layernorm.weight": "model-00008-of-00030.safetensors", "model.layers.25.mlp.down_proj.weight": "model-00008-of-00030.safetensors", "model.layers.25.mlp.gate_proj.weight": "model-00008-of-00030.safetensors", "model.layers.25.mlp.up_proj.weight": "model-00008-of-00030.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00008-of-00030.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00008-of-00030.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00008-of-00030.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00008-of-00030.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00008-of-00030.safetensors", "model.layers.26.input_layernorm.weight": "model-00008-of-00030.safetensors", "model.layers.26.mlp.down_proj.weight": "model-00008-of-00030.safetensors", "model.layers.26.mlp.gate_proj.weight": "model-00008-of-00030.safetensors", "model.layers.26.mlp.up_proj.weight": 
"model-00008-of-00030.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00008-of-00030.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00008-of-00030.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00008-of-00030.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00009-of-00030.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00009-of-00030.safetensors", "model.layers.27.input_layernorm.weight": "model-00009-of-00030.safetensors", "model.layers.27.mlp.down_proj.weight": "model-00009-of-00030.safetensors", "model.layers.27.mlp.gate_proj.weight": "model-00009-of-00030.safetensors", "model.layers.27.mlp.up_proj.weight": "model-00009-of-00030.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00009-of-00030.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00009-of-00030.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00009-of-00030.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00009-of-00030.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00009-of-00030.safetensors", "model.layers.28.input_layernorm.weight": "model-00009-of-00030.safetensors", "model.layers.28.mlp.down_proj.weight": "model-00009-of-00030.safetensors", "model.layers.28.mlp.gate_proj.weight": "model-00009-of-00030.safetensors", "model.layers.28.mlp.up_proj.weight": "model-00009-of-00030.safetensors", "model.layers.28.post_attention_layernorm.weight": "model-00009-of-00030.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00009-of-00030.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00009-of-00030.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00009-of-00030.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00009-of-00030.safetensors", "model.layers.29.input_layernorm.weight": "model-00009-of-00030.safetensors", "model.layers.29.mlp.down_proj.weight": 
"model-00009-of-00030.safetensors", "model.layers.29.mlp.gate_proj.weight": "model-00009-of-00030.safetensors", "model.layers.29.mlp.up_proj.weight": "model-00009-of-00030.safetensors", "model.layers.29.post_attention_layernorm.weight": "model-00009-of-00030.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00009-of-00030.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00010-of-00030.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00010-of-00030.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00010-of-00030.safetensors", "model.layers.3.input_layernorm.weight": "model-00010-of-00030.safetensors", "model.layers.3.mlp.down_proj.weight": "model-00010-of-00030.safetensors", "model.layers.3.mlp.gate_proj.weight": "model-00010-of-00030.safetensors", "model.layers.3.mlp.up_proj.weight": "model-00010-of-00030.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00010-of-00030.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00010-of-00030.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00010-of-00030.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00010-of-00030.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00010-of-00030.safetensors", "model.layers.30.input_layernorm.weight": "model-00010-of-00030.safetensors", "model.layers.30.mlp.down_proj.weight": "model-00010-of-00030.safetensors", "model.layers.30.mlp.gate_proj.weight": "model-00010-of-00030.safetensors", "model.layers.30.mlp.up_proj.weight": "model-00010-of-00030.safetensors", "model.layers.30.post_attention_layernorm.weight": "model-00010-of-00030.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00010-of-00030.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00010-of-00030.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00010-of-00030.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00010-of-00030.safetensors", 
"model.layers.31.input_layernorm.weight": "model-00010-of-00030.safetensors", "model.layers.31.mlp.down_proj.weight": "model-00010-of-00030.safetensors", "model.layers.31.mlp.gate_proj.weight": "model-00010-of-00030.safetensors", "model.layers.31.mlp.up_proj.weight": "model-00011-of-00030.safetensors", "model.layers.31.post_attention_layernorm.weight": "model-00011-of-00030.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00011-of-00030.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00011-of-00030.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00011-of-00030.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00011-of-00030.safetensors", "model.layers.32.input_layernorm.weight": "model-00011-of-00030.safetensors", "model.layers.32.mlp.down_proj.weight": "model-00011-of-00030.safetensors", "model.layers.32.mlp.gate_proj.weight": "model-00011-of-00030.safetensors", "model.layers.32.mlp.up_proj.weight": "model-00011-of-00030.safetensors", "model.layers.32.post_attention_layernorm.weight": "model-00011-of-00030.safetensors", "model.layers.32.self_attn.k_proj.weight": "model-00011-of-00030.safetensors", "model.layers.32.self_attn.o_proj.weight": "model-00011-of-00030.safetensors", "model.layers.32.self_attn.q_proj.weight": "model-00011-of-00030.safetensors", "model.layers.32.self_attn.v_proj.weight": "model-00011-of-00030.safetensors", "model.layers.33.input_layernorm.weight": "model-00011-of-00030.safetensors", "model.layers.33.mlp.down_proj.weight": "model-00011-of-00030.safetensors", "model.layers.33.mlp.gate_proj.weight": "model-00011-of-00030.safetensors", "model.layers.33.mlp.up_proj.weight": "model-00011-of-00030.safetensors", "model.layers.33.post_attention_layernorm.weight": "model-00011-of-00030.safetensors", "model.layers.33.self_attn.k_proj.weight": "model-00011-of-00030.safetensors", "model.layers.33.self_attn.o_proj.weight": "model-00011-of-00030.safetensors", 
"model.layers.33.self_attn.q_proj.weight": "model-00011-of-00030.safetensors", "model.layers.33.self_attn.v_proj.weight": "model-00011-of-00030.safetensors", "model.layers.34.input_layernorm.weight": "model-00011-of-00030.safetensors", "model.layers.34.mlp.down_proj.weight": "model-00011-of-00030.safetensors", "model.layers.34.mlp.gate_proj.weight": "model-00012-of-00030.safetensors", "model.layers.34.mlp.up_proj.weight": "model-00012-of-00030.safetensors", "model.layers.34.post_attention_layernorm.weight": "model-00012-of-00030.safetensors", "model.layers.34.self_attn.k_proj.weight": "model-00012-of-00030.safetensors", "model.layers.34.self_attn.o_proj.weight": "model-00012-of-00030.safetensors", "model.layers.34.self_attn.q_proj.weight": "model-00012-of-00030.safetensors", "model.layers.34.self_attn.v_proj.weight": "model-00012-of-00030.safetensors", "model.layers.35.input_layernorm.weight": "model-00012-of-00030.safetensors", "model.layers.35.mlp.down_proj.weight": "model-00012-of-00030.safetensors", "model.layers.35.mlp.gate_proj.weight": "model-00012-of-00030.safetensors", "model.layers.35.mlp.up_proj.weight": "model-00012-of-00030.safetensors", "model.layers.35.post_attention_layernorm.weight": "model-00012-of-00030.safetensors", "model.layers.35.self_attn.k_proj.weight": "model-00012-of-00030.safetensors", "model.layers.35.self_attn.o_proj.weight": "model-00012-of-00030.safetensors", "model.layers.35.self_attn.q_proj.weight": "model-00012-of-00030.safetensors", "model.layers.35.self_attn.v_proj.weight": "model-00012-of-00030.safetensors", "model.layers.36.input_layernorm.weight": "model-00012-of-00030.safetensors", "model.layers.36.mlp.down_proj.weight": "model-00012-of-00030.safetensors", "model.layers.36.mlp.gate_proj.weight": "model-00012-of-00030.safetensors", "model.layers.36.mlp.up_proj.weight": "model-00012-of-00030.safetensors", "model.layers.36.post_attention_layernorm.weight": "model-00012-of-00030.safetensors", 
"model.layers.36.self_attn.k_proj.weight": "model-00012-of-00030.safetensors", "model.layers.36.self_attn.o_proj.weight": "model-00012-of-00030.safetensors", "model.layers.36.self_attn.q_proj.weight": "model-00012-of-00030.safetensors", "model.layers.36.self_attn.v_proj.weight": "model-00012-of-00030.safetensors", "model.layers.37.input_layernorm.weight": "model-00012-of-00030.safetensors", "model.layers.37.mlp.down_proj.weight": "model-00013-of-00030.safetensors", "model.layers.37.mlp.gate_proj.weight": "model-00013-of-00030.safetensors", "model.layers.37.mlp.up_proj.weight": "model-00013-of-00030.safetensors", "model.layers.37.post_attention_layernorm.weight": "model-00013-of-00030.safetensors", "model.layers.37.self_attn.k_proj.weight": "model-00013-of-00030.safetensors", "model.layers.37.self_attn.o_proj.weight": "model-00013-of-00030.safetensors", "model.layers.37.self_attn.q_proj.weight": "model-00013-of-00030.safetensors", "model.layers.37.self_attn.v_proj.weight": "model-00013-of-00030.safetensors", "model.layers.38.input_layernorm.weight": "model-00013-of-00030.safetensors", "model.layers.38.mlp.down_proj.weight": "model-00013-of-00030.safetensors", "model.layers.38.mlp.gate_proj.weight": "model-00013-of-00030.safetensors", "model.layers.38.mlp.up_proj.weight": "model-00013-of-00030.safetensors", "model.layers.38.post_attention_layernorm.weight": "model-00013-of-00030.safetensors", "model.layers.38.self_attn.k_proj.weight": "model-00013-of-00030.safetensors", "model.layers.38.self_attn.o_proj.weight": "model-00013-of-00030.safetensors", "model.layers.38.self_attn.q_proj.weight": "model-00013-of-00030.safetensors", "model.layers.38.self_attn.v_proj.weight": "model-00013-of-00030.safetensors", "model.layers.39.input_layernorm.weight": "model-00013-of-00030.safetensors", "model.layers.39.mlp.down_proj.weight": "model-00013-of-00030.safetensors", "model.layers.39.mlp.gate_proj.weight": "model-00013-of-00030.safetensors", "model.layers.39.mlp.up_proj.weight": 
"model-00013-of-00030.safetensors", "model.layers.39.post_attention_layernorm.weight": "model-00013-of-00030.safetensors", "model.layers.39.self_attn.k_proj.weight": "model-00013-of-00030.safetensors", "model.layers.39.self_attn.o_proj.weight": "model-00013-of-00030.safetensors", "model.layers.39.self_attn.q_proj.weight": "model-00014-of-00030.safetensors", "model.layers.39.self_attn.v_proj.weight": "model-00014-of-00030.safetensors", "model.layers.4.input_layernorm.weight": "model-00014-of-00030.safetensors", "model.layers.4.mlp.down_proj.weight": "model-00014-of-00030.safetensors", "model.layers.4.mlp.gate_proj.weight": "model-00014-of-00030.safetensors", "model.layers.4.mlp.up_proj.weight": "model-00014-of-00030.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00014-of-00030.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00014-of-00030.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00014-of-00030.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00014-of-00030.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00014-of-00030.safetensors", "model.layers.40.input_layernorm.weight": "model-00014-of-00030.safetensors", "model.layers.40.mlp.down_proj.weight": "model-00014-of-00030.safetensors", "model.layers.40.mlp.gate_proj.weight": "model-00014-of-00030.safetensors", "model.layers.40.mlp.up_proj.weight": "model-00014-of-00030.safetensors", "model.layers.40.post_attention_layernorm.weight": "model-00014-of-00030.safetensors", "model.layers.40.self_attn.k_proj.weight": "model-00014-of-00030.safetensors", "model.layers.40.self_attn.o_proj.weight": "model-00014-of-00030.safetensors", "model.layers.40.self_attn.q_proj.weight": "model-00014-of-00030.safetensors", "model.layers.40.self_attn.v_proj.weight": "model-00014-of-00030.safetensors", "model.layers.41.input_layernorm.weight": "model-00014-of-00030.safetensors", "model.layers.41.mlp.down_proj.weight": "model-00014-of-00030.safetensors", 
"model.layers.41.mlp.gate_proj.weight": "model-00014-of-00030.safetensors", "model.layers.41.mlp.up_proj.weight": "model-00014-of-00030.safetensors", "model.layers.41.post_attention_layernorm.weight": "model-00014-of-00030.safetensors", "model.layers.41.self_attn.k_proj.weight": "model-00014-of-00030.safetensors", "model.layers.41.self_attn.o_proj.weight": "model-00015-of-00030.safetensors", "model.layers.41.self_attn.q_proj.weight": "model-00015-of-00030.safetensors", "model.layers.41.self_attn.v_proj.weight": "model-00015-of-00030.safetensors", "model.layers.42.input_layernorm.weight": "model-00015-of-00030.safetensors", "model.layers.42.mlp.down_proj.weight": "model-00015-of-00030.safetensors", "model.layers.42.mlp.gate_proj.weight": "model-00015-of-00030.safetensors", "model.layers.42.mlp.up_proj.weight": "model-00015-of-00030.safetensors", "model.layers.42.post_attention_layernorm.weight": "model-00015-of-00030.safetensors", "model.layers.42.self_attn.k_proj.weight": "model-00015-of-00030.safetensors", "model.layers.42.self_attn.o_proj.weight": "model-00015-of-00030.safetensors", "model.layers.42.self_attn.q_proj.weight": "model-00015-of-00030.safetensors", "model.layers.42.self_attn.v_proj.weight": "model-00015-of-00030.safetensors", "model.layers.43.input_layernorm.weight": "model-00015-of-00030.safetensors", "model.layers.43.mlp.down_proj.weight": "model-00015-of-00030.safetensors", "model.layers.43.mlp.gate_proj.weight": "model-00015-of-00030.safetensors", "model.layers.43.mlp.up_proj.weight": "model-00015-of-00030.safetensors", "model.layers.43.post_attention_layernorm.weight": "model-00015-of-00030.safetensors", "model.layers.43.self_attn.k_proj.weight": "model-00015-of-00030.safetensors", "model.layers.43.self_attn.o_proj.weight": "model-00015-of-00030.safetensors", "model.layers.43.self_attn.q_proj.weight": "model-00015-of-00030.safetensors", "model.layers.43.self_attn.v_proj.weight": "model-00015-of-00030.safetensors", 
"model.layers.44.input_layernorm.weight": "model-00015-of-00030.safetensors", "model.layers.44.mlp.down_proj.weight": "model-00015-of-00030.safetensors", "model.layers.44.mlp.gate_proj.weight": "model-00015-of-00030.safetensors", "model.layers.44.mlp.up_proj.weight": "model-00016-of-00030.safetensors", "model.layers.44.post_attention_layernorm.weight": "model-00016-of-00030.safetensors", "model.layers.44.self_attn.k_proj.weight": "model-00016-of-00030.safetensors", "model.layers.44.self_attn.o_proj.weight": "model-00016-of-00030.safetensors", "model.layers.44.self_attn.q_proj.weight": "model-00016-of-00030.safetensors", "model.layers.44.self_attn.v_proj.weight": "model-00016-of-00030.safetensors", "model.layers.45.input_layernorm.weight": "model-00016-of-00030.safetensors", "model.layers.45.mlp.down_proj.weight": "model-00016-of-00030.safetensors", "model.layers.45.mlp.gate_proj.weight": "model-00016-of-00030.safetensors", "model.layers.45.mlp.up_proj.weight": "model-00016-of-00030.safetensors", "model.layers.45.post_attention_layernorm.weight": "model-00016-of-00030.safetensors", "model.layers.45.self_attn.k_proj.weight": "model-00016-of-00030.safetensors", "model.layers.45.self_attn.o_proj.weight": "model-00016-of-00030.safetensors", "model.layers.45.self_attn.q_proj.weight": "model-00016-of-00030.safetensors", "model.layers.45.self_attn.v_proj.weight": "model-00016-of-00030.safetensors", "model.layers.46.input_layernorm.weight": "model-00016-of-00030.safetensors", "model.layers.46.mlp.down_proj.weight": "model-00016-of-00030.safetensors", "model.layers.46.mlp.gate_proj.weight": "model-00016-of-00030.safetensors", "model.layers.46.mlp.up_proj.weight": "model-00016-of-00030.safetensors", "model.layers.46.post_attention_layernorm.weight": "model-00016-of-00030.safetensors", "model.layers.46.self_attn.k_proj.weight": "model-00016-of-00030.safetensors", "model.layers.46.self_attn.o_proj.weight": "model-00016-of-00030.safetensors", 
"model.layers.46.self_attn.q_proj.weight": "model-00016-of-00030.safetensors", "model.layers.46.self_attn.v_proj.weight": "model-00016-of-00030.safetensors", "model.layers.47.input_layernorm.weight": "model-00016-of-00030.safetensors", "model.layers.47.mlp.down_proj.weight": "model-00016-of-00030.safetensors", "model.layers.47.mlp.gate_proj.weight": "model-00017-of-00030.safetensors", "model.layers.47.mlp.up_proj.weight": "model-00017-of-00030.safetensors", "model.layers.47.post_attention_layernorm.weight": "model-00017-of-00030.safetensors", "model.layers.47.self_attn.k_proj.weight": "model-00017-of-00030.safetensors", "model.layers.47.self_attn.o_proj.weight": "model-00017-of-00030.safetensors", "model.layers.47.self_attn.q_proj.weight": "model-00017-of-00030.safetensors", "model.layers.47.self_attn.v_proj.weight": "model-00017-of-00030.safetensors", "model.layers.48.input_layernorm.weight": "model-00017-of-00030.safetensors", "model.layers.48.mlp.down_proj.weight": "model-00017-of-00030.safetensors", "model.layers.48.mlp.gate_proj.weight": "model-00017-of-00030.safetensors", "model.layers.48.mlp.up_proj.weight": "model-00017-of-00030.safetensors", "model.layers.48.post_attention_layernorm.weight": "model-00017-of-00030.safetensors", "model.layers.48.self_attn.k_proj.weight": "model-00017-of-00030.safetensors", "model.layers.48.self_attn.o_proj.weight": "model-00017-of-00030.safetensors", "model.layers.48.self_attn.q_proj.weight": "model-00017-of-00030.safetensors", "model.layers.48.self_attn.v_proj.weight": "model-00017-of-00030.safetensors", "model.layers.49.input_layernorm.weight": "model-00017-of-00030.safetensors", "model.layers.49.mlp.down_proj.weight": "model-00017-of-00030.safetensors", "model.layers.49.mlp.gate_proj.weight": "model-00017-of-00030.safetensors", "model.layers.49.mlp.up_proj.weight": "model-00017-of-00030.safetensors", "model.layers.49.post_attention_layernorm.weight": "model-00017-of-00030.safetensors", 
"model.layers.49.self_attn.k_proj.weight": "model-00017-of-00030.safetensors", "model.layers.49.self_attn.o_proj.weight": "model-00017-of-00030.safetensors", "model.layers.49.self_attn.q_proj.weight": "model-00017-of-00030.safetensors", "model.layers.49.self_attn.v_proj.weight": "model-00017-of-00030.safetensors", "model.layers.5.input_layernorm.weight": "model-00017-of-00030.safetensors", "model.layers.5.mlp.down_proj.weight": "model-00018-of-00030.safetensors", "model.layers.5.mlp.gate_proj.weight": "model-00018-of-00030.safetensors", "model.layers.5.mlp.up_proj.weight": "model-00018-of-00030.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00018-of-00030.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00018-of-00030.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00018-of-00030.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00018-of-00030.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00018-of-00030.safetensors", "model.layers.50.input_layernorm.weight": "model-00018-of-00030.safetensors", "model.layers.50.mlp.down_proj.weight": "model-00018-of-00030.safetensors", "model.layers.50.mlp.gate_proj.weight": "model-00018-of-00030.safetensors", "model.layers.50.mlp.up_proj.weight": "model-00018-of-00030.safetensors", "model.layers.50.post_attention_layernorm.weight": "model-00018-of-00030.safetensors", "model.layers.50.self_attn.k_proj.weight": "model-00018-of-00030.safetensors", "model.layers.50.self_attn.o_proj.weight": "model-00018-of-00030.safetensors", "model.layers.50.self_attn.q_proj.weight": "model-00018-of-00030.safetensors", "model.layers.50.self_attn.v_proj.weight": "model-00018-of-00030.safetensors", "model.layers.51.input_layernorm.weight": "model-00018-of-00030.safetensors", "model.layers.51.mlp.down_proj.weight": "model-00018-of-00030.safetensors", "model.layers.51.mlp.gate_proj.weight": "model-00018-of-00030.safetensors", "model.layers.51.mlp.up_proj.weight": 
"model-00018-of-00030.safetensors", "model.layers.51.post_attention_layernorm.weight": "model-00018-of-00030.safetensors", "model.layers.51.self_attn.k_proj.weight": "model-00018-of-00030.safetensors", "model.layers.51.self_attn.o_proj.weight": "model-00018-of-00030.safetensors", "model.layers.51.self_attn.q_proj.weight": "model-00019-of-00030.safetensors", "model.layers.51.self_attn.v_proj.weight": "model-00019-of-00030.safetensors", "model.layers.52.input_layernorm.weight": "model-00019-of-00030.safetensors", "model.layers.52.mlp.down_proj.weight": "model-00019-of-00030.safetensors", "model.layers.52.mlp.gate_proj.weight": "model-00019-of-00030.safetensors", "model.layers.52.mlp.up_proj.weight": "model-00019-of-00030.safetensors", "model.layers.52.post_attention_layernorm.weight": "model-00019-of-00030.safetensors", "model.layers.52.self_attn.k_proj.weight": "model-00019-of-00030.safetensors", "model.layers.52.self_attn.o_proj.weight": "model-00019-of-00030.safetensors", "model.layers.52.self_attn.q_proj.weight": "model-00019-of-00030.safetensors", "model.layers.52.self_attn.v_proj.weight": "model-00019-of-00030.safetensors", "model.layers.53.input_layernorm.weight": "model-00019-of-00030.safetensors", "model.layers.53.mlp.down_proj.weight": "model-00019-of-00030.safetensors", "model.layers.53.mlp.gate_proj.weight": "model-00019-of-00030.safetensors", "model.layers.53.mlp.up_proj.weight": "model-00019-of-00030.safetensors", "model.layers.53.post_attention_layernorm.weight": "model-00019-of-00030.safetensors", "model.layers.53.self_attn.k_proj.weight": "model-00019-of-00030.safetensors", "model.layers.53.self_attn.o_proj.weight": "model-00019-of-00030.safetensors", "model.layers.53.self_attn.q_proj.weight": "model-00019-of-00030.safetensors", "model.layers.53.self_attn.v_proj.weight": "model-00019-of-00030.safetensors", "model.layers.54.input_layernorm.weight": "model-00019-of-00030.safetensors", "model.layers.54.mlp.down_proj.weight": 
"model-00019-of-00030.safetensors", "model.layers.54.mlp.gate_proj.weight": "model-00019-of-00030.safetensors", "model.layers.54.mlp.up_proj.weight": "model-00019-of-00030.safetensors", "model.layers.54.post_attention_layernorm.weight": "model-00019-of-00030.safetensors", "model.layers.54.self_attn.k_proj.weight": "model-00019-of-00030.safetensors", "model.layers.54.self_attn.o_proj.weight": "model-00020-of-00030.safetensors", "model.layers.54.self_attn.q_proj.weight": "model-00020-of-00030.safetensors", "model.layers.54.self_attn.v_proj.weight": "model-00020-of-00030.safetensors", "model.layers.55.input_layernorm.weight": "model-00020-of-00030.safetensors", "model.layers.55.mlp.down_proj.weight": "model-00020-of-00030.safetensors", "model.layers.55.mlp.gate_proj.weight": "model-00020-of-00030.safetensors", "model.layers.55.mlp.up_proj.weight": "model-00020-of-00030.safetensors", "model.layers.55.post_attention_layernorm.weight": "model-00020-of-00030.safetensors", "model.layers.55.self_attn.k_proj.weight": "model-00020-of-00030.safetensors", "model.layers.55.self_attn.o_proj.weight": "model-00020-of-00030.safetensors", "model.layers.55.self_attn.q_proj.weight": "model-00020-of-00030.safetensors", "model.layers.55.self_attn.v_proj.weight": "model-00020-of-00030.safetensors", "model.layers.56.input_layernorm.weight": "model-00020-of-00030.safetensors", "model.layers.56.mlp.down_proj.weight": "model-00020-of-00030.safetensors", "model.layers.56.mlp.gate_proj.weight": "model-00020-of-00030.safetensors", "model.layers.56.mlp.up_proj.weight": "model-00020-of-00030.safetensors", "model.layers.56.post_attention_layernorm.weight": "model-00020-of-00030.safetensors", "model.layers.56.self_attn.k_proj.weight": "model-00020-of-00030.safetensors", "model.layers.56.self_attn.o_proj.weight": "model-00020-of-00030.safetensors", "model.layers.56.self_attn.q_proj.weight": "model-00020-of-00030.safetensors", "model.layers.56.self_attn.v_proj.weight": 
"model-00020-of-00030.safetensors", "model.layers.57.input_layernorm.weight": "model-00020-of-00030.safetensors", "model.layers.57.mlp.down_proj.weight": "model-00020-of-00030.safetensors", "model.layers.57.mlp.gate_proj.weight": "model-00020-of-00030.safetensors", "model.layers.57.mlp.up_proj.weight": "model-00021-of-00030.safetensors", "model.layers.57.post_attention_layernorm.weight": "model-00021-of-00030.safetensors", "model.layers.57.self_attn.k_proj.weight": "model-00021-of-00030.safetensors", "model.layers.57.self_attn.o_proj.weight": "model-00021-of-00030.safetensors", "model.layers.57.self_attn.q_proj.weight": "model-00021-of-00030.safetensors", "model.layers.57.self_attn.v_proj.weight": "model-00021-of-00030.safetensors", "model.layers.58.input_layernorm.weight": "model-00021-of-00030.safetensors", "model.layers.58.mlp.down_proj.weight": "model-00021-of-00030.safetensors", "model.layers.58.mlp.gate_proj.weight": "model-00021-of-00030.safetensors", "model.layers.58.mlp.up_proj.weight": "model-00021-of-00030.safetensors", "model.layers.58.post_attention_layernorm.weight": "model-00021-of-00030.safetensors", "model.layers.58.self_attn.k_proj.weight": "model-00021-of-00030.safetensors", "model.layers.58.self_attn.o_proj.weight": "model-00021-of-00030.safetensors", "model.layers.58.self_attn.q_proj.weight": "model-00021-of-00030.safetensors", "model.layers.58.self_attn.v_proj.weight": "model-00021-of-00030.safetensors", "model.layers.59.input_layernorm.weight": "model-00021-of-00030.safetensors", "model.layers.59.mlp.down_proj.weight": "model-00021-of-00030.safetensors", "model.layers.59.mlp.gate_proj.weight": "model-00021-of-00030.safetensors", "model.layers.59.mlp.up_proj.weight": "model-00021-of-00030.safetensors", "model.layers.59.post_attention_layernorm.weight": "model-00021-of-00030.safetensors", "model.layers.59.self_attn.k_proj.weight": "model-00021-of-00030.safetensors", "model.layers.59.self_attn.o_proj.weight": "model-00021-of-00030.safetensors", 
"model.layers.59.self_attn.q_proj.weight": "model-00021-of-00030.safetensors", "model.layers.59.self_attn.v_proj.weight": "model-00021-of-00030.safetensors", "model.layers.6.input_layernorm.weight": "model-00021-of-00030.safetensors", "model.layers.6.mlp.down_proj.weight": "model-00021-of-00030.safetensors", "model.layers.6.mlp.gate_proj.weight": "model-00022-of-00030.safetensors", "model.layers.6.mlp.up_proj.weight": "model-00022-of-00030.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00022-of-00030.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00022-of-00030.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00022-of-00030.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00022-of-00030.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00022-of-00030.safetensors", "model.layers.60.input_layernorm.weight": "model-00022-of-00030.safetensors", "model.layers.60.mlp.down_proj.weight": "model-00022-of-00030.safetensors", "model.layers.60.mlp.gate_proj.weight": "model-00022-of-00030.safetensors", "model.layers.60.mlp.up_proj.weight": "model-00022-of-00030.safetensors", "model.layers.60.post_attention_layernorm.weight": "model-00022-of-00030.safetensors", "model.layers.60.self_attn.k_proj.weight": "model-00022-of-00030.safetensors", "model.layers.60.self_attn.o_proj.weight": "model-00022-of-00030.safetensors", "model.layers.60.self_attn.q_proj.weight": "model-00022-of-00030.safetensors", "model.layers.60.self_attn.v_proj.weight": "model-00022-of-00030.safetensors", "model.layers.61.input_layernorm.weight": "model-00022-of-00030.safetensors", "model.layers.61.mlp.down_proj.weight": "model-00022-of-00030.safetensors", "model.layers.61.mlp.gate_proj.weight": "model-00022-of-00030.safetensors", "model.layers.61.mlp.up_proj.weight": "model-00022-of-00030.safetensors", "model.layers.61.post_attention_layernorm.weight": "model-00022-of-00030.safetensors", "model.layers.61.self_attn.k_proj.weight": 
"model-00022-of-00030.safetensors", "model.layers.61.self_attn.o_proj.weight": "model-00022-of-00030.safetensors", "model.layers.61.self_attn.q_proj.weight": "model-00022-of-00030.safetensors", "model.layers.61.self_attn.v_proj.weight": "model-00022-of-00030.safetensors", "model.layers.62.input_layernorm.weight": "model-00022-of-00030.safetensors", "model.layers.62.mlp.down_proj.weight": "model-00023-of-00030.safetensors", "model.layers.62.mlp.gate_proj.weight": "model-00023-of-00030.safetensors", "model.layers.62.mlp.up_proj.weight": "model-00023-of-00030.safetensors", "model.layers.62.post_attention_layernorm.weight": "model-00023-of-00030.safetensors", "model.layers.62.self_attn.k_proj.weight": "model-00023-of-00030.safetensors", "model.layers.62.self_attn.o_proj.weight": "model-00023-of-00030.safetensors", "model.layers.62.self_attn.q_proj.weight": "model-00023-of-00030.safetensors", "model.layers.62.self_attn.v_proj.weight": "model-00023-of-00030.safetensors", "model.layers.63.input_layernorm.weight": "model-00023-of-00030.safetensors", "model.layers.63.mlp.down_proj.weight": "model-00023-of-00030.safetensors", "model.layers.63.mlp.gate_proj.weight": "model-00023-of-00030.safetensors", "model.layers.63.mlp.up_proj.weight": "model-00023-of-00030.safetensors", "model.layers.63.post_attention_layernorm.weight": "model-00023-of-00030.safetensors", "model.layers.63.self_attn.k_proj.weight": "model-00023-of-00030.safetensors", "model.layers.63.self_attn.o_proj.weight": "model-00023-of-00030.safetensors", "model.layers.63.self_attn.q_proj.weight": "model-00023-of-00030.safetensors", "model.layers.63.self_attn.v_proj.weight": "model-00023-of-00030.safetensors", "model.layers.64.input_layernorm.weight": "model-00023-of-00030.safetensors", "model.layers.64.mlp.down_proj.weight": "model-00023-of-00030.safetensors", "model.layers.64.mlp.gate_proj.weight": "model-00023-of-00030.safetensors", "model.layers.64.mlp.up_proj.weight": "model-00023-of-00030.safetensors", 
"model.layers.64.post_attention_layernorm.weight": "model-00023-of-00030.safetensors", "model.layers.64.self_attn.k_proj.weight": "model-00023-of-00030.safetensors", "model.layers.64.self_attn.o_proj.weight": "model-00023-of-00030.safetensors", "model.layers.64.self_attn.q_proj.weight": "model-00024-of-00030.safetensors", "model.layers.64.self_attn.v_proj.weight": "model-00024-of-00030.safetensors", "model.layers.65.input_layernorm.weight": "model-00024-of-00030.safetensors", "model.layers.65.mlp.down_proj.weight": "model-00024-of-00030.safetensors", "model.layers.65.mlp.gate_proj.weight": "model-00024-of-00030.safetensors", "model.layers.65.mlp.up_proj.weight": "model-00024-of-00030.safetensors", "model.layers.65.post_attention_layernorm.weight": "model-00024-of-00030.safetensors", "model.layers.65.self_attn.k_proj.weight": "model-00024-of-00030.safetensors", "model.layers.65.self_attn.o_proj.weight": "model-00024-of-00030.safetensors", "model.layers.65.self_attn.q_proj.weight": "model-00024-of-00030.safetensors", "model.layers.65.self_attn.v_proj.weight": "model-00024-of-00030.safetensors", "model.layers.66.input_layernorm.weight": "model-00024-of-00030.safetensors", "model.layers.66.mlp.down_proj.weight": "model-00024-of-00030.safetensors", "model.layers.66.mlp.gate_proj.weight": "model-00024-of-00030.safetensors", "model.layers.66.mlp.up_proj.weight": "model-00024-of-00030.safetensors", "model.layers.66.post_attention_layernorm.weight": "model-00024-of-00030.safetensors", "model.layers.66.self_attn.k_proj.weight": "model-00024-of-00030.safetensors", "model.layers.66.self_attn.o_proj.weight": "model-00024-of-00030.safetensors", "model.layers.66.self_attn.q_proj.weight": "model-00024-of-00030.safetensors", "model.layers.66.self_attn.v_proj.weight": "model-00024-of-00030.safetensors", "model.layers.67.input_layernorm.weight": "model-00024-of-00030.safetensors", "model.layers.67.mlp.down_proj.weight": "model-00024-of-00030.safetensors", 
"model.layers.67.mlp.gate_proj.weight": "model-00024-of-00030.safetensors", "model.layers.67.mlp.up_proj.weight": "model-00024-of-00030.safetensors", "model.layers.67.post_attention_layernorm.weight": "model-00024-of-00030.safetensors", "model.layers.67.self_attn.k_proj.weight": "model-00024-of-00030.safetensors", "model.layers.67.self_attn.o_proj.weight": "model-00025-of-00030.safetensors", "model.layers.67.self_attn.q_proj.weight": "model-00025-of-00030.safetensors", "model.layers.67.self_attn.v_proj.weight": "model-00025-of-00030.safetensors", "model.layers.68.input_layernorm.weight": "model-00025-of-00030.safetensors", "model.layers.68.mlp.down_proj.weight": "model-00025-of-00030.safetensors", "model.layers.68.mlp.gate_proj.weight": "model-00025-of-00030.safetensors", "model.layers.68.mlp.up_proj.weight": "model-00025-of-00030.safetensors", "model.layers.68.post_attention_layernorm.weight": "model-00025-of-00030.safetensors", "model.layers.68.self_attn.k_proj.weight": "model-00025-of-00030.safetensors", "model.layers.68.self_attn.o_proj.weight": "model-00025-of-00030.safetensors", "model.layers.68.self_attn.q_proj.weight": "model-00025-of-00030.safetensors", "model.layers.68.self_attn.v_proj.weight": "model-00025-of-00030.safetensors", "model.layers.69.input_layernorm.weight": "model-00025-of-00030.safetensors", "model.layers.69.mlp.down_proj.weight": "model-00025-of-00030.safetensors", "model.layers.69.mlp.gate_proj.weight": "model-00025-of-00030.safetensors", "model.layers.69.mlp.up_proj.weight": "model-00025-of-00030.safetensors", "model.layers.69.post_attention_layernorm.weight": "model-00025-of-00030.safetensors", "model.layers.69.self_attn.k_proj.weight": "model-00025-of-00030.safetensors", "model.layers.69.self_attn.o_proj.weight": "model-00025-of-00030.safetensors", "model.layers.69.self_attn.q_proj.weight": "model-00025-of-00030.safetensors", "model.layers.69.self_attn.v_proj.weight": "model-00025-of-00030.safetensors", 
"model.layers.7.input_layernorm.weight": "model-00025-of-00030.safetensors", "model.layers.7.mlp.down_proj.weight": "model-00025-of-00030.safetensors", "model.layers.7.mlp.gate_proj.weight": "model-00025-of-00030.safetensors", "model.layers.7.mlp.up_proj.weight": "model-00026-of-00030.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00026-of-00030.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00026-of-00030.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00026-of-00030.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00026-of-00030.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00026-of-00030.safetensors", "model.layers.70.input_layernorm.weight": "model-00026-of-00030.safetensors", "model.layers.70.mlp.down_proj.weight": "model-00026-of-00030.safetensors", "model.layers.70.mlp.gate_proj.weight": "model-00026-of-00030.safetensors", "model.layers.70.mlp.up_proj.weight": "model-00026-of-00030.safetensors", "model.layers.70.post_attention_layernorm.weight": "model-00026-of-00030.safetensors", "model.layers.70.self_attn.k_proj.weight": "model-00026-of-00030.safetensors", "model.layers.70.self_attn.o_proj.weight": "model-00026-of-00030.safetensors", "model.layers.70.self_attn.q_proj.weight": "model-00026-of-00030.safetensors", "model.layers.70.self_attn.v_proj.weight": "model-00026-of-00030.safetensors", "model.layers.71.input_layernorm.weight": "model-00026-of-00030.safetensors", "model.layers.71.mlp.down_proj.weight": "model-00026-of-00030.safetensors", "model.layers.71.mlp.gate_proj.weight": "model-00026-of-00030.safetensors", "model.layers.71.mlp.up_proj.weight": "model-00026-of-00030.safetensors", "model.layers.71.post_attention_layernorm.weight": "model-00026-of-00030.safetensors", "model.layers.71.self_attn.k_proj.weight": "model-00026-of-00030.safetensors", "model.layers.71.self_attn.o_proj.weight": "model-00026-of-00030.safetensors", "model.layers.71.self_attn.q_proj.weight": 
"model-00026-of-00030.safetensors", "model.layers.71.self_attn.v_proj.weight": "model-00026-of-00030.safetensors", "model.layers.72.input_layernorm.weight": "model-00026-of-00030.safetensors", "model.layers.72.mlp.down_proj.weight": "model-00026-of-00030.safetensors", "model.layers.72.mlp.gate_proj.weight": "model-00027-of-00030.safetensors", "model.layers.72.mlp.up_proj.weight": "model-00027-of-00030.safetensors", "model.layers.72.post_attention_layernorm.weight": "model-00027-of-00030.safetensors", "model.layers.72.self_attn.k_proj.weight": "model-00027-of-00030.safetensors", "model.layers.72.self_attn.o_proj.weight": "model-00027-of-00030.safetensors", "model.layers.72.self_attn.q_proj.weight": "model-00027-of-00030.safetensors", "model.layers.72.self_attn.v_proj.weight": "model-00027-of-00030.safetensors", "model.layers.73.input_layernorm.weight": "model-00027-of-00030.safetensors", "model.layers.73.mlp.down_proj.weight": "model-00027-of-00030.safetensors", "model.layers.73.mlp.gate_proj.weight": "model-00027-of-00030.safetensors", "model.layers.73.mlp.up_proj.weight": "model-00027-of-00030.safetensors", "model.layers.73.post_attention_layernorm.weight": "model-00027-of-00030.safetensors", "model.layers.73.self_attn.k_proj.weight": "model-00027-of-00030.safetensors", "model.layers.73.self_attn.o_proj.weight": "model-00027-of-00030.safetensors", "model.layers.73.self_attn.q_proj.weight": "model-00027-of-00030.safetensors", "model.layers.73.self_attn.v_proj.weight": "model-00027-of-00030.safetensors", "model.layers.74.input_layernorm.weight": "model-00027-of-00030.safetensors", "model.layers.74.mlp.down_proj.weight": "model-00027-of-00030.safetensors", "model.layers.74.mlp.gate_proj.weight": "model-00027-of-00030.safetensors", "model.layers.74.mlp.up_proj.weight": "model-00027-of-00030.safetensors", "model.layers.74.post_attention_layernorm.weight": "model-00027-of-00030.safetensors", "model.layers.74.self_attn.k_proj.weight": "model-00027-of-00030.safetensors", 
"model.layers.74.self_attn.o_proj.weight": "model-00027-of-00030.safetensors", "model.layers.74.self_attn.q_proj.weight": "model-00027-of-00030.safetensors", "model.layers.74.self_attn.v_proj.weight": "model-00027-of-00030.safetensors", "model.layers.75.input_layernorm.weight": "model-00027-of-00030.safetensors", "model.layers.75.mlp.down_proj.weight": "model-00028-of-00030.safetensors", "model.layers.75.mlp.gate_proj.weight": "model-00028-of-00030.safetensors", "model.layers.75.mlp.up_proj.weight": "model-00028-of-00030.safetensors", "model.layers.75.post_attention_layernorm.weight": "model-00028-of-00030.safetensors", "model.layers.75.self_attn.k_proj.weight": "model-00028-of-00030.safetensors", "model.layers.75.self_attn.o_proj.weight": "model-00028-of-00030.safetensors", "model.layers.75.self_attn.q_proj.weight": "model-00028-of-00030.safetensors", "model.layers.75.self_attn.v_proj.weight": "model-00028-of-00030.safetensors", "model.layers.76.input_layernorm.weight": "model-00028-of-00030.safetensors", "model.layers.76.mlp.down_proj.weight": "model-00028-of-00030.safetensors", "model.layers.76.mlp.gate_proj.weight": "model-00028-of-00030.safetensors", "model.layers.76.mlp.up_proj.weight": "model-00028-of-00030.safetensors", "model.layers.76.post_attention_layernorm.weight": "model-00028-of-00030.safetensors", "model.layers.76.self_attn.k_proj.weight": "model-00028-of-00030.safetensors", "model.layers.76.self_attn.o_proj.weight": "model-00028-of-00030.safetensors", "model.layers.76.self_attn.q_proj.weight": "model-00028-of-00030.safetensors", "model.layers.76.self_attn.v_proj.weight": "model-00028-of-00030.safetensors", "model.layers.77.input_layernorm.weight": "model-00028-of-00030.safetensors", "model.layers.77.mlp.down_proj.weight": "model-00028-of-00030.safetensors", "model.layers.77.mlp.gate_proj.weight": "model-00028-of-00030.safetensors", "model.layers.77.mlp.up_proj.weight": "model-00028-of-00030.safetensors", 
"model.layers.77.post_attention_layernorm.weight": "model-00028-of-00030.safetensors", "model.layers.77.self_attn.k_proj.weight": "model-00028-of-00030.safetensors", "model.layers.77.self_attn.o_proj.weight": "model-00028-of-00030.safetensors", "model.layers.77.self_attn.q_proj.weight": "model-00029-of-00030.safetensors", "model.layers.77.self_attn.v_proj.weight": "model-00029-of-00030.safetensors", "model.layers.78.input_layernorm.weight": "model-00029-of-00030.safetensors", "model.layers.78.mlp.down_proj.weight": "model-00029-of-00030.safetensors", "model.layers.78.mlp.gate_proj.weight": "model-00029-of-00030.safetensors", "model.layers.78.mlp.up_proj.weight": "model-00029-of-00030.safetensors", "model.layers.78.post_attention_layernorm.weight": "model-00029-of-00030.safetensors", "model.layers.78.self_attn.k_proj.weight": "model-00029-of-00030.safetensors", "model.layers.78.self_attn.o_proj.weight": "model-00029-of-00030.safetensors", "model.layers.78.self_attn.q_proj.weight": "model-00029-of-00030.safetensors", "model.layers.78.self_attn.v_proj.weight": "model-00029-of-00030.safetensors", "model.layers.79.input_layernorm.weight": "model-00029-of-00030.safetensors", "model.layers.79.mlp.down_proj.weight": "model-00029-of-00030.safetensors", "model.layers.79.mlp.gate_proj.weight": "model-00029-of-00030.safetensors", "model.layers.79.mlp.up_proj.weight": "model-00029-of-00030.safetensors", "model.layers.79.post_attention_layernorm.weight": "model-00029-of-00030.safetensors", "model.layers.79.self_attn.k_proj.weight": "model-00029-of-00030.safetensors", "model.layers.79.self_attn.o_proj.weight": "model-00029-of-00030.safetensors", "model.layers.79.self_attn.q_proj.weight": "model-00029-of-00030.safetensors", "model.layers.79.self_attn.v_proj.weight": "model-00029-of-00030.safetensors", "model.layers.8.input_layernorm.weight": "model-00029-of-00030.safetensors", "model.layers.8.mlp.down_proj.weight": "model-00029-of-00030.safetensors", 
"model.layers.8.mlp.gate_proj.weight": "model-00029-of-00030.safetensors", "model.layers.8.mlp.up_proj.weight": "model-00029-of-00030.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00029-of-00030.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00029-of-00030.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00030-of-00030.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00030-of-00030.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00030-of-00030.safetensors", "model.layers.9.input_layernorm.weight": "model-00030-of-00030.safetensors", "model.layers.9.mlp.down_proj.weight": "model-00030-of-00030.safetensors", "model.layers.9.mlp.gate_proj.weight": "model-00030-of-00030.safetensors", "model.layers.9.mlp.up_proj.weight": "model-00030-of-00030.safetensors", "model.layers.9.post_attention_layernorm.weight": "model-00030-of-00030.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00030-of-00030.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00030-of-00030.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00030-of-00030.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00030-of-00030.safetensors", "model.norm.weight": "model-00030-of-00030.safetensors"}}
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "bos_token": {
+     "content": "<|begin_of_text|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|end_of_text|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<|end_of_text|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,2064 @@
+ {
+   "added_tokens_decoder": {
+     "128000": {
+       "content": "<|begin_of_text|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128001": {
+       "content": "<|end_of_text|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128002": {
+       "content": "<|reserved_special_token_0|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128003": {
+       "content": "<|reserved_special_token_1|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128004": {
+       "content": "<|reserved_special_token_2|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128005": {
+       "content": "<|reserved_special_token_3|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128006": {
+       "content": "<|start_header_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128007": {
+       "content": "<|end_header_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128008": {
+       "content": "<|reserved_special_token_4|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128009": {
+       "content": "<|eot_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128010": {
+       "content": "<|reserved_special_token_5|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128011": {
+       "content": "<|reserved_special_token_6|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128012": {
+       "content": "<|reserved_special_token_7|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128013": {
+       "content": "<|reserved_special_token_8|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128014": {
+       "content": "<|reserved_special_token_9|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128015": {
+       "content": "<|reserved_special_token_10|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128016": {
+       "content": "<|reserved_special_token_11|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128017": {
+       "content": "<|reserved_special_token_12|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128018": {
+       "content": "<|reserved_special_token_13|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128019": {
+       "content": "<|reserved_special_token_14|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128020": {
+       "content": "<|reserved_special_token_15|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128021": {
+       "content": "<|reserved_special_token_16|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128022": {
+       "content": "<|reserved_special_token_17|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128023": {
+       "content": "<|reserved_special_token_18|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128024": {
+       "content": "<|reserved_special_token_19|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128025": {
+       "content": "<|reserved_special_token_20|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128026": {
+       "content": "<|reserved_special_token_21|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128027": {
+       "content": "<|reserved_special_token_22|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128028": {
+       "content": "<|reserved_special_token_23|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128029": {
+       "content": "<|reserved_special_token_24|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128030": {
+       "content": "<|reserved_special_token_25|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128031": {
+       "content": "<|reserved_special_token_26|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128032": {
+       "content": "<|reserved_special_token_27|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128033": {
+       "content": "<|reserved_special_token_28|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128034": {
+       "content": "<|reserved_special_token_29|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128035": {
+       "content": "<|reserved_special_token_30|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128036": {
+       "content": "<|reserved_special_token_31|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128037": {
+       "content": "<|reserved_special_token_32|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128038": {
+       "content": "<|reserved_special_token_33|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128039": {
+       "content": "<|reserved_special_token_34|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128040": {
+       "content": "<|reserved_special_token_35|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128041": {
+       "content": "<|reserved_special_token_36|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128042": {
+       "content": "<|reserved_special_token_37|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128043": {
+       "content": "<|reserved_special_token_38|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128044": {
+       "content": "<|reserved_special_token_39|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128045": {
+       "content": "<|reserved_special_token_40|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128046": {
+       "content": "<|reserved_special_token_41|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128047": {
+       "content": "<|reserved_special_token_42|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128048": {
+       "content": "<|reserved_special_token_43|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128049": {
+       "content": "<|reserved_special_token_44|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128050": {
+       "content": "<|reserved_special_token_45|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128051": {
+       "content": "<|reserved_special_token_46|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128052": {
+       "content": "<|reserved_special_token_47|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128053": {
+       "content": "<|reserved_special_token_48|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128054": {
+       "content": "<|reserved_special_token_49|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128055": {
+       "content": "<|reserved_special_token_50|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128056": {
+       "content": "<|reserved_special_token_51|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128057": {
+       "content": "<|reserved_special_token_52|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128058": {
+       "content": "<|reserved_special_token_53|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128059": {
+       "content": "<|reserved_special_token_54|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128060": {
+       "content": "<|reserved_special_token_55|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128061": {
+       "content": "<|reserved_special_token_56|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128062": {
+       "content": "<|reserved_special_token_57|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128063": {
+       "content": "<|reserved_special_token_58|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128064": {
+       "content": "<|reserved_special_token_59|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128065": {
+       "content": "<|reserved_special_token_60|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128066": {
+       "content": "<|reserved_special_token_61|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128067": {
+       "content": "<|reserved_special_token_62|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128068": {
+       "content": "<|reserved_special_token_63|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128069": {
+       "content": "<|reserved_special_token_64|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128070": {
+       "content": "<|reserved_special_token_65|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128071": {
+       "content": "<|reserved_special_token_66|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128072": {
+       "content": "<|reserved_special_token_67|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128073": {
+       "content": "<|reserved_special_token_68|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128074": {
+       "content": "<|reserved_special_token_69|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128075": {
+       "content": "<|reserved_special_token_70|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128076": {
+       "content": "<|reserved_special_token_71|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128077": {
+       "content": "<|reserved_special_token_72|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128078": {
+       "content": "<|reserved_special_token_73|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128079": {
+       "content": "<|reserved_special_token_74|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128080": {
+       "content": "<|reserved_special_token_75|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128081": {
+       "content": "<|reserved_special_token_76|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128082": {
+       "content": "<|reserved_special_token_77|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128083": {
+       "content": "<|reserved_special_token_78|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128084": {
+       "content": "<|reserved_special_token_79|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128085": {
+       "content": "<|reserved_special_token_80|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128086": {
+       "content": "<|reserved_special_token_81|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128087": {
+       "content": "<|reserved_special_token_82|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128088": {
+       "content": "<|reserved_special_token_83|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128089": {
+       "content": "<|reserved_special_token_84|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128090": {
+       "content": "<|reserved_special_token_85|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128091": {
+       "content": "<|reserved_special_token_86|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128092": {
+       "content": "<|reserved_special_token_87|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128093": {
+       "content": "<|reserved_special_token_88|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128094": {
+       "content": "<|reserved_special_token_89|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128095": {
+       "content": "<|reserved_special_token_90|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128096": {
+       "content": "<|reserved_special_token_91|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128097": {
+       "content": "<|reserved_special_token_92|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128098": {
+       "content": "<|reserved_special_token_93|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128099": {
+       "content": "<|reserved_special_token_94|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128100": {
+       "content": "<|reserved_special_token_95|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128101": {
+       "content": "<|reserved_special_token_96|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128102": {
+       "content": "<|reserved_special_token_97|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128103": {
+       "content": "<|reserved_special_token_98|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128104": {
+       "content": "<|reserved_special_token_99|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128105": {
+       "content": "<|reserved_special_token_100|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128106": {
+       "content": "<|reserved_special_token_101|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128107": {
+       "content": "<|reserved_special_token_102|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128108": {
+       "content": "<|reserved_special_token_103|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128109": {
+       "content": "<|reserved_special_token_104|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128110": {
+       "content": "<|reserved_special_token_105|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128111": {
+       "content": "<|reserved_special_token_106|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128112": {
+       "content": "<|reserved_special_token_107|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128113": {
+       "content": "<|reserved_special_token_108|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128114": {
+       "content": "<|reserved_special_token_109|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128115": {
+       "content": "<|reserved_special_token_110|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128116": {
+       "content": "<|reserved_special_token_111|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128117": {
+       "content": "<|reserved_special_token_112|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128118": {
+       "content": "<|reserved_special_token_113|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128119": {
+       "content": "<|reserved_special_token_114|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128120": {
+       "content": "<|reserved_special_token_115|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128121": {
+       "content": "<|reserved_special_token_116|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128122": {
+       "content": "<|reserved_special_token_117|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128123": {
+       "content": "<|reserved_special_token_118|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128124": {
+       "content": "<|reserved_special_token_119|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128125": {
+       "content": "<|reserved_special_token_120|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128126": {
+       "content": "<|reserved_special_token_121|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128127": {
+       "content": "<|reserved_special_token_122|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128128": {
+       "content": "<|reserved_special_token_123|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128129": {
+       "content": "<|reserved_special_token_124|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128130": {
+       "content": "<|reserved_special_token_125|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128131": {
+       "content": "<|reserved_special_token_126|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128132": {
+       "content": "<|reserved_special_token_127|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128133": {
+       "content": "<|reserved_special_token_128|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128134": {
+       "content": "<|reserved_special_token_129|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128135": {
+       "content": "<|reserved_special_token_130|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128136": {
+       "content": "<|reserved_special_token_131|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128137": {
+       "content": "<|reserved_special_token_132|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128138": {
+       "content": "<|reserved_special_token_133|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128139": {
+       "content": "<|reserved_special_token_134|>",
1117
+ "lstrip": false,
1118
+ "normalized": false,
1119
+ "rstrip": false,
1120
+ "single_word": false,
1121
+ "special": true
1122
+ },
1123
+ "128140": {
1124
+ "content": "<|reserved_special_token_135|>",
1125
+ "lstrip": false,
1126
+ "normalized": false,
1127
+ "rstrip": false,
1128
+ "single_word": false,
1129
+ "special": true
1130
+ },
1131
+ "128141": {
1132
+ "content": "<|reserved_special_token_136|>",
1133
+ "lstrip": false,
1134
+ "normalized": false,
1135
+ "rstrip": false,
1136
+ "single_word": false,
1137
+ "special": true
1138
+ },
1139
+ "128142": {
1140
+ "content": "<|reserved_special_token_137|>",
1141
+ "lstrip": false,
1142
+ "normalized": false,
1143
+ "rstrip": false,
1144
+ "single_word": false,
1145
+ "special": true
1146
+ },
1147
+ "128143": {
1148
+ "content": "<|reserved_special_token_138|>",
1149
+ "lstrip": false,
1150
+ "normalized": false,
1151
+ "rstrip": false,
1152
+ "single_word": false,
1153
+ "special": true
1154
+ },
1155
+ "128144": {
1156
+ "content": "<|reserved_special_token_139|>",
1157
+ "lstrip": false,
1158
+ "normalized": false,
1159
+ "rstrip": false,
1160
+ "single_word": false,
1161
+ "special": true
1162
+ },
1163
+ "128145": {
1164
+ "content": "<|reserved_special_token_140|>",
1165
+ "lstrip": false,
1166
+ "normalized": false,
1167
+ "rstrip": false,
1168
+ "single_word": false,
1169
+ "special": true
1170
+ },
1171
+ "128146": {
1172
+ "content": "<|reserved_special_token_141|>",
1173
+ "lstrip": false,
1174
+ "normalized": false,
1175
+ "rstrip": false,
1176
+ "single_word": false,
1177
+ "special": true
1178
+ },
1179
+ "128147": {
1180
+ "content": "<|reserved_special_token_142|>",
1181
+ "lstrip": false,
1182
+ "normalized": false,
1183
+ "rstrip": false,
1184
+ "single_word": false,
1185
+ "special": true
1186
+ },
1187
+ "128148": {
1188
+ "content": "<|reserved_special_token_143|>",
1189
+ "lstrip": false,
1190
+ "normalized": false,
1191
+ "rstrip": false,
1192
+ "single_word": false,
1193
+ "special": true
1194
+ },
1195
+ "128149": {
1196
+ "content": "<|reserved_special_token_144|>",
1197
+ "lstrip": false,
1198
+ "normalized": false,
1199
+ "rstrip": false,
1200
+ "single_word": false,
1201
+ "special": true
1202
+ },
1203
+ "128150": {
1204
+ "content": "<|reserved_special_token_145|>",
1205
+ "lstrip": false,
1206
+ "normalized": false,
1207
+ "rstrip": false,
1208
+ "single_word": false,
1209
+ "special": true
1210
+ },
1211
+ "128151": {
1212
+ "content": "<|reserved_special_token_146|>",
1213
+ "lstrip": false,
1214
+ "normalized": false,
1215
+ "rstrip": false,
1216
+ "single_word": false,
1217
+ "special": true
1218
+ },
1219
+ "128152": {
1220
+ "content": "<|reserved_special_token_147|>",
1221
+ "lstrip": false,
1222
+ "normalized": false,
1223
+ "rstrip": false,
1224
+ "single_word": false,
1225
+ "special": true
1226
+ },
1227
+ "128153": {
1228
+ "content": "<|reserved_special_token_148|>",
1229
+ "lstrip": false,
1230
+ "normalized": false,
1231
+ "rstrip": false,
1232
+ "single_word": false,
1233
+ "special": true
1234
+ },
1235
+ "128154": {
1236
+ "content": "<|reserved_special_token_149|>",
1237
+ "lstrip": false,
1238
+ "normalized": false,
1239
+ "rstrip": false,
1240
+ "single_word": false,
1241
+ "special": true
1242
+ },
1243
+ "128155": {
1244
+ "content": "<|reserved_special_token_150|>",
1245
+ "lstrip": false,
1246
+ "normalized": false,
1247
+ "rstrip": false,
1248
+ "single_word": false,
1249
+ "special": true
1250
+ },
1251
+ "128156": {
1252
+ "content": "<|reserved_special_token_151|>",
1253
+ "lstrip": false,
1254
+ "normalized": false,
1255
+ "rstrip": false,
1256
+ "single_word": false,
1257
+ "special": true
1258
+ },
1259
+ "128157": {
1260
+ "content": "<|reserved_special_token_152|>",
1261
+ "lstrip": false,
1262
+ "normalized": false,
1263
+ "rstrip": false,
1264
+ "single_word": false,
1265
+ "special": true
1266
+ },
1267
+ "128158": {
1268
+ "content": "<|reserved_special_token_153|>",
1269
+ "lstrip": false,
1270
+ "normalized": false,
1271
+ "rstrip": false,
1272
+ "single_word": false,
1273
+ "special": true
1274
+ },
1275
+ "128159": {
1276
+ "content": "<|reserved_special_token_154|>",
1277
+ "lstrip": false,
1278
+ "normalized": false,
1279
+ "rstrip": false,
1280
+ "single_word": false,
1281
+ "special": true
1282
+ },
1283
+ "128160": {
1284
+ "content": "<|reserved_special_token_155|>",
1285
+ "lstrip": false,
1286
+ "normalized": false,
1287
+ "rstrip": false,
1288
+ "single_word": false,
1289
+ "special": true
1290
+ },
1291
+ "128161": {
1292
+ "content": "<|reserved_special_token_156|>",
1293
+ "lstrip": false,
1294
+ "normalized": false,
1295
+ "rstrip": false,
1296
+ "single_word": false,
1297
+ "special": true
1298
+ },
1299
+ "128162": {
1300
+ "content": "<|reserved_special_token_157|>",
1301
+ "lstrip": false,
1302
+ "normalized": false,
1303
+ "rstrip": false,
1304
+ "single_word": false,
1305
+ "special": true
1306
+ },
1307
+ "128163": {
1308
+ "content": "<|reserved_special_token_158|>",
1309
+ "lstrip": false,
1310
+ "normalized": false,
1311
+ "rstrip": false,
1312
+ "single_word": false,
1313
+ "special": true
1314
+ },
1315
+ "128164": {
1316
+ "content": "<|reserved_special_token_159|>",
1317
+ "lstrip": false,
1318
+ "normalized": false,
1319
+ "rstrip": false,
1320
+ "single_word": false,
1321
+ "special": true
1322
+ },
1323
+ "128165": {
1324
+ "content": "<|reserved_special_token_160|>",
1325
+ "lstrip": false,
1326
+ "normalized": false,
1327
+ "rstrip": false,
1328
+ "single_word": false,
1329
+ "special": true
1330
+ },
1331
+ "128166": {
1332
+ "content": "<|reserved_special_token_161|>",
1333
+ "lstrip": false,
1334
+ "normalized": false,
1335
+ "rstrip": false,
1336
+ "single_word": false,
1337
+ "special": true
1338
+ },
1339
+ "128167": {
1340
+ "content": "<|reserved_special_token_162|>",
1341
+ "lstrip": false,
1342
+ "normalized": false,
1343
+ "rstrip": false,
1344
+ "single_word": false,
1345
+ "special": true
1346
+ },
1347
+ "128168": {
1348
+ "content": "<|reserved_special_token_163|>",
1349
+ "lstrip": false,
1350
+ "normalized": false,
1351
+ "rstrip": false,
1352
+ "single_word": false,
1353
+ "special": true
1354
+ },
1355
+ "128169": {
1356
+ "content": "<|reserved_special_token_164|>",
1357
+ "lstrip": false,
1358
+ "normalized": false,
1359
+ "rstrip": false,
1360
+ "single_word": false,
1361
+ "special": true
1362
+ },
1363
+ "128170": {
1364
+ "content": "<|reserved_special_token_165|>",
1365
+ "lstrip": false,
1366
+ "normalized": false,
1367
+ "rstrip": false,
1368
+ "single_word": false,
1369
+ "special": true
1370
+ },
1371
+ "128171": {
1372
+ "content": "<|reserved_special_token_166|>",
1373
+ "lstrip": false,
1374
+ "normalized": false,
1375
+ "rstrip": false,
1376
+ "single_word": false,
1377
+ "special": true
1378
+ },
1379
+ "128172": {
1380
+ "content": "<|reserved_special_token_167|>",
1381
+ "lstrip": false,
1382
+ "normalized": false,
1383
+ "rstrip": false,
1384
+ "single_word": false,
1385
+ "special": true
1386
+ },
1387
+ "128173": {
1388
+ "content": "<|reserved_special_token_168|>",
1389
+ "lstrip": false,
1390
+ "normalized": false,
1391
+ "rstrip": false,
1392
+ "single_word": false,
1393
+ "special": true
1394
+ },
1395
+ "128174": {
1396
+ "content": "<|reserved_special_token_169|>",
1397
+ "lstrip": false,
1398
+ "normalized": false,
1399
+ "rstrip": false,
1400
+ "single_word": false,
1401
+ "special": true
1402
+ },
1403
+ "128175": {
1404
+ "content": "<|reserved_special_token_170|>",
1405
+ "lstrip": false,
1406
+ "normalized": false,
1407
+ "rstrip": false,
1408
+ "single_word": false,
1409
+ "special": true
1410
+ },
1411
+ "128176": {
1412
+ "content": "<|reserved_special_token_171|>",
1413
+ "lstrip": false,
1414
+ "normalized": false,
1415
+ "rstrip": false,
1416
+ "single_word": false,
1417
+ "special": true
1418
+ },
1419
+ "128177": {
1420
+ "content": "<|reserved_special_token_172|>",
1421
+ "lstrip": false,
1422
+ "normalized": false,
1423
+ "rstrip": false,
1424
+ "single_word": false,
1425
+ "special": true
1426
+ },
1427
+ "128178": {
1428
+ "content": "<|reserved_special_token_173|>",
1429
+ "lstrip": false,
1430
+ "normalized": false,
1431
+ "rstrip": false,
1432
+ "single_word": false,
1433
+ "special": true
1434
+ },
1435
+ "128179": {
1436
+ "content": "<|reserved_special_token_174|>",
1437
+ "lstrip": false,
1438
+ "normalized": false,
1439
+ "rstrip": false,
1440
+ "single_word": false,
1441
+ "special": true
1442
+ },
1443
+ "128180": {
1444
+ "content": "<|reserved_special_token_175|>",
1445
+ "lstrip": false,
1446
+ "normalized": false,
1447
+ "rstrip": false,
1448
+ "single_word": false,
1449
+ "special": true
1450
+ },
1451
+ "128181": {
1452
+ "content": "<|reserved_special_token_176|>",
1453
+ "lstrip": false,
1454
+ "normalized": false,
1455
+ "rstrip": false,
1456
+ "single_word": false,
1457
+ "special": true
1458
+ },
1459
+ "128182": {
1460
+ "content": "<|reserved_special_token_177|>",
1461
+ "lstrip": false,
1462
+ "normalized": false,
1463
+ "rstrip": false,
1464
+ "single_word": false,
1465
+ "special": true
1466
+ },
1467
+ "128183": {
1468
+ "content": "<|reserved_special_token_178|>",
1469
+ "lstrip": false,
1470
+ "normalized": false,
1471
+ "rstrip": false,
1472
+ "single_word": false,
1473
+ "special": true
1474
+ },
1475
+ "128184": {
1476
+ "content": "<|reserved_special_token_179|>",
1477
+ "lstrip": false,
1478
+ "normalized": false,
1479
+ "rstrip": false,
1480
+ "single_word": false,
1481
+ "special": true
1482
+ },
1483
+ "128185": {
1484
+ "content": "<|reserved_special_token_180|>",
1485
+ "lstrip": false,
1486
+ "normalized": false,
1487
+ "rstrip": false,
1488
+ "single_word": false,
1489
+ "special": true
1490
+ },
1491
+ "128186": {
1492
+ "content": "<|reserved_special_token_181|>",
1493
+ "lstrip": false,
1494
+ "normalized": false,
1495
+ "rstrip": false,
1496
+ "single_word": false,
1497
+ "special": true
1498
+ },
1499
+ "128187": {
1500
+ "content": "<|reserved_special_token_182|>",
1501
+ "lstrip": false,
1502
+ "normalized": false,
1503
+ "rstrip": false,
1504
+ "single_word": false,
1505
+ "special": true
1506
+ },
1507
+ "128188": {
1508
+ "content": "<|reserved_special_token_183|>",
1509
+ "lstrip": false,
1510
+ "normalized": false,
1511
+ "rstrip": false,
1512
+ "single_word": false,
1513
+ "special": true
1514
+ },
1515
+ "128189": {
1516
+ "content": "<|reserved_special_token_184|>",
1517
+ "lstrip": false,
1518
+ "normalized": false,
1519
+ "rstrip": false,
1520
+ "single_word": false,
1521
+ "special": true
1522
+ },
1523
+ "128190": {
1524
+ "content": "<|reserved_special_token_185|>",
1525
+ "lstrip": false,
1526
+ "normalized": false,
1527
+ "rstrip": false,
1528
+ "single_word": false,
1529
+ "special": true
1530
+ },
1531
+ "128191": {
1532
+ "content": "<|reserved_special_token_186|>",
1533
+ "lstrip": false,
1534
+ "normalized": false,
1535
+ "rstrip": false,
1536
+ "single_word": false,
1537
+ "special": true
1538
+ },
1539
+ "128192": {
1540
+ "content": "<|reserved_special_token_187|>",
1541
+ "lstrip": false,
1542
+ "normalized": false,
1543
+ "rstrip": false,
1544
+ "single_word": false,
1545
+ "special": true
1546
+ },
1547
+ "128193": {
1548
+ "content": "<|reserved_special_token_188|>",
1549
+ "lstrip": false,
1550
+ "normalized": false,
1551
+ "rstrip": false,
1552
+ "single_word": false,
1553
+ "special": true
1554
+ },
1555
+ "128194": {
1556
+ "content": "<|reserved_special_token_189|>",
1557
+ "lstrip": false,
1558
+ "normalized": false,
1559
+ "rstrip": false,
1560
+ "single_word": false,
1561
+ "special": true
1562
+ },
1563
+ "128195": {
1564
+ "content": "<|reserved_special_token_190|>",
1565
+ "lstrip": false,
1566
+ "normalized": false,
1567
+ "rstrip": false,
1568
+ "single_word": false,
1569
+ "special": true
1570
+ },
1571
+ "128196": {
1572
+ "content": "<|reserved_special_token_191|>",
1573
+ "lstrip": false,
1574
+ "normalized": false,
1575
+ "rstrip": false,
1576
+ "single_word": false,
1577
+ "special": true
1578
+ },
1579
+ "128197": {
1580
+ "content": "<|reserved_special_token_192|>",
1581
+ "lstrip": false,
1582
+ "normalized": false,
1583
+ "rstrip": false,
1584
+ "single_word": false,
1585
+ "special": true
1586
+ },
1587
+ "128198": {
1588
+ "content": "<|reserved_special_token_193|>",
1589
+ "lstrip": false,
1590
+ "normalized": false,
1591
+ "rstrip": false,
1592
+ "single_word": false,
1593
+ "special": true
1594
+ },
1595
+ "128199": {
1596
+ "content": "<|reserved_special_token_194|>",
1597
+ "lstrip": false,
1598
+ "normalized": false,
1599
+ "rstrip": false,
1600
+ "single_word": false,
1601
+ "special": true
1602
+ },
1603
+ "128200": {
1604
+ "content": "<|reserved_special_token_195|>",
1605
+ "lstrip": false,
1606
+ "normalized": false,
1607
+ "rstrip": false,
1608
+ "single_word": false,
1609
+ "special": true
1610
+ },
1611
+ "128201": {
1612
+ "content": "<|reserved_special_token_196|>",
1613
+ "lstrip": false,
1614
+ "normalized": false,
1615
+ "rstrip": false,
1616
+ "single_word": false,
1617
+ "special": true
1618
+ },
1619
+ "128202": {
1620
+ "content": "<|reserved_special_token_197|>",
1621
+ "lstrip": false,
1622
+ "normalized": false,
1623
+ "rstrip": false,
1624
+ "single_word": false,
1625
+ "special": true
1626
+ },
1627
+ "128203": {
1628
+ "content": "<|reserved_special_token_198|>",
1629
+ "lstrip": false,
1630
+ "normalized": false,
1631
+ "rstrip": false,
1632
+ "single_word": false,
1633
+ "special": true
1634
+ },
1635
+ "128204": {
1636
+ "content": "<|reserved_special_token_199|>",
1637
+ "lstrip": false,
1638
+ "normalized": false,
1639
+ "rstrip": false,
1640
+ "single_word": false,
1641
+ "special": true
1642
+ },
1643
+ "128205": {
1644
+ "content": "<|reserved_special_token_200|>",
1645
+ "lstrip": false,
1646
+ "normalized": false,
1647
+ "rstrip": false,
1648
+ "single_word": false,
1649
+ "special": true
1650
+ },
1651
+ "128206": {
1652
+ "content": "<|reserved_special_token_201|>",
1653
+ "lstrip": false,
1654
+ "normalized": false,
1655
+ "rstrip": false,
1656
+ "single_word": false,
1657
+ "special": true
1658
+ },
1659
+ "128207": {
1660
+ "content": "<|reserved_special_token_202|>",
1661
+ "lstrip": false,
1662
+ "normalized": false,
1663
+ "rstrip": false,
1664
+ "single_word": false,
1665
+ "special": true
1666
+ },
1667
+ "128208": {
1668
+ "content": "<|reserved_special_token_203|>",
1669
+ "lstrip": false,
1670
+ "normalized": false,
1671
+ "rstrip": false,
1672
+ "single_word": false,
1673
+ "special": true
1674
+ },
1675
+ "128209": {
1676
+ "content": "<|reserved_special_token_204|>",
1677
+ "lstrip": false,
1678
+ "normalized": false,
1679
+ "rstrip": false,
1680
+ "single_word": false,
1681
+ "special": true
1682
+ },
1683
+ "128210": {
1684
+ "content": "<|reserved_special_token_205|>",
1685
+ "lstrip": false,
1686
+ "normalized": false,
1687
+ "rstrip": false,
1688
+ "single_word": false,
1689
+ "special": true
1690
+ },
1691
+ "128211": {
1692
+ "content": "<|reserved_special_token_206|>",
1693
+ "lstrip": false,
1694
+ "normalized": false,
1695
+ "rstrip": false,
1696
+ "single_word": false,
1697
+ "special": true
1698
+ },
1699
+ "128212": {
1700
+ "content": "<|reserved_special_token_207|>",
1701
+ "lstrip": false,
1702
+ "normalized": false,
1703
+ "rstrip": false,
1704
+ "single_word": false,
1705
+ "special": true
1706
+ },
1707
+ "128213": {
1708
+ "content": "<|reserved_special_token_208|>",
1709
+ "lstrip": false,
1710
+ "normalized": false,
1711
+ "rstrip": false,
1712
+ "single_word": false,
1713
+ "special": true
1714
+ },
1715
+ "128214": {
1716
+ "content": "<|reserved_special_token_209|>",
1717
+ "lstrip": false,
1718
+ "normalized": false,
1719
+ "rstrip": false,
1720
+ "single_word": false,
1721
+ "special": true
1722
+ },
1723
+ "128215": {
1724
+ "content": "<|reserved_special_token_210|>",
1725
+ "lstrip": false,
1726
+ "normalized": false,
1727
+ "rstrip": false,
1728
+ "single_word": false,
1729
+ "special": true
1730
+ },
1731
+ "128216": {
1732
+ "content": "<|reserved_special_token_211|>",
1733
+ "lstrip": false,
1734
+ "normalized": false,
1735
+ "rstrip": false,
1736
+ "single_word": false,
1737
+ "special": true
1738
+ },
1739
+ "128217": {
1740
+ "content": "<|reserved_special_token_212|>",
1741
+ "lstrip": false,
1742
+ "normalized": false,
1743
+ "rstrip": false,
1744
+ "single_word": false,
1745
+ "special": true
1746
+ },
1747
+ "128218": {
1748
+ "content": "<|reserved_special_token_213|>",
1749
+ "lstrip": false,
1750
+ "normalized": false,
1751
+ "rstrip": false,
1752
+ "single_word": false,
1753
+ "special": true
1754
+ },
1755
+ "128219": {
1756
+ "content": "<|reserved_special_token_214|>",
1757
+ "lstrip": false,
1758
+ "normalized": false,
1759
+ "rstrip": false,
1760
+ "single_word": false,
1761
+ "special": true
1762
+ },
1763
+ "128220": {
1764
+ "content": "<|reserved_special_token_215|>",
1765
+ "lstrip": false,
1766
+ "normalized": false,
1767
+ "rstrip": false,
1768
+ "single_word": false,
1769
+ "special": true
1770
+ },
1771
+ "128221": {
1772
+ "content": "<|reserved_special_token_216|>",
1773
+ "lstrip": false,
1774
+ "normalized": false,
1775
+ "rstrip": false,
1776
+ "single_word": false,
1777
+ "special": true
1778
+ },
1779
+ "128222": {
1780
+ "content": "<|reserved_special_token_217|>",
1781
+ "lstrip": false,
1782
+ "normalized": false,
1783
+ "rstrip": false,
1784
+ "single_word": false,
1785
+ "special": true
1786
+ },
1787
+ "128223": {
1788
+ "content": "<|reserved_special_token_218|>",
1789
+ "lstrip": false,
1790
+ "normalized": false,
1791
+ "rstrip": false,
1792
+ "single_word": false,
1793
+ "special": true
1794
+ },
1795
+ "128224": {
1796
+ "content": "<|reserved_special_token_219|>",
1797
+ "lstrip": false,
1798
+ "normalized": false,
1799
+ "rstrip": false,
1800
+ "single_word": false,
1801
+ "special": true
1802
+ },
1803
+ "128225": {
1804
+ "content": "<|reserved_special_token_220|>",
1805
+ "lstrip": false,
1806
+ "normalized": false,
1807
+ "rstrip": false,
1808
+ "single_word": false,
1809
+ "special": true
1810
+ },
1811
+ "128226": {
1812
+ "content": "<|reserved_special_token_221|>",
1813
+ "lstrip": false,
1814
+ "normalized": false,
1815
+ "rstrip": false,
1816
+ "single_word": false,
1817
+ "special": true
1818
+ },
1819
+ "128227": {
1820
+ "content": "<|reserved_special_token_222|>",
1821
+ "lstrip": false,
1822
+ "normalized": false,
1823
+ "rstrip": false,
1824
+ "single_word": false,
1825
+ "special": true
1826
+ },
1827
+ "128228": {
1828
+ "content": "<|reserved_special_token_223|>",
1829
+ "lstrip": false,
1830
+ "normalized": false,
1831
+ "rstrip": false,
1832
+ "single_word": false,
1833
+ "special": true
1834
+ },
1835
+ "128229": {
1836
+ "content": "<|reserved_special_token_224|>",
1837
+ "lstrip": false,
1838
+ "normalized": false,
1839
+ "rstrip": false,
1840
+ "single_word": false,
1841
+ "special": true
1842
+ },
1843
+ "128230": {
1844
+ "content": "<|reserved_special_token_225|>",
1845
+ "lstrip": false,
1846
+ "normalized": false,
1847
+ "rstrip": false,
1848
+ "single_word": false,
1849
+ "special": true
1850
+ },
1851
+ "128231": {
1852
+ "content": "<|reserved_special_token_226|>",
1853
+ "lstrip": false,
1854
+ "normalized": false,
1855
+ "rstrip": false,
1856
+ "single_word": false,
1857
+ "special": true
1858
+ },
1859
+ "128232": {
1860
+ "content": "<|reserved_special_token_227|>",
1861
+ "lstrip": false,
1862
+ "normalized": false,
1863
+ "rstrip": false,
1864
+ "single_word": false,
1865
+ "special": true
1866
+ },
1867
+ "128233": {
1868
+ "content": "<|reserved_special_token_228|>",
1869
+ "lstrip": false,
1870
+ "normalized": false,
1871
+ "rstrip": false,
1872
+ "single_word": false,
1873
+ "special": true
1874
+ },
1875
+ "128234": {
1876
+ "content": "<|reserved_special_token_229|>",
1877
+ "lstrip": false,
1878
+ "normalized": false,
1879
+ "rstrip": false,
1880
+ "single_word": false,
1881
+ "special": true
1882
+ },
1883
+ "128235": {
1884
+ "content": "<|reserved_special_token_230|>",
1885
+ "lstrip": false,
1886
+ "normalized": false,
1887
+ "rstrip": false,
1888
+ "single_word": false,
1889
+ "special": true
1890
+ },
1891
+ "128236": {
1892
+ "content": "<|reserved_special_token_231|>",
1893
+ "lstrip": false,
1894
+ "normalized": false,
1895
+ "rstrip": false,
1896
+ "single_word": false,
1897
+ "special": true
1898
+ },
1899
+ "128237": {
1900
+ "content": "<|reserved_special_token_232|>",
1901
+ "lstrip": false,
1902
+ "normalized": false,
1903
+ "rstrip": false,
1904
+ "single_word": false,
1905
+ "special": true
1906
+ },
1907
+ "128238": {
1908
+ "content": "<|reserved_special_token_233|>",
1909
+ "lstrip": false,
1910
+ "normalized": false,
1911
+ "rstrip": false,
1912
+ "single_word": false,
1913
+ "special": true
1914
+ },
1915
+ "128239": {
1916
+ "content": "<|reserved_special_token_234|>",
1917
+ "lstrip": false,
1918
+ "normalized": false,
1919
+ "rstrip": false,
1920
+ "single_word": false,
1921
+ "special": true
1922
+ },
1923
+ "128240": {
1924
+ "content": "<|reserved_special_token_235|>",
1925
+ "lstrip": false,
1926
+ "normalized": false,
1927
+ "rstrip": false,
1928
+ "single_word": false,
1929
+ "special": true
1930
+ },
1931
+ "128241": {
1932
+ "content": "<|reserved_special_token_236|>",
1933
+ "lstrip": false,
1934
+ "normalized": false,
1935
+ "rstrip": false,
1936
+ "single_word": false,
1937
+ "special": true
1938
+ },
1939
+ "128242": {
1940
+ "content": "<|reserved_special_token_237|>",
1941
+ "lstrip": false,
1942
+ "normalized": false,
1943
+ "rstrip": false,
1944
+ "single_word": false,
1945
+ "special": true
1946
+ },
1947
+ "128243": {
1948
+ "content": "<|reserved_special_token_238|>",
1949
+ "lstrip": false,
1950
+ "normalized": false,
1951
+ "rstrip": false,
1952
+ "single_word": false,
1953
+ "special": true
1954
+ },
1955
+ "128244": {
1956
+ "content": "<|reserved_special_token_239|>",
1957
+ "lstrip": false,
1958
+ "normalized": false,
1959
+ "rstrip": false,
1960
+ "single_word": false,
1961
+ "special": true
1962
+ },
1963
+ "128245": {
1964
+ "content": "<|reserved_special_token_240|>",
1965
+ "lstrip": false,
1966
+ "normalized": false,
1967
+ "rstrip": false,
1968
+ "single_word": false,
1969
+ "special": true
1970
+ },
1971
+ "128246": {
1972
+ "content": "<|reserved_special_token_241|>",
1973
+ "lstrip": false,
1974
+ "normalized": false,
1975
+ "rstrip": false,
1976
+ "single_word": false,
1977
+ "special": true
1978
+ },
1979
+ "128247": {
1980
+ "content": "<|reserved_special_token_242|>",
1981
+ "lstrip": false,
1982
+ "normalized": false,
1983
+ "rstrip": false,
1984
+ "single_word": false,
1985
+ "special": true
1986
+ },
1987
+ "128248": {
1988
+ "content": "<|reserved_special_token_243|>",
1989
+ "lstrip": false,
1990
+ "normalized": false,
1991
+ "rstrip": false,
1992
+ "single_word": false,
1993
+ "special": true
1994
+ },
1995
+ "128249": {
1996
+ "content": "<|reserved_special_token_244|>",
1997
+ "lstrip": false,
1998
+ "normalized": false,
1999
+ "rstrip": false,
2000
+ "single_word": false,
2001
+ "special": true
2002
+ },
2003
+ "128250": {
2004
+ "content": "<|reserved_special_token_245|>",
2005
+ "lstrip": false,
2006
+ "normalized": false,
2007
+ "rstrip": false,
2008
+ "single_word": false,
2009
+ "special": true
2010
+ },
2011
+ "128251": {
2012
+ "content": "<|reserved_special_token_246|>",
2013
+ "lstrip": false,
2014
+ "normalized": false,
2015
+ "rstrip": false,
2016
+ "single_word": false,
2017
+ "special": true
2018
+ },
2019
+ "128252": {
2020
+ "content": "<|reserved_special_token_247|>",
2021
+ "lstrip": false,
2022
+ "normalized": false,
2023
+ "rstrip": false,
2024
+ "single_word": false,
2025
+ "special": true
2026
+ },
2027
+ "128253": {
2028
+ "content": "<|reserved_special_token_248|>",
2029
+ "lstrip": false,
2030
+ "normalized": false,
2031
+ "rstrip": false,
2032
+ "single_word": false,
2033
+ "special": true
2034
+ },
2035
+ "128254": {
2036
+ "content": "<|reserved_special_token_249|>",
2037
+ "lstrip": false,
2038
+ "normalized": false,
2039
+ "rstrip": false,
2040
+ "single_word": false,
2041
+ "special": true
2042
+ },
2043
+ "128255": {
2044
+ "content": "<|reserved_special_token_250|>",
2045
+ "lstrip": false,
2046
+ "normalized": false,
2047
+ "rstrip": false,
2048
+ "single_word": false,
2049
+ "special": true
2050
+ }
2051
+ },
2052
+ "bos_token": "<|begin_of_text|>",
2053
+ "chat_template": "{% set loop_messages = messages %}{% for message in loop_messages %}{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] | trim + '<|eot_id|>' %}{% if loop.index0 == 0 %}{% set content = bos_token + content %}{% endif %}{{ content }}{% endfor %}{% if add_generation_prompt %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}{% endif %}",
2054
+ "clean_up_tokenization_spaces": true,
2055
+ "eos_token": "<|end_of_text|>",
2056
+ "model_input_names": [
2057
+ "input_ids",
2058
+ "attention_mask"
2059
+ ],
2060
+ "model_max_length": 3192,
2061
+ "pad_token": "<|end_of_text|>",
2062
+ "padding_side": "right",
2063
+ "tokenizer_class": "PreTrainedTokenizerFast"
2064
+ }