CultriX committed on
Commit 76e3315
1 Parent(s): 9965c85

Upload folder using huggingface_hub

README.md ADDED
@@ -0,0 +1,257 @@
---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- mlabonne/NeuralBeagle14-7B
- fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser
- mlabonne/Marcoro14-7B-slerp
base_model:
- mlabonne/NeuralBeagle14-7B
- fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser
- mlabonne/Marcoro14-7B-slerp
---

# CultriX-MoE-BF16

CultriX-MoE-BF16 is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)
* [fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser)
* [mlabonne/Marcoro14-7B-slerp](https://huggingface.co/mlabonne/Marcoro14-7B-slerp)

## 🧩 Configuration

```yaml
base_model: "EmbeddedLLM/Mistral-7B-Merge-14-v0.2"
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: "mlabonne/NeuralBeagle14-7B"
    positive_prompts:
      - "Create a story based on"
      - "Debate the topic of"
      - "Come up with some arguments"
      - "Provide me with instructions on"
      - "Interpret the sentiment"
      - "Interpret and execute these cooking instructions"
      - "Craft a persuasive argument"
      - "Analyze the motivations"
      - "Construct a detailed plan for"
      - "Narrate an event from multiple perspectives."
      - "Formulate a response"
      - "Write a script for a short play"
      - "Generate a sequence of instructions to teach a skill."
      - "Solve this riddle"
      - "Create an engaging story"
      - "Write a fictional"
      - "Propose a solution to a social issue"
      - "Develop a dialogue"
      - "Create a step-by-step guide"
      - "Devise a strategy"
      - "Write a narrative"
      - "Tell me how to"
      - "Explain the concept of"
      - "Give an overview of"
      - "Compare and contrast between"
      - "Provide information about"
      - "Help me understand"
      - "Summarize"
      - "Make a recommendation on"
      - "Answer this question"
      - "How do you approach"
      - "Explain the concept of"
      - "Give an overview of"
      - "Provide information about"
      - "Help me understand the principles of"
      - "Summarize the key components of"
      - "Make a recommendation on how to"
      - "Answer this question:"
    negative_prompts:
      - "Provide in-depth information about quantum computing."
      - "Explain the inner workings of an internal combustion engine."
      - "Give a detailed tutorial on advanced calculus."
      - "Summarize the latest research in genetic engineering."
      - "Interpret financial markets and stock trends."
      - "Analyze the chemical composition of"
      - "Develop a blueprint for."
      - "Offer a critique of a modern art piece."
      - "Provide a technical review of"
      - "Conduct a linguistic analysis of an ancient language."
      - "Write a user manual for advanced medical equipment."
      - "Give a step-by-step guide on piloting an aircraft."
      - "Conduct an in-depth analysis of this code"
      - "Explain the physics behind black holes."
      - "Provide a strategy for managing a cyber attack"
      - "Develop an algorithm for predictive analytics in finance."
      - "Provide information about advanced programming algorithms."
      - "Help me understand the details of this code"
      - "Summarize the process of cellular respiration."
      - "Improve the security of"
      - "What are the latest advancements in artificial intelligence?"
      - "Provide detailed technical coding solutions."
      - "Analyze complex scientific data and statistics."
      - "Offer medical diagnoses based on symptoms."
      - "Conduct a detailed financial audit of a company."
      - "Perform real-time translation of multiple languages."
      - "Create high-resolution graphic designs."
      - "Develop complex mathematical proofs."
      - "Offer legal advice on specific cases."
      - "Write a detailed manual on advanced mechanical engineering."
      - "Conduct an in-depth psychological assessment."
      - "Perform a security analysis of a computer network."
      - "Compose an original piece of music."
      - "Plan and execute a scientific experiment."
      - "Provide professional career counseling."
      - "Develop a complex database management system."
      - "Write a software program for data analysis."
      - "Give expert advice on cyber"
      - "Conduct a pentesting security audit"
  - source_model: "fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser"
    positive_prompts:
      - "Provide step-by-step coding instructions for..."
      - "Draft a function with detailed steps in [language]"
      - "Guide me through coding a simple [type of application or script]"
      - "Recommend best practices for code implementation in [context]"
      - "Generate a regex pattern for extracting [specific data]"
      - "Create a regex for matching [pattern]"
      - "Explain the purpose of this regex pattern"
      - "Compose regex for [specific use case]"
      - "Annotate this code with detailed comments for each line"
      - "Add explanatory comments to this script"
      - "Comment on each part of this code for clarity"
      - "Develop a script to [accomplish task]"
      - "Design a database schema for [specific use case]"
      - "Outline secure methods for [specific operation]"
      - "Guide on optimizing [specific aspect] in this code"
      - "Refactor this code for better readability and efficiency"
      - "Compare and contrast these code snippets"
      - "Identify the programming language of this snippet"
      - "Demonstrate the usage of [specific tool/library/API]"
      - "Show implementation steps for this [feature/concept]"
      - "Teach how to use [specific tool/library/framework]"
      - "Generate a README file for this project"
      - "Create a manual page for [specific tool/command]"
      - "Produce comprehensive documentation for this code"
      - "Build detailed documentation for [specific module]"
      - "Explain the underlying concept of this code snippet"
      - "Propose enhancements for this script"
      - "Suggest improvements for this API call integration"
      - "Diagnose and solve this coding issue"
      - "Demonstrate robust error handling in this code"
      - "Debug and resolve issues in this script"
      - "Design a user-friendly GUI for this script's functionality"
      - "Detail the deployment process for this application"
      - "Deploy an app designed to [perform function]"
      - "Set up a web service for [specific purpose]"
      - "Develop a website with [specific features]"
      - "Craft a webpage showcasing [specific content]"
      - "Illustrate data flow in this code architecture"
      - "Convert this code from [language A] to [language B]"
      - "Translate this script into [different programming language]"
      - "Explain resource management techniques in [context]"
      - "Build a basic API endpoint for [functionality]"
      - "Strategies to enhance scalability in [context]"
      - "Conduct a security review for this code"
      - "Enhance security measures in [application/module]"
      - "Set up a development environment for [language/framework]"
      - "Visualize data from [specific dataset]"
      - "Generate a dataset for [specific use case]"
      - "Scripting guide for automating [task/process]"
      - "Utilize this code for [specific purpose]"
      - "Principles of object-oriented programming in [language]"
      - "Create a mobile-responsive layout for this web app"
      - "Explain the debugging process for this code"
      - "Compose code to accomplish [task]"
      - "Guidance on writing code for [specific purpose]"
      - "I need a script for [specific function]"
      - "Clarify the functionality of this code"
      - "What is the purpose of this code segment?"
      - "Enhance this code for [specific improvement]"
      - "Develop a program that [solves problem]"
      - "Code needed for [specific task]"
      - "Program a solution for [problem statement]"
      - "Enhance this function's performance by..."
      - "Refactor code for better readability in [context]"
      - "Craft a custom function for [specific requirement]"
      - "Reduce computational complexity in this algorithm by..."
      - "Extend the codebase to include [new feature]"
      - "Incorporate this API into an existing application"
      - "Assist in troubleshooting and bug fixing for [issue]"
      - "Review and prep this code for deployment"
      - "Analyze error logs for potential issues in [context]"
      - "Create unit tests for [module/component]"
      - "Evaluate methodologies for [problem-solving]"
      - "Research [topic] online"
      - "Utilize the [plugin/tool] to achieve [result]"
      - "Design an efficient search algorithm for [data type]"
      - "Create a web crawler for [specific data extraction]"
      - "Application of web sockets in [real-time scenario]"
      - "Guide to integrating a third-party library in [framework]"
      - "Best practices in API design for [application type]"
    negative_prompts:
      - "Provide a detailed analysis of historical events."
      - "Give medical advice for treating a specific illness."
      - "Write a comprehensive review of a novel."
      - "Explain legal implications of a contract."
      - "Develop a marketing strategy for a new product."
      - "Offer financial advice for stock investments."
      - "Create a recipe for a gourmet dish."
      - "Teach a foreign language lesson."
      - "Compose a symphony or musical piece."
      - "Provide workout plans and fitness coaching."
      - "Conduct a psychological analysis of a character."
      - "Write a script for a movie or play."
      - "Design a blueprint for architectural structures."
      - "Give a tutorial on how to paint a landscape."
      - "Explain quantum physics theories."
      - "Offer career counseling and resume writing tips."
      - "Teach how to repair a car engine."
      - "Plan a travel itinerary for a world tour."
      - "Guide on how to grow organic vegetables."
      - "Discuss political strategies for an election campaign."
  - source_model: "mlabonne/Marcoro14-7B-slerp"
    positive_prompts:
      - "Generate a creative story based on these keywords."
      - "Explain a complex topic in simple terms"
      - "Provide a detailed summary of"
      - "Answer this question with factual accuracy"
      - "Explain the historical significance of"
      - "Provide a truthful and detailed account of"
      - "Develop a strategy for solving a practical problem."
      - "Explain the reasoning behind"
      - "Provide an analysis of a moral dilemma with possible solutions."
    negative_prompts:
      - "mathematical problem-solving."
      - "scientific theory explanations."
      - "high-level abstract reasoning tasks."
      - "professional advice in specialized fields like law or medicine."
      - "provide me with a coding solution for"
      - "Academic research"
```
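For intuition, here is a minimal, illustrative sketch of what the `hidden` gate mode amounts to at inference time: every MoE layer gains a small router that scores each token's hidden state against the three experts and keeps the top two, matching `num_local_experts: 3` and `num_experts_per_tok: 2` in the `config.json` added below. This is not mergekit's or transformers' actual code; the shapes and variable names are assumptions for illustration only.

```python
import torch

# Illustrative only: a Mixtral-style router picking 2 of the 3 experts for one token.
hidden_size, num_experts, top_k = 4096, 3, 2

gate = torch.nn.Linear(hidden_size, num_experts, bias=False)  # per-layer router; mergekit derives its weights from the prompts above
hidden_state = torch.randn(1, hidden_size)                    # hidden state of a single token

router_logits = gate(hidden_state)                            # shape (1, 3): one score per expert
routing_weights = torch.softmax(router_logits, dim=-1)
weights, chosen = torch.topk(routing_weights, top_k, dim=-1)  # keep the two best-scoring experts
weights = weights / weights.sum(dim=-1, keepdim=True)         # renormalize over the chosen experts

print(chosen.tolist(), weights.tolist())                      # which experts handle this token, and their mixing weights
```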

## 💻 Usage

```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "CultriX/CultriX-MoE-BF16"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
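The pipeline above quantizes the model to 4-bit so it fits on a single consumer GPU. If you have room for the native bfloat16 weights (roughly 37 GB across the four shards below), a plain `AutoModelForCausalLM` load should also work; the following is an untested sketch using standard transformers APIs:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CultriX/CultriX-MoE-BF16"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # keep the native bf16 weights instead of 4-bit quantization
    device_map="auto",           # requires accelerate; spreads layers across available devices
)

messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```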
config.json ADDED
@@ -0,0 +1,30 @@
{
  "_name_or_path": "EmbeddedLLM/Mistral-7B-Merge-14-v0.2",
  "architectures": [
    "MixtralForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 14336,
  "max_position_embeddings": 32768,
  "model_type": "mixtral",
  "num_attention_heads": 32,
  "num_experts_per_tok": 2,
  "num_hidden_layers": 32,
  "num_key_value_heads": 8,
  "num_local_experts": 3,
  "output_router_logits": false,
  "rms_norm_eps": 1e-05,
  "rope_theta": 10000.0,
  "router_aux_loss_coef": 0.001,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.36.2",
  "use_cache": true,
  "vocab_size": 32000
}
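As a quick sanity check, the routing parameters declared in this config can be read back with the standard transformers config API; a small sketch (assumes `transformers` is installed and the Hub repo is reachable):

```python
from transformers import AutoConfig

# Load the config.json shown above straight from the Hub and inspect the MoE settings.
cfg = AutoConfig.from_pretrained("CultriX/CultriX-MoE-BF16")
print(cfg.model_type)           # "mixtral"
print(cfg.num_local_experts)    # 3 experts, one per source model
print(cfg.num_experts_per_tok)  # 2 experts routed per token
```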
mergekit_moe_config.yml ADDED
@@ -0,0 +1,206 @@

base_model: "EmbeddedLLM/Mistral-7B-Merge-14-v0.2"
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: "mlabonne/NeuralBeagle14-7B"
    positive_prompts:
      - "Create a story based on"
      - "Debate the topic of"
      - "Come up with some arguments"
      - "Provide me with instructions on"
      - "Interpret the sentiment"
      - "Interpret and execute these cooking instructions"
      - "Craft a persuasive argument"
      - "Analyze the motivations"
      - "Construct a detailed plan for"
      - "Narrate an event from multiple perspectives."
      - "Formulate a response"
      - "Write a script for a short play"
      - "Generate a sequence of instructions to teach a skill."
      - "Solve this riddle"
      - "Create an engaging story"
      - "Write a fictional"
      - "Propose a solution to a social issue"
      - "Develop a dialogue"
      - "Create a step-by-step guide"
      - "Devise a strategy"
      - "Write a narrative"
      - "Tell me how to"
      - "Explain the concept of"
      - "Give an overview of"
      - "Compare and contrast between"
      - "Provide information about"
      - "Help me understand"
      - "Summarize"
      - "Make a recommendation on"
      - "Answer this question"
      - "How do you approach"
      - "Explain the concept of"
      - "Give an overview of"
      - "Provide information about"
      - "Help me understand the principles of"
      - "Summarize the key components of"
      - "Make a recommendation on how to"
      - "Answer this question:"
    negative_prompts:
      - "Provide in-depth information about quantum computing."
      - "Explain the inner workings of an internal combustion engine."
      - "Give a detailed tutorial on advanced calculus."
      - "Summarize the latest research in genetic engineering."
      - "Interpret financial markets and stock trends."
      - "Analyze the chemical composition of"
      - "Develop a blueprint for."
      - "Offer a critique of a modern art piece."
      - "Provide a technical review of"
      - "Conduct a linguistic analysis of an ancient language."
      - "Write a user manual for advanced medical equipment."
      - "Give a step-by-step guide on piloting an aircraft."
      - "Conduct an in-depth analysis of this code"
      - "Explain the physics behind black holes."
      - "Provide a strategy for managing a cyber attack"
      - "Develop an algorithm for predictive analytics in finance."
      - "Provide information about advanced programming algorithms."
      - "Help me understand the details of this code"
      - "Summarize the process of cellular respiration."
      - "Improve the security of"
      - "What are the latest advancements in artificial intelligence?"
      - "Provide detailed technical coding solutions."
      - "Analyze complex scientific data and statistics."
      - "Offer medical diagnoses based on symptoms."
      - "Conduct a detailed financial audit of a company."
      - "Perform real-time translation of multiple languages."
      - "Create high-resolution graphic designs."
      - "Develop complex mathematical proofs."
      - "Offer legal advice on specific cases."
      - "Write a detailed manual on advanced mechanical engineering."
      - "Conduct an in-depth psychological assessment."
      - "Perform a security analysis of a computer network."
      - "Compose an original piece of music."
      - "Plan and execute a scientific experiment."
      - "Provide professional career counseling."
      - "Develop a complex database management system."
      - "Write a software program for data analysis."
      - "Give expert advice on cyber"
      - "Conduct a pentesting security audit"
  - source_model: "fblgit/UNA-dolphin-2.6-mistral-7b-dpo-laser"
    positive_prompts:
      - "Provide step-by-step coding instructions for..."
      - "Draft a function with detailed steps in [language]"
      - "Guide me through coding a simple [type of application or script]"
      - "Recommend best practices for code implementation in [context]"
      - "Generate a regex pattern for extracting [specific data]"
      - "Create a regex for matching [pattern]"
      - "Explain the purpose of this regex pattern"
      - "Compose regex for [specific use case]"
      - "Annotate this code with detailed comments for each line"
      - "Add explanatory comments to this script"
      - "Comment on each part of this code for clarity"
      - "Develop a script to [accomplish task]"
      - "Design a database schema for [specific use case]"
      - "Outline secure methods for [specific operation]"
      - "Guide on optimizing [specific aspect] in this code"
      - "Refactor this code for better readability and efficiency"
      - "Compare and contrast these code snippets"
      - "Identify the programming language of this snippet"
      - "Demonstrate the usage of [specific tool/library/API]"
      - "Show implementation steps for this [feature/concept]"
      - "Teach how to use [specific tool/library/framework]"
      - "Generate a README file for this project"
      - "Create a manual page for [specific tool/command]"
      - "Produce comprehensive documentation for this code"
      - "Build detailed documentation for [specific module]"
      - "Explain the underlying concept of this code snippet"
      - "Propose enhancements for this script"
      - "Suggest improvements for this API call integration"
      - "Diagnose and solve this coding issue"
      - "Demonstrate robust error handling in this code"
      - "Debug and resolve issues in this script"
      - "Design a user-friendly GUI for this script's functionality"
      - "Detail the deployment process for this application"
      - "Deploy an app designed to [perform function]"
      - "Set up a web service for [specific purpose]"
      - "Develop a website with [specific features]"
      - "Craft a webpage showcasing [specific content]"
      - "Illustrate data flow in this code architecture"
      - "Convert this code from [language A] to [language B]"
      - "Translate this script into [different programming language]"
      - "Explain resource management techniques in [context]"
      - "Build a basic API endpoint for [functionality]"
      - "Strategies to enhance scalability in [context]"
      - "Conduct a security review for this code"
      - "Enhance security measures in [application/module]"
      - "Set up a development environment for [language/framework]"
      - "Visualize data from [specific dataset]"
      - "Generate a dataset for [specific use case]"
      - "Scripting guide for automating [task/process]"
      - "Utilize this code for [specific purpose]"
      - "Principles of object-oriented programming in [language]"
      - "Create a mobile-responsive layout for this web app"
      - "Explain the debugging process for this code"
      - "Compose code to accomplish [task]"
      - "Guidance on writing code for [specific purpose]"
      - "I need a script for [specific function]"
      - "Clarify the functionality of this code"
      - "What is the purpose of this code segment?"
      - "Enhance this code for [specific improvement]"
      - "Develop a program that [solves problem]"
      - "Code needed for [specific task]"
      - "Program a solution for [problem statement]"
      - "Enhance this function's performance by..."
      - "Refactor code for better readability in [context]"
      - "Craft a custom function for [specific requirement]"
      - "Reduce computational complexity in this algorithm by..."
      - "Extend the codebase to include [new feature]"
      - "Incorporate this API into an existing application"
      - "Assist in troubleshooting and bug fixing for [issue]"
      - "Review and prep this code for deployment"
      - "Analyze error logs for potential issues in [context]"
      - "Create unit tests for [module/component]"
      - "Evaluate methodologies for [problem-solving]"
      - "Research [topic] online"
      - "Utilize the [plugin/tool] to achieve [result]"
      - "Design an efficient search algorithm for [data type]"
      - "Create a web crawler for [specific data extraction]"
      - "Application of web sockets in [real-time scenario]"
      - "Guide to integrating a third-party library in [framework]"
      - "Best practices in API design for [application type]"
    negative_prompts:
      - "Provide a detailed analysis of historical events."
      - "Give medical advice for treating a specific illness."
      - "Write a comprehensive review of a novel."
      - "Explain legal implications of a contract."
      - "Develop a marketing strategy for a new product."
      - "Offer financial advice for stock investments."
      - "Create a recipe for a gourmet dish."
      - "Teach a foreign language lesson."
      - "Compose a symphony or musical piece."
      - "Provide workout plans and fitness coaching."
      - "Conduct a psychological analysis of a character."
      - "Write a script for a movie or play."
      - "Design a blueprint for architectural structures."
      - "Give a tutorial on how to paint a landscape."
      - "Explain quantum physics theories."
      - "Offer career counseling and resume writing tips."
      - "Teach how to repair a car engine."
      - "Plan a travel itinerary for a world tour."
      - "Guide on how to grow organic vegetables."
      - "Discuss political strategies for an election campaign."
  - source_model: "mlabonne/Marcoro14-7B-slerp"
    positive_prompts:
      - "Generate a creative story based on these keywords."
      - "Explain a complex topic in simple terms"
      - "Provide a detailed summary of"
      - "Answer this question with factual accuracy"
      - "Explain the historical significance of"
      - "Provide a truthful and detailed account of"
      - "Develop a strategy for solving a practical problem."
      - "Explain the reasoning behind"
      - "Provide an analysis of a moral dilemma with possible solutions."
    negative_prompts:
      - "mathematical problem-solving."
      - "scientific theory explanations."
      - "high-level abstract reasoning tasks."
      - "professional advice in specialized fields like law or medicine."
      - "provide me with a coding solution for"
      - "Academic research"
model-00001-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f1176d98768b8d0df72bbbba8e812118074f4e4ac3788a73855475b1fc08b8ae
size 9919813712
model-00002-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e78e25907148f0f76f6efaafa17727e5a46420b60af592f27ffdf326a84f0e75
size 9982454728
model-00003-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2673b8be00f990c778a34542dc0fae6477dde4b6c060f9d9ee17735c3da00b8b
size 9982454728
model-00004-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:afbc1d9a1cd897bd0d03d61e30d647c44459046b781740b8673830fcef4bf9b9
size 7148170232
model.safetensors.index.json ADDED
@@ -0,0 +1 @@
+ {"metadata": {"mergekit_version": "0.0.3.2"}, "weight_map": {"model.embed_tokens.weight": "model-00001-of-00004.safetensors", "model.norm.weight": "model-00001-of-00004.safetensors", "lm_head.weight": "model-00001-of-00004.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.1.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.2.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.3.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.4.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.5.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.6.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.7.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.8.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.9.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.10.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.11.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.12.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.13.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.14.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.15.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.16.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.17.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.18.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.19.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.20.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.21.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.22.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.23.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.24.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.25.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.26.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.27.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.28.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.29.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.30.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.31.input_layernorm.weight": "model-00001-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", 
"model.layers.3.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.2.w3.weight": 
"model-00001-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00004.safetensors", 
"model.layers.28.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.1.w2.weight": 
"model-00002-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00004.safetensors", 
"model.layers.22.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.0.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.1.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.2.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.0.w1.weight": 
"model-00003-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.3.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.4.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.5.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.6.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.7.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.8.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.9.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.10.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.11.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.12.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.13.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.14.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.15.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", 
"model.layers.16.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.16.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.17.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.1.w1.weight": "model-00003-of-00004.safetensors", "model.layers.18.block_sparse_moe.experts.2.w1.weight": "model-00003-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.0.w1.weight": "model-00003-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.19.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.20.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.21.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.22.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.23.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.24.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.25.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.26.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.27.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.28.block_sparse_moe.experts.2.w1.weight": 
"model-00004-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.29.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.30.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00004.safetensors", "model.layers.31.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00004.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.9.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.17.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.18.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.28.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.29.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.30.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", 
"model.layers.31.post_attention_layernorm.weight": "model-00004-of-00004.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00004-of-00004.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", 
"model.layers.12.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00004-of-00004.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", 
"model.layers.25.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00004-of-00004.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.13.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00004-of-00004.safetensors", "model.layers.0.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.1.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.2.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.3.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.4.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.5.block_sparse_moe.gate.weight": 
"model-00004-of-00004.safetensors", "model.layers.6.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.7.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.8.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.9.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.10.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.11.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.12.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.13.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.14.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.15.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.16.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.17.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.18.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.19.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.20.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.21.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.22.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.23.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.24.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.25.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.26.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.27.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.28.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.29.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.30.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors", "model.layers.31.block_sparse_moe.gate.weight": "model-00004-of-00004.safetensors"}}
special_tokens_map.json ADDED
@@ -0,0 +1,29 @@
+ {
+   "additional_special_tokens": [
+     "<unk>",
+     "<s>",
+     "</s>"
+   ],
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": "<s>",
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dadfd56d766715c61d2ef780a525ab43b8e6da4de6865bda3d95fdef5e134055
+ size 493443
tokenizer_config.json ADDED
@@ -0,0 +1,48 @@
+ {
+   "add_bos_token": true,
+   "add_eos_token": false,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [
+     "<unk>",
+     "<s>",
+     "</s>"
+   ],
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "</s>",
+   "legacy": true,
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "<s>",
+   "padding_side": "left",
+   "sp_model_kwargs": {},
+   "spaces_between_special_tokens": false,
+   "split_special_tokens": false,
+   "tokenizer_class": "LlamaTokenizer",
+   "unk_token": "<unk>",
+   "use_default_system_prompt": true
+ }
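
Taken together, `special_tokens_map.json` and `tokenizer_config.json` configure a `LlamaTokenizer` that prepends `<s>`, does not append `</s>`, and left-pads with `<s>`. A minimal sketch of what those settings imply at load time, assuming the files are published under the model's Hub repo (the repo id below is an assumption; a local path containing the files above would work the same way):

```python
# Sketch: observe the special-token behaviour defined by the config above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("CultriX/CultriX-MoE-BF16")  # assumed repo id

enc = tokenizer("Hello world")
print(enc.input_ids[0])                 # 1 -> <s> is prepended (add_bos_token: true)
print(tokenizer.decode(enc.input_ids))  # no trailing </s> (add_eos_token: false)

# padding_side: left and pad_token: <s> -> shorter sequences are left-padded with <s>
batch = tokenizer(["short", "a much longer input sentence"],
                  padding=True, return_tensors="pt")
print(batch.input_ids)
```
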