wwydmanski committed on
Commit
75da22d
1 Parent(s): 30d0c47

Upload folder using huggingface_hub

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
```json
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
```
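This pooling configuration enables mean pooling only: the Pooling module averages token embeddings over non-padding positions to produce one 768-dimensional sentence vector. A minimal numpy sketch of that computation (the token embeddings and attention mask below are made-up illustration data, not the library's actual module code):

```python
import numpy as np

# Hypothetical batch: 2 sequences, 4 token positions, 768-dim token embeddings.
token_embeddings = np.random.rand(2, 4, 768)
# 1 = real token, 0 = padding; the second sequence has one padded position.
attention_mask = np.array([[1, 1, 1, 1],
                           [1, 1, 1, 0]], dtype=np.float32)

# Mean pooling: sum token embeddings over unmasked positions, divide by count.
mask = attention_mask[:, :, None]               # (2, 4, 1), broadcasts over dims
summed = (token_embeddings * mask).sum(axis=1)  # (2, 768)
counts = mask.sum(axis=1)                       # (2, 1) number of real tokens
sentence_embeddings = summed / counts

print(sentence_embeddings.shape)  # (2, 768)
```

Padding positions contribute nothing to the sum and are excluded from the divisor, so padded and unpadded sequences of the same text pool to the same vector.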
README.md ADDED
@@ -0,0 +1,623 @@
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:100006
- loss:CachedMultipleNegativesRankingLoss
base_model: answerdotai/ModernBERT-base
widget:
- source_sentence: how much weight can you lose in a week healthy?
  sentences:
  - Biology
  - 'Summary: According to experts, losing 1–2 pounds (0.45–0.9 kg) per week is a
    healthy and safe rate, while losing more than this is considered too fast. However,
    you may lose more than that during your first week of an exercise or diet plan.'
  - The number of valence electrons is the number of electrons in the outer shell,
    that the atom uses for bonding. Nitrogen has 5 electrons in its n=2 (outer) shell.
- source_sentence: how long after having a baby can i get a tattoo?
  sentences:
  - It is suggested that mothers wait at least until 9-12 months after birth, when
    the child is no longer dependent solely on breastmilk before getting a tattoo.
    Reputable tattoo artists will have a waiver for the client to sign that asks about
    pregnancy and breastfeeding.
  - Medicine
  - Americans on average are down to 44 gallons of soda per year, and up to about
    58 gallons of water. That's 7,242 ounces of water annually -- 20 ounces daily,
    which is 2.5 cups.
- source_sentence: is all uhmw anti static?
  sentences:
  - The bacteria Streptococcus pyogenes causes it. It's most common in infants and
    children, but it frequently occurs in teenagers and adults as well. It causes
    white streaks or spots in the throat.
  - Chemistry
  - UHMW is available in a special anti-static grade that helps protect against EsD
    (static discharge) or to help keep dust and particles from building up on the
    product surface. The anti-static additives are built-in so the anti-static properties
    will last throughout the life of the material.
- source_sentence: is closing cost tax deductible?
  sentences:
  - Medicine
  - 1 tablespoon (tbsp) of granulated sugar equals to 12.5998 grams (g) in granulated
    sugar mass.
  - In general, the only settlement or closing costs you can deduct are home mortgage
    interest and certain real estate taxes. You deduct them in the year you buy your
    home if you itemize your deductions. ... See IRS Publication 530, "Tax Information
    for Homeowners" and look for "Settlement or closing costs" for more details.
- source_sentence: what is the connection between cancer and the cell cycle?
  sentences:
  - Biology
  - Conclusion. Cancer is unchecked cell growth. Mutations in genes can cause cancer
    by accelerating cell division rates or inhibiting normal controls on the system,
    such as cell cycle arrest or programmed cell death. As a mass of cancerous cells
    grows, it can develop into a tumor.
  - Your vomit may appear black if the blood has been oxidized by the acids in your
    stomach. The iron in your blood turns brown to black with time. Since the blood
    is no longer bright red, it means that the bleeding has either stopped or is only
    happening in a small amount.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on answerdotai/ModernBERT-base
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoNQ
      type: NanoNQ
    metrics:
    - type: cosine_accuracy@1
      value: 0.1
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.18
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.24
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.34
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.1
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.06
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.04800000000000001
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.034
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.1
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.15
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.21
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.31
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.19343658524041285
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.16590476190476192
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.17642959153410534
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoMSMARCO
      type: NanoMSMARCO
    metrics:
    - type: cosine_accuracy@1
      value: 0.12
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.28
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.4
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.52
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.12
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.09333333333333332
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.08
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.052000000000000005
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.12
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.28
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.4
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.52
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.2984940860938879
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.2304365079365079
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.24691442502099614
      name: Cosine Map@100
  - task:
      type: nano-beir
      name: Nano BEIR
    dataset:
      name: NanoBEIR mean
      type: NanoBEIR_mean
    metrics:
    - type: cosine_accuracy@1
      value: 0.11
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.23
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.32
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.43000000000000005
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.11
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.07666666666666666
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.064
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.043000000000000003
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.11
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.21500000000000002
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.305
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.41500000000000004
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.24596533566715037
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.1981706349206349
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.21167200827755073
      name: Cosine Map@100
---

# SentenceTransformer based on answerdotai/ModernBERT-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on the csv dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) <!-- at revision 5756c58a31a2478f9e62146021f48295a92c3da5 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - csv
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'what is the connection between cancer and the cell cycle?',
    'Conclusion. Cancer is unchecked cell growth. Mutations in genes can cause cancer by accelerating cell division rates or inhibiting normal controls on the system, such as cell cycle arrest or programmed cell death. As a mass of cancerous cells grows, it can develop into a tumor.',
    'Biology',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `NanoNQ` and `NanoMSMARCO`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | NanoNQ     | NanoMSMARCO |
|:--------------------|:-----------|:------------|
| cosine_accuracy@1   | 0.1        | 0.12        |
| cosine_accuracy@3   | 0.18       | 0.28        |
| cosine_accuracy@5   | 0.24       | 0.4         |
| cosine_accuracy@10  | 0.34       | 0.52        |
| cosine_precision@1  | 0.1        | 0.12        |
| cosine_precision@3  | 0.06       | 0.0933      |
| cosine_precision@5  | 0.048      | 0.08        |
| cosine_precision@10 | 0.034      | 0.052       |
| cosine_recall@1     | 0.1        | 0.12        |
| cosine_recall@3     | 0.15       | 0.28        |
| cosine_recall@5     | 0.21       | 0.4         |
| cosine_recall@10    | 0.31       | 0.52        |
| **cosine_ndcg@10**  | **0.1934** | **0.2985**  |
| cosine_mrr@10       | 0.1659     | 0.2304      |
| cosine_map@100      | 0.1764     | 0.2469      |

#### Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with [<code>NanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator)

| Metric              | Value     |
|:--------------------|:----------|
| cosine_accuracy@1   | 0.11      |
| cosine_accuracy@3   | 0.23      |
| cosine_accuracy@5   | 0.32      |
| cosine_accuracy@10  | 0.43      |
| cosine_precision@1  | 0.11      |
| cosine_precision@3  | 0.0767    |
| cosine_precision@5  | 0.064     |
| cosine_precision@10 | 0.043     |
| cosine_recall@1     | 0.11      |
| cosine_recall@3     | 0.215     |
| cosine_recall@5     | 0.305     |
| cosine_recall@10    | 0.415     |
| **cosine_ndcg@10**  | **0.246** |
| cosine_mrr@10       | 0.1982    |
| cosine_map@100      | 0.2117    |

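The accuracy, precision, and recall figures above are plain top-k counts, while NDCG and MRR additionally reward placing relevant hits near the top of the ranking. A small self-contained sketch of MRR@k and binary-relevance NDCG@k for a single query (toy ranking, not the evaluator's implementation):

```python
import math

def mrr_at_k(ranked_relevance, k=10):
    """Reciprocal rank of the first relevant result within the top k (0 if none)."""
    for rank, rel in enumerate(ranked_relevance[:k], start=1):
        if rel:
            return 1.0 / rank
    return 0.0

def ndcg_at_k(ranked_relevance, num_relevant, k=10):
    """Binary-relevance NDCG@k: DCG of the ranking divided by the ideal DCG."""
    dcg = sum(rel / math.log2(rank + 1)
              for rank, rel in enumerate(ranked_relevance[:k], start=1))
    ideal = sum(1.0 / math.log2(rank + 1)
                for rank in range(1, min(num_relevant, k) + 1))
    return dcg / ideal if ideal > 0 else 0.0

# Toy example: the single relevant document is retrieved at rank 3 of 10.
ranking = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
print(mrr_at_k(ranking))      # 1/3 ≈ 0.333
print(ndcg_at_k(ranking, 1))  # 1/log2(4) = 0.5
```

Averaging these per-query values over all evaluation queries yields table entries like `cosine_mrr@10` and `cosine_ndcg@10` above.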
<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### csv

* Dataset: csv
* Size: 100,006 training samples
* Columns: <code>question</code>, <code>answer</code>, and <code>category</code>
* Approximate statistics based on the first 1000 samples:
  |         | question                                                                          | answer                                                                              | category                                                                       |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              | string                                                                         |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.91 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 57.49 tokens</li><li>max: 136 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 4.0 tokens</li><li>max: 4 tokens</li></ul> |
* Samples:
  | question                                                              | answer                                                                                                                                                                                                                                                                                                            | category              |
  |:----------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------|
  | <code>how many times a week should you use heat on your hair?</code> | <code>Don't style hair with heat every day. Hot tools can also make hair look crispy and create split ends if overused. Blow out hair 3-5 times a week and try to limit your flat iron/curling iron usage to 1-2 times a week.”</code> | <code>Medicine</code> |
  | <code>do african violets like to be root bound?</code> | <code>African violets only bloom when they're root bound. When it is time to repot, be sure to use an organic potting soil made specifically for African violets, such as Espoma's African Violet Mix. They flower best in small pots — choose one that's about a third of the diameter of their leaf spread.</code> | <code>Biology</code> |
  | <code>is pgwp exempt from lmia?</code> | <code>The PGWP is exempt from Labour Market Impact Assessment (LMIA) requirements. The candidate must have attended a recognized post-secondary school, or a secondary school that offers qualifying programs, for at least eight months.</code> | <code>Medicine</code> |
* Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

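CachedMultipleNegativesRankingLoss treats, for each question, the answers paired with all other questions in the batch as negatives: cosine similarities are multiplied by the `scale` factor above and fed to a softmax cross-entropy whose target is the matching answer. A numpy sketch of that underlying in-batch-negatives objective, without the gradient caching that gives the loss its name (embeddings here are random stand-ins):

```python
import numpy as np

def mnr_loss(query_emb, answer_emb, scale=20.0):
    """In-batch multiple-negatives ranking loss (InfoNCE over cosine similarity).

    Each query's positive is the answer at the same row index; every other
    answer in the batch acts as a negative. `scale` matches the loss parameter.
    """
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    a = answer_emb / np.linalg.norm(answer_emb, axis=1, keepdims=True)
    logits = scale * (q @ a.T)                      # (batch, batch) similarities
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))             # cross-entropy on the diagonal

rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 768))
answers = queries + 0.01 * rng.normal(size=(4, 768))  # aligned pairs -> low loss
print(mnr_loss(queries, answers))
```

Because every other pair in the batch serves as a free negative, larger batches make the task harder and usually help retrieval quality; the cached variant in the library reaches large effective batch sizes (256 here) with bounded memory.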
### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `learning_rate`: 0.0001
- `num_train_epochs`: 1
- `warmup_ratio`: 0.05
- `bf16`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0001
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
| Epoch  | Step | Training Loss | NanoNQ_cosine_ndcg@10 | NanoMSMARCO_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
|:------:|:----:|:-------------:|:---------------------:|:--------------------------:|:----------------------------:|
| 0      | 0    | -             | 0.0388                | 0.0863                     | 0.0626                       |
| 0.0763 | 10   | 0.5482        | -                     | -                          | -                            |
| 0.1527 | 20   | 0.1079        | -                     | -                          | -                            |
| 0.2290 | 30   | 0.1491        | -                     | -                          | -                            |
| 0.3053 | 40   | 0.1381        | -                     | -                          | -                            |
| 0.3817 | 50   | 0.0873        | 0.0909                | 0.2197                     | 0.1553                       |
| 0.4580 | 60   | 0.133         | -                     | -                          | -                            |
| 0.5344 | 70   | 0.0539        | -                     | -                          | -                            |
| 0.6107 | 80   | 0.029         | -                     | -                          | -                            |
| 0.6870 | 90   | 0.0008        | -                     | -                          | -                            |
| 0.7634 | 100  | 0.0997        | 0.1982                | 0.2657                     | 0.2320                       |
| 0.8397 | 110  | 0.04          | -                     | -                          | -                            |
| 0.9160 | 120  | 0.0053        | -                     | -                          | -                            |
| 0.9924 | 130  | 0.0095        | -                     | -                          | -                            |
| 1.0    | 131  | -             | 0.1934                | 0.2985                     | 0.2460                       |


### Framework Versions
- Python: 3.12.3
- Sentence Transformers: 3.3.1
- Transformers: 4.48.0.dev0
- PyTorch: 2.5.1
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### CachedMultipleNegativesRankingLoss
```bibtex
@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config.json ADDED
@@ -0,0 +1,46 @@
```json
{
  "_name_or_path": "answerdotai/ModernBERT-base",
  "architectures": [
    "ModernBertModel"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 50281,
  "classifier_activation": "gelu",
  "classifier_bias": false,
  "classifier_dropout": 0.0,
  "classifier_pooling": "mean",
  "cls_token_id": 50281,
  "decoder_bias": true,
  "deterministic_flash_attn": false,
  "embedding_dropout": 0.0,
  "eos_token_id": 50282,
  "global_attn_every_n_layers": 3,
  "global_rope_theta": 160000.0,
  "gradient_checkpointing": false,
  "hidden_activation": "gelu",
  "hidden_size": 768,
  "initializer_cutoff_factor": 2.0,
  "initializer_range": 0.02,
  "intermediate_size": 1152,
  "layer_norm_eps": 1e-05,
  "local_attention": 128,
  "local_rope_theta": 10000.0,
  "max_position_embeddings": 8192,
  "mlp_bias": false,
  "mlp_dropout": 0.0,
  "model_type": "modernbert",
  "norm_bias": false,
  "norm_eps": 1e-05,
  "num_attention_heads": 12,
  "num_hidden_layers": 22,
  "pad_token_id": 50283,
  "position_embedding_type": "absolute",
  "reference_compile": true,
  "sep_token_id": 50282,
  "sparse_pred_ignore_index": -100,
  "sparse_prediction": false,
  "torch_dtype": "float32",
  "transformers_version": "4.48.0.dev0",
  "vocab_size": 50368
}
```
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
```json
{
  "__version__": {
    "sentence_transformers": "3.3.1",
    "transformers": "4.48.0.dev0",
    "pytorch": "2.5.1"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
```
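The `"similarity_fn_name": "cosine"` setting means embedding pairs are scored by cosine similarity. For reference, a minimal sketch with hypothetical 2-D vectors (not the library's code):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot product over norms."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(np.array([1.0, 0.0]), np.array([1.0, 0.0])))  # 1.0
print(cosine_similarity(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # 0.0
```

Cosine similarity depends only on direction, not vector length, which is why it pairs naturally with the mean-pooled embeddings this model produces.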
model.safetensors ADDED
@@ -0,0 +1,3 @@
```
version https://git-lfs.github.com/spec/v1
oid sha256:dd433720e592284eff6ff5505b5b8fc7c4aa9c23111e844d783ceb5bab0b03a1
size 596070136
```
modules.json ADDED
@@ -0,0 +1,14 @@
```json
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
```
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
```json
{
  "max_seq_length": 8192,
  "do_lower_case": false
}
```
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
```json
{
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": true,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
```
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
```json
{
  "added_tokens_decoder": {
    "0": {
      "content": "|||IP_ADDRESS|||",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "1": {
      "content": "<|padding|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50254": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50255": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50256": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50257": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50258": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50259": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50260": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50261": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50262": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50263": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50264": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50265": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50266": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50267": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50268": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50269": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
```
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "50270": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "50271": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "50272": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "50273": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "50274": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "50275": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "50276": {
196
+ "content": " ",
197
+ "lstrip": false,
198
+ "normalized": true,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": false
202
+ },
203
+ "50277": {
204
+ "content": "|||EMAIL_ADDRESS|||",
205
+ "lstrip": false,
206
+ "normalized": true,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": false
210
+ },
211
+ "50278": {
212
+ "content": "|||PHONE_NUMBER|||",
213
+ "lstrip": false,
214
+ "normalized": true,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": false
218
+ },
219
+ "50279": {
220
+ "content": "<|endoftext|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "50280": {
228
+ "content": "[UNK]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "50281": {
236
+ "content": "[CLS]",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "50282": {
244
+ "content": "[SEP]",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "50283": {
252
+ "content": "[PAD]",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "50284": {
260
+ "content": "[MASK]",
261
+ "lstrip": true,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "50285": {
268
+ "content": "[unused0]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "50286": {
276
+ "content": "[unused1]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "50287": {
284
+ "content": "[unused2]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "50288": {
292
+ "content": "[unused3]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "50289": {
300
+ "content": "[unused4]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "50290": {
308
+ "content": "[unused5]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "50291": {
316
+ "content": "[unused6]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "50292": {
324
+ "content": "[unused7]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "50293": {
332
+ "content": "[unused8]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "50294": {
340
+ "content": "[unused9]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "50295": {
348
+ "content": "[unused10]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "50296": {
356
+ "content": "[unused11]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "50297": {
364
+ "content": "[unused12]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "50298": {
372
+ "content": "[unused13]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "50299": {
380
+ "content": "[unused14]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "50300": {
388
+ "content": "[unused15]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "50301": {
396
+ "content": "[unused16]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "50302": {
404
+ "content": "[unused17]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "50303": {
412
+ "content": "[unused18]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "50304": {
420
+ "content": "[unused19]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "50305": {
428
+ "content": "[unused20]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ }
931
+ },
932
+ "clean_up_tokenization_spaces": true,
933
+ "cls_token": "[CLS]",
934
+ "extra_special_tokens": {},
935
+ "mask_token": "[MASK]",
936
+ "model_input_names": [
937
+ "input_ids",
938
+ "attention_mask"
939
+ ],
940
+ "model_max_length": 1000000000000000019884624838656,
941
+ "pad_token": "[PAD]",
942
+ "sep_token": "[SEP]",
943
+ "tokenizer_class": "PreTrainedTokenizerFast",
944
+ "unk_token": "[UNK]"
945
+ }
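As a minimal sketch of how the special-token fields in this `tokenizer_config.json` are typically consumed, the snippet below reads a small subset of the entries shown above and collects the tokens flagged `"special": true`. The helper `special_tokens` is a hypothetical illustration, not a `transformers` API; loading the real tokenizer would normally go through `AutoTokenizer.from_pretrained` instead.

```python
# Subset copied from the added_tokens_decoder mapping above; only the
# fields relevant to this sketch are kept.
config = {
    "added_tokens_decoder": {
        "50281": {"content": "[CLS]", "special": True},
        "50282": {"content": "[SEP]", "special": True},
        "50283": {"content": "[PAD]", "special": True},
        "50284": {"content": "[MASK]", "special": True},
        "50285": {"content": "[unused0]", "special": False},
    },
    "cls_token": "[CLS]",
    "sep_token": "[SEP]",
    "pad_token": "[PAD]",
}


def special_tokens(cfg):
    """Return {token_id: content} for entries flagged as special.

    Hypothetical helper for illustration only.
    """
    return {
        int(token_id): entry["content"]
        for token_id, entry in cfg["added_tokens_decoder"].items()
        if entry["special"]
    }


print(special_tokens(config))
# {50281: '[CLS]', 50282: '[SEP]', 50283: '[PAD]', 50284: '[MASK]'}
```

Entries like `[unused0]`–`[unused82]` are reserved vocabulary slots and are not marked special, so they pass through tokenization like ordinary tokens.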