Fizzarolli committed
Commit
335f4fa
1 Parent(s): e18eaf0

Add new SentenceTransformer model

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
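
This pooling configuration enables masked mean pooling (`pooling_mode_mean_tokens`) over the 768-dimensional token embeddings. A minimal PyTorch sketch of that operation (the `mean_pool` helper name is illustrative, not part of this repository):

```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average the token embeddings of real (non-padding) tokens."""
    # token_embeddings: (batch, seq_len, 768); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).type_as(token_embeddings)  # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)                  # sum over real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)                       # number of real tokens per sentence
    return summed / counts                                         # (batch, 768) sentence embeddings
```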
README.md ADDED
@@ -0,0 +1,1027 @@
---
language:
- en
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:557850
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: estrogen/ModernBERT-base-sbert-initialized
widget:
- source_sentence: A construction worker is standing on a crane placing a large arm
    on top of a stature in progress.
  sentences:
  - A man is playing with his camera.
  - A person standing
  - Nobody is standing
- source_sentence: A boy in red slides down an inflatable ride.
  sentences:
  - a baby smiling
  - A boy is playing on an inflatable ride.
  - A boy pierces a knife through an inflatable ride.
- source_sentence: A man in a black shirt is playing a guitar.
  sentences:
  - A group of women are selling their wares
  - The man is wearing black.
  - The man is wearing a blue shirt.
- source_sentence: A man with a large power drill standing next to his daughter with
    a vacuum cleaner hose.
  sentences:
  - A man holding a drill stands next to a girl holding a vacuum hose.
  - Kids ride an amusement ride.
  - The man and girl are painting the walls.
- source_sentence: A middle-aged man works under the engine of a train on rail tracks.
  sentences:
  - A guy is working on a train.
  - Two young asian men are squatting.
  - A guy is driving to work.
datasets:
- sentence-transformers/all-nli
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
model-index:
- name: SentenceTransformer based on estrogen/ModernBERT-base-sbert-initialized
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts dev
      type: sts-dev
    metrics:
    - type: pearson_cosine
      value: 0.8601586939371598
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.8650559283517015
      name: Spearman Cosine
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts test
      type: sts-test
    metrics:
    - type: pearson_cosine
      value: 0.8483904083763342
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.8504558364206114
      name: Spearman Cosine
---

# SentenceTransformer based on estrogen/ModernBERT-base-sbert-initialized

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [estrogen/ModernBERT-base-sbert-initialized](https://huggingface.co/estrogen/ModernBERT-base-sbert-initialized) on the [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [estrogen/ModernBERT-base-sbert-initialized](https://huggingface.co/estrogen/ModernBERT-base-sbert-initialized) <!-- at revision d80f2f10df59065d673fa4d9ef890aae3cbf4b68 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli)
- **Language:** en
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("estrogen/ModernBERT-base-nli-v3")
# Run inference
sentences = [
    'A middle-aged man works under the engine of a train on rail tracks.',
    'A guy is working on a train.',
    'A guy is driving to work.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
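
Because the model was trained with MatryoshkaLoss at dimensions 768/512/256/128/64 (see the training details below), the leading dimensions of each embedding can also be used on their own. A minimal sketch, assuming a sentence-transformers version that supports the `truncate_dim` argument (2.7 or newer):

```python
from sentence_transformers import SentenceTransformer

# Load the model so that encode() returns only the first 256 dimensions
model = SentenceTransformer("estrogen/ModernBERT-base-nli-v3", truncate_dim=256)
embeddings = model.encode([
    "A middle-aged man works under the engine of a train on rail tracks.",
    "A guy is working on a train.",
])
print(embeddings.shape)
# (2, 256) instead of (2, 768)

# Cosine similarity still works on the truncated vectors
print(model.similarity(embeddings, embeddings).shape)
# (2, 2)
```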

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Semantic Similarity

* Datasets: `sts-dev` and `sts-test`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | sts-dev    | sts-test   |
|:--------------------|:-----------|:-----------|
| pearson_cosine      | 0.8602     | 0.8484     |
| **spearman_cosine** | **0.8651** | **0.8505** |

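The scores above come from the evaluator linked in this section. A minimal sketch of re-running it, assuming the `sentence-transformers/stsb` dataset layout (`sentence1`, `sentence2`, `score`):

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("estrogen/ModernBERT-base-nli-v3")
stsb = load_dataset("sentence-transformers/stsb", split="validation")

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    name="sts-dev",
)
print(evaluator(model))
# Recent sentence-transformers versions return a dict that includes the
# Pearson and Spearman cosine scores, e.g. "sts-dev_spearman_cosine".
```
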
<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### all-nli

* Dataset: [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 557,850 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 7 tokens</li><li>mean: 10.46 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 12.91 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 13.49 tokens</li><li>max: 51 tokens</li></ul> |
* Samples:
  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> | <code>A person is at a diner, ordering an omelette.</code> |
  | <code>Children smiling and waving at camera</code> | <code>There are children present</code> | <code>The kids are frowning</code> |
  | <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> | <code>The boy skates down the sidewalk.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [
          768,
          512,
          256,
          128,
          64
      ],
      "matryoshka_weights": [
          1,
          1,
          1,
          1,
          1
      ],
      "n_dims_per_step": -1
  }
  ```

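For reference, a loss with the parameters above is typically constructed like this in sentence-transformers; this is a sketch, not the author's training script:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("estrogen/ModernBERT-base-sbert-initialized")

# MultipleNegativesRankingLoss ranks each anchor's positive against in-batch
# negatives; MatryoshkaLoss applies it at every truncated dimensionality.
loss = MatryoshkaLoss(
    model,
    MultipleNegativesRankingLoss(model),
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
    n_dims_per_step=-1,
)
```
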
### Evaluation Dataset

#### all-nli

* Dataset: [all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 6,584 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 6 tokens</li><li>mean: 18.25 tokens</li><li>max: 69 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 9.88 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.48 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | <code>Two women are embracing while holding to go packages.</code> | <code>Two woman are holding packages.</code> | <code>The men are fighting outside a deli.</code> |
  | <code>Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink.</code> | <code>Two kids in numbered jerseys wash their hands.</code> | <code>Two kids in jackets walk to school.</code> |
  | <code>A man selling donuts to a customer during a world exhibition event held in the city of Angeles</code> | <code>A man selling donuts to a customer.</code> | <code>A woman drinks her coffee in a small cafe.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [
          768,
          512,
          256,
          128,
          64
      ],
      "matryoshka_weights": [
          1,
          1,
          1,
          1,
          1
      ],
      "n_dims_per_step": -1
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 1024
- `per_device_eval_batch_size`: 1024
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `bf16`: True
- `batch_sampler`: no_duplicates

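Expressed as code, the non-default values above roughly correspond to the following `SentenceTransformerTrainingArguments` (a sketch, assuming sentence-transformers 3.x; `output_dir` is a placeholder):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="modernbert-base-nli",  # placeholder path
    num_train_epochs=1,
    per_device_train_batch_size=1024,
    per_device_eval_batch_size=1024,
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate texts within a batch
)
```
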
#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 1024
- `per_device_eval_batch_size`: 1024
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
<details><summary>Click to expand</summary>

410
+ | Epoch | Step | Training Loss | Validation Loss | sts-dev_spearman_cosine | sts-test_spearman_cosine |
411
+ |:------:|:----:|:-------------:|:---------------:|:-----------------------:|:------------------------:|
412
+ | 0 | 0 | - | - | 0.5576 | - |
413
+ | 0.0018 | 1 | 36.2556 | - | - | - |
414
+ | 0.0037 | 2 | 36.6329 | - | - | - |
415
+ | 0.0055 | 3 | 36.9705 | - | - | - |
416
+ | 0.0073 | 4 | 36.9173 | - | - | - |
417
+ | 0.0092 | 5 | 36.8254 | - | - | - |
418
+ | 0.0110 | 6 | 36.7313 | - | - | - |
419
+ | 0.0128 | 7 | 36.5865 | - | - | - |
420
+ | 0.0147 | 8 | 36.1709 | - | - | - |
421
+ | 0.0165 | 9 | 36.0519 | - | - | - |
422
+ | 0.0183 | 10 | 35.712 | - | - | - |
423
+ | 0.0202 | 11 | 35.4072 | - | - | - |
424
+ | 0.0220 | 12 | 35.0623 | - | - | - |
425
+ | 0.0239 | 13 | 34.6996 | - | - | - |
426
+ | 0.0257 | 14 | 34.2426 | - | - | - |
427
+ | 0.0275 | 15 | 33.6913 | - | - | - |
428
+ | 0.0294 | 16 | 33.2808 | - | - | - |
429
+ | 0.0312 | 17 | 32.5487 | - | - | - |
430
+ | 0.0330 | 18 | 31.6451 | - | - | - |
431
+ | 0.0349 | 19 | 30.7017 | - | - | - |
432
+ | 0.0367 | 20 | 29.8238 | - | - | - |
433
+ | 0.0385 | 21 | 28.7414 | - | - | - |
434
+ | 0.0404 | 22 | 27.316 | - | - | - |
435
+ | 0.0422 | 23 | 26.1119 | - | - | - |
436
+ | 0.0440 | 24 | 24.7211 | - | - | - |
437
+ | 0.0459 | 25 | 24.0007 | - | - | - |
438
+ | 0.0477 | 26 | 22.706 | - | - | - |
439
+ | 0.0495 | 27 | 21.7943 | - | - | - |
440
+ | 0.0514 | 28 | 21.5753 | - | - | - |
441
+ | 0.0532 | 29 | 20.9671 | - | - | - |
442
+ | 0.0550 | 30 | 20.5548 | - | - | - |
443
+ | 0.0569 | 31 | 20.263 | - | - | - |
444
+ | 0.0587 | 32 | 19.8474 | - | - | - |
445
+ | 0.0606 | 33 | 18.846 | - | - | - |
446
+ | 0.0624 | 34 | 18.5923 | - | - | - |
447
+ | 0.0642 | 35 | 17.8432 | - | - | - |
448
+ | 0.0661 | 36 | 17.6267 | - | - | - |
449
+ | 0.0679 | 37 | 17.1291 | - | - | - |
450
+ | 0.0697 | 38 | 16.6147 | - | - | - |
451
+ | 0.0716 | 39 | 16.1403 | - | - | - |
452
+ | 0.0734 | 40 | 16.5382 | - | - | - |
453
+ | 0.0752 | 41 | 15.7209 | - | - | - |
454
+ | 0.0771 | 42 | 15.565 | - | - | - |
455
+ | 0.0789 | 43 | 15.2099 | - | - | - |
456
+ | 0.0807 | 44 | 15.2644 | - | - | - |
457
+ | 0.0826 | 45 | 14.8458 | - | - | - |
458
+ | 0.0844 | 46 | 15.2214 | - | - | - |
459
+ | 0.0862 | 47 | 15.194 | - | - | - |
460
+ | 0.0881 | 48 | 15.53 | - | - | - |
461
+ | 0.0899 | 49 | 14.893 | - | - | - |
462
+ | 0.0917 | 50 | 14.4146 | - | - | - |
463
+ | 0.0936 | 51 | 14.4308 | - | - | - |
464
+ | 0.0954 | 52 | 13.8239 | - | - | - |
465
+ | 0.0972 | 53 | 13.9299 | - | - | - |
466
+ | 0.0991 | 54 | 14.6545 | - | - | - |
467
+ | 0.1009 | 55 | 14.3374 | - | - | - |
468
+ | 0.1028 | 56 | 14.5065 | - | - | - |
469
+ | 0.1046 | 57 | 13.8447 | - | - | - |
470
+ | 0.1064 | 58 | 14.179 | - | - | - |
471
+ | 0.1083 | 59 | 13.8866 | - | - | - |
472
+ | 0.1101 | 60 | 13.4879 | - | - | - |
473
+ | 0.1119 | 61 | 13.6273 | - | - | - |
474
+ | 0.1138 | 62 | 13.891 | - | - | - |
475
+ | 0.1156 | 63 | 13.6066 | - | - | - |
476
+ | 0.1174 | 64 | 13.4999 | - | - | - |
477
+ | 0.1193 | 65 | 13.9862 | - | - | - |
478
+ | 0.1211 | 66 | 13.4257 | - | - | - |
479
+ | 0.1229 | 67 | 13.9192 | - | - | - |
480
+ | 0.1248 | 68 | 13.5504 | - | - | - |
481
+ | 0.1266 | 69 | 13.3689 | - | - | - |
482
+ | 0.1284 | 70 | 13.4802 | - | - | - |
483
+ | 0.1303 | 71 | 13.0249 | - | - | - |
484
+ | 0.1321 | 72 | 13.2021 | - | - | - |
485
+ | 0.1339 | 73 | 13.1101 | - | - | - |
486
+ | 0.1358 | 74 | 13.0868 | - | - | - |
487
+ | 0.1376 | 75 | 12.8536 | - | - | - |
488
+ | 0.1394 | 76 | 12.9317 | - | - | - |
489
+ | 0.1413 | 77 | 12.6403 | - | - | - |
490
+ | 0.1431 | 78 | 12.9776 | - | - | - |
491
+ | 0.1450 | 79 | 13.1359 | - | - | - |
492
+ | 0.1468 | 80 | 13.0558 | - | - | - |
493
+ | 0.1486 | 81 | 13.0849 | - | - | - |
494
+ | 0.1505 | 82 | 12.6719 | - | - | - |
495
+ | 0.1523 | 83 | 12.5796 | - | - | - |
496
+ | 0.1541 | 84 | 12.472 | - | - | - |
497
+ | 0.1560 | 85 | 12.4221 | - | - | - |
498
+ | 0.1578 | 86 | 12.0878 | - | - | - |
499
+ | 0.1596 | 87 | 12.6923 | - | - | - |
500
+ | 0.1615 | 88 | 12.4428 | - | - | - |
501
+ | 0.1633 | 89 | 12.2897 | - | - | - |
502
+ | 0.1651 | 90 | 12.4254 | - | - | - |
503
+ | 0.1670 | 91 | 12.3808 | - | - | - |
504
+ | 0.1688 | 92 | 12.5224 | - | - | - |
505
+ | 0.1706 | 93 | 12.48 | - | - | - |
506
+ | 0.1725 | 94 | 11.8793 | - | - | - |
507
+ | 0.1743 | 95 | 11.8582 | - | - | - |
508
+ | 0.1761 | 96 | 12.5362 | - | - | - |
509
+ | 0.1780 | 97 | 12.3912 | - | - | - |
510
+ | 0.1798 | 98 | 12.7162 | - | - | - |
511
+ | 0.1817 | 99 | 12.4455 | - | - | - |
512
+ | 0.1835 | 100 | 12.4815 | 8.5398 | 0.8199 | - |
513
+ | 0.1853 | 101 | 12.1586 | - | - | - |
514
+ | 0.1872 | 102 | 11.8041 | - | - | - |
515
+ | 0.1890 | 103 | 11.6278 | - | - | - |
516
+ | 0.1908 | 104 | 11.8511 | - | - | - |
517
+ | 0.1927 | 105 | 11.762 | - | - | - |
518
+ | 0.1945 | 106 | 11.568 | - | - | - |
519
+ | 0.1963 | 107 | 11.8152 | - | - | - |
520
+ | 0.1982 | 108 | 11.9005 | - | - | - |
521
+ | 0.2 | 109 | 11.9282 | - | - | - |
522
+ | 0.2018 | 110 | 11.8451 | - | - | - |
523
+ | 0.2037 | 111 | 12.1208 | - | - | - |
524
+ | 0.2055 | 112 | 11.6718 | - | - | - |
525
+ | 0.2073 | 113 | 11.0296 | - | - | - |
526
+ | 0.2092 | 114 | 11.4185 | - | - | - |
527
+ | 0.2110 | 115 | 11.337 | - | - | - |
528
+ | 0.2128 | 116 | 10.9242 | - | - | - |
529
+ | 0.2147 | 117 | 11.0482 | - | - | - |
530
+ | 0.2165 | 118 | 11.3196 | - | - | - |
531
+ | 0.2183 | 119 | 11.1849 | - | - | - |
532
+ | 0.2202 | 120 | 10.9769 | - | - | - |
533
+ | 0.2220 | 121 | 10.5047 | - | - | - |
534
+ | 0.2239 | 122 | 11.1094 | - | - | - |
535
+ | 0.2257 | 123 | 11.2565 | - | - | - |
536
+ | 0.2275 | 124 | 11.1569 | - | - | - |
537
+ | 0.2294 | 125 | 11.5391 | - | - | - |
538
+ | 0.2312 | 126 | 10.8941 | - | - | - |
539
+ | 0.2330 | 127 | 10.8196 | - | - | - |
540
+ | 0.2349 | 128 | 11.0836 | - | - | - |
541
+ | 0.2367 | 129 | 11.4241 | - | - | - |
542
+ | 0.2385 | 130 | 11.4976 | - | - | - |
543
+ | 0.2404 | 131 | 10.938 | - | - | - |
544
+ | 0.2422 | 132 | 11.5283 | - | - | - |
545
+ | 0.2440 | 133 | 11.4238 | - | - | - |
546
+ | 0.2459 | 134 | 11.3364 | - | - | - |
547
+ | 0.2477 | 135 | 11.225 | - | - | - |
548
+ | 0.2495 | 136 | 11.0415 | - | - | - |
549
+ | 0.2514 | 137 | 10.8503 | - | - | - |
550
+ | 0.2532 | 138 | 10.9302 | - | - | - |
551
+ | 0.2550 | 139 | 10.5476 | - | - | - |
552
+ | 0.2569 | 140 | 10.8422 | - | - | - |
553
+ | 0.2587 | 141 | 10.4239 | - | - | - |
554
+ | 0.2606 | 142 | 10.5155 | - | - | - |
555
+ | 0.2624 | 143 | 10.589 | - | - | - |
556
+ | 0.2642 | 144 | 10.6116 | - | - | - |
557
+ | 0.2661 | 145 | 10.7158 | - | - | - |
558
+ | 0.2679 | 146 | 10.6952 | - | - | - |
559
+ | 0.2697 | 147 | 10.3678 | - | - | - |
560
+ | 0.2716 | 148 | 11.159 | - | - | - |
561
+ | 0.2734 | 149 | 11.3336 | - | - | - |
562
+ | 0.2752 | 150 | 10.7669 | - | - | - |
563
+ | 0.2771 | 151 | 10.5946 | - | - | - |
564
+ | 0.2789 | 152 | 10.9448 | - | - | - |
565
+ | 0.2807 | 153 | 10.7132 | - | - | - |
566
+ | 0.2826 | 154 | 10.5812 | - | - | - |
567
+ | 0.2844 | 155 | 10.7827 | - | - | - |
568
+ | 0.2862 | 156 | 10.7807 | - | - | - |
569
+ | 0.2881 | 157 | 10.7351 | - | - | - |
570
+ | 0.2899 | 158 | 10.7904 | - | - | - |
571
+ | 0.2917 | 159 | 10.5921 | - | - | - |
572
+ | 0.2936 | 160 | 10.2996 | - | - | - |
573
+ | 0.2954 | 161 | 10.2353 | - | - | - |
574
+ | 0.2972 | 162 | 10.2108 | - | - | - |
575
+ | 0.2991 | 163 | 10.089 | - | - | - |
576
+ | 0.3009 | 164 | 10.1736 | - | - | - |
577
+ | 0.3028 | 165 | 10.2599 | - | - | - |
578
+ | 0.3046 | 166 | 10.4347 | - | - | - |
579
+ | 0.3064 | 167 | 10.9999 | - | - | - |
580
+ | 0.3083 | 168 | 11.1655 | - | - | - |
581
+ | 0.3101 | 169 | 10.8125 | - | - | - |
582
+ | 0.3119 | 170 | 10.5497 | - | - | - |
583
+ | 0.3138 | 171 | 10.6918 | - | - | - |
584
+ | 0.3156 | 172 | 10.4792 | - | - | - |
585
+ | 0.3174 | 173 | 10.6018 | - | - | - |
586
+ | 0.3193 | 174 | 10.2092 | - | - | - |
587
+ | 0.3211 | 175 | 10.5625 | - | - | - |
588
+ | 0.3229 | 176 | 10.3539 | - | - | - |
589
+ | 0.3248 | 177 | 9.5403 | - | - | - |
590
+ | 0.3266 | 178 | 10.2351 | - | - | - |
591
+ | 0.3284 | 179 | 10.1557 | - | - | - |
592
+ | 0.3303 | 180 | 10.0721 | - | - | - |
593
+ | 0.3321 | 181 | 9.721 | - | - | - |
594
+ | 0.3339 | 182 | 9.7519 | - | - | - |
595
+ | 0.3358 | 183 | 9.7737 | - | - | - |
596
+ | 0.3376 | 184 | 9.5207 | - | - | - |
597
+ | 0.3394 | 185 | 9.6557 | - | - | - |
598
+ | 0.3413 | 186 | 9.7205 | - | - | - |
599
+ | 0.3431 | 187 | 9.9902 | - | - | - |
600
+ | 0.3450 | 188 | 10.1699 | - | - | - |
601
+ | 0.3468 | 189 | 10.5102 | - | - | - |
602
+ | 0.3486 | 190 | 10.2026 | - | - | - |
603
+ | 0.3505 | 191 | 10.1148 | - | - | - |
604
+ | 0.3523 | 192 | 9.5341 | - | - | - |
605
+ | 0.3541 | 193 | 9.5213 | - | - | - |
606
+ | 0.3560 | 194 | 9.7469 | - | - | - |
607
+ | 0.3578 | 195 | 10.1795 | - | - | - |
608
+ | 0.3596 | 196 | 10.3835 | - | - | - |
609
+ | 0.3615 | 197 | 10.7346 | - | - | - |
610
+ | 0.3633 | 198 | 9.9378 | - | - | - |
611
+ | 0.3651 | 199 | 9.7758 | - | - | - |
612
+ | 0.3670 | 200 | 10.3206 | 7.0991 | 0.8294 | - |
613
+ | 0.3688 | 201 | 9.7032 | - | - | - |
614
+ | 0.3706 | 202 | 9.8851 | - | - | - |
615
+ | 0.3725 | 203 | 9.9285 | - | - | - |
616
+ | 0.3743 | 204 | 10.0227 | - | - | - |
617
+ | 0.3761 | 205 | 9.8062 | - | - | - |
618
+ | 0.3780 | 206 | 9.9988 | - | - | - |
619
+ | 0.3798 | 207 | 10.0256 | - | - | - |
620
+ | 0.3817 | 208 | 9.8837 | - | - | - |
621
+ | 0.3835 | 209 | 10.0787 | - | - | - |
622
+ | 0.3853 | 210 | 9.5776 | - | - | - |
623
+ | 0.3872 | 211 | 9.6239 | - | - | - |
624
+ | 0.3890 | 212 | 9.717 | - | - | - |
625
+ | 0.3908 | 213 | 10.1639 | - | - | - |
626
+ | 0.3927 | 214 | 9.4994 | - | - | - |
627
+ | 0.3945 | 215 | 9.6895 | - | - | - |
628
+ | 0.3963 | 216 | 9.4938 | - | - | - |
629
+ | 0.3982 | 217 | 9.3008 | - | - | - |
630
+ | 0.4 | 218 | 9.6183 | - | - | - |
631
+ | 0.4018 | 219 | 9.3632 | - | - | - |
632
+ | 0.4037 | 220 | 9.3575 | - | - | - |
633
+ | 0.4055 | 221 | 9.4888 | - | - | - |
634
+ | 0.4073 | 222 | 9.337 | - | - | - |
635
+ | 0.4092 | 223 | 9.9598 | - | - | - |
636
+ | 0.4110 | 224 | 9.345 | - | - | - |
637
+ | 0.4128 | 225 | 9.2595 | - | - | - |
638
+ | 0.4147 | 226 | 9.3508 | - | - | - |
639
+ | 0.4165 | 227 | 9.8293 | - | - | - |
640
+ | 0.4183 | 228 | 9.8365 | - | - | - |
641
+ | 0.4202 | 229 | 9.6528 | - | - | - |
642
+ | 0.4220 | 230 | 9.9696 | - | - | - |
643
+ | 0.4239 | 231 | 10.113 | - | - | - |
644
+ | 0.4257 | 232 | 9.9706 | - | - | - |
645
+ | 0.4275 | 233 | 9.577 | - | - | - |
646
+ | 0.4294 | 234 | 9.7624 | - | - | - |
647
+ | 0.4312 | 235 | 9.5083 | - | - | - |
648
+ | 0.4330 | 236 | 9.5067 | - | - | - |
649
+ | 0.4349 | 237 | 9.1004 | - | - | - |
650
+ | 0.4367 | 238 | 8.914 | - | - | - |
651
+ | 0.4385 | 239 | 9.6852 | - | - | - |
652
+ | 0.4404 | 240 | 9.573 | - | - | - |
653
+ | 0.4422 | 241 | 9.8598 | - | - | - |
654
+ | 0.4440 | 242 | 10.1793 | - | - | - |
655
+ | 0.4459 | 243 | 10.2789 | - | - | - |
656
+ | 0.4477 | 244 | 9.9536 | - | - | - |
657
+ | 0.4495 | 245 | 9.3878 | - | - | - |
658
+ | 0.4514 | 246 | 9.6734 | - | - | - |
659
+ | 0.4532 | 247 | 9.3747 | - | - | - |
660
+ | 0.4550 | 248 | 8.8334 | - | - | - |
661
+ | 0.4569 | 249 | 9.7495 | - | - | - |
662
+ | 0.4587 | 250 | 8.8468 | - | - | - |
663
+ | 0.4606 | 251 | 9.3828 | - | - | - |
664
+ | 0.4624 | 252 | 9.1118 | - | - | - |
665
+ | 0.4642 | 253 | 9.3682 | - | - | - |
666
+ | 0.4661 | 254 | 9.3647 | - | - | - |
667
+ | 0.4679 | 255 | 9.8533 | - | - | - |
668
+ | 0.4697 | 256 | 9.2787 | - | - | - |
669
+ | 0.4716 | 257 | 8.9831 | - | - | - |
670
+ | 0.4734 | 258 | 9.0524 | - | - | - |
671
+ | 0.4752 | 259 | 9.5378 | - | - | - |
672
+ | 0.4771 | 260 | 9.4227 | - | - | - |
673
+ | 0.4789 | 261 | 9.3545 | - | - | - |
674
+ | 0.4807 | 262 | 8.8428 | - | - | - |
675
+ | 0.4826 | 263 | 9.1284 | - | - | - |
676
+ | 0.4844 | 264 | 8.7769 | - | - | - |
677
+ | 0.4862 | 265 | 9.0381 | - | - | - |
678
+ | 0.4881 | 266 | 9.0261 | - | - | - |
679
+ | 0.4899 | 267 | 8.811 | - | - | - |
680
+ | 0.4917 | 268 | 9.0848 | - | - | - |
681
+ | 0.4936 | 269 | 9.0951 | - | - | - |
682
+ | 0.4954 | 270 | 9.0682 | - | - | - |
683
+ | 0.4972 | 271 | 9.0418 | - | - | - |
684
+ | 0.4991 | 272 | 9.7316 | - | - | - |
685
+ | 0.5009 | 273 | 9.263 | - | - | - |
686
+ | 0.5028 | 274 | 9.624 | - | - | - |
687
+ | 0.5046 | 275 | 10.0133 | - | - | - |
688
+ | 0.5064 | 276 | 9.0789 | - | - | - |
689
+ | 0.5083 | 277 | 9.1399 | - | - | - |
690
+ | 0.5101 | 278 | 9.3854 | - | - | - |
691
+ | 0.5119 | 279 | 8.9982 | - | - | - |
692
+ | 0.5138 | 280 | 9.1342 | - | - | - |
693
+ | 0.5156 | 281 | 9.0517 | - | - | - |
694
+ | 0.5174 | 282 | 9.5637 | - | - | - |
695
+ | 0.5193 | 283 | 9.5213 | - | - | - |
696
+ | 0.5211 | 284 | 9.9231 | - | - | - |
697
+ | 0.5229 | 285 | 10.3441 | - | - | - |
698
+ | 0.5248 | 286 | 9.6162 | - | - | - |
699
+ | 0.5266 | 287 | 9.4794 | - | - | - |
700
+ | 0.5284 | 288 | 9.2728 | - | - | - |
701
+ | 0.5303 | 289 | 9.411 | - | - | - |
702
+ | 0.5321 | 290 | 9.5806 | - | - | - |
703
+ | 0.5339 | 291 | 9.4193 | - | - | - |
704
+ | 0.5358 | 292 | 9.3528 | - | - | - |
705
+ | 0.5376 | 293 | 9.7581 | - | - | - |
706
+ | 0.5394 | 294 | 9.4407 | - | - | - |
707
+ | 0.5413 | 295 | 9.027 | - | - | - |
708
+ | 0.5431 | 296 | 9.4272 | - | - | - |
709
+ | 0.5450 | 297 | 9.2733 | - | - | - |
710
+ | 0.5468 | 298 | 9.3 | - | - | - |
711
+ | 0.5486 | 299 | 9.6388 | - | - | - |
712
+ | 0.5505 | 300 | 9.0698 | 6.8356 | 0.8273 | - |
713
+ | 0.5523 | 301 | 9.4613 | - | - | - |
714
+ | 0.5541 | 302 | 9.9061 | - | - | - |
715
+ | 0.5560 | 303 | 9.3524 | - | - | - |
716
+ | 0.5578 | 304 | 9.1935 | - | - | - |
717
+ | 0.5596 | 305 | 9.1243 | - | - | - |
718
+ | 0.5615 | 306 | 8.8865 | - | - | - |
719
+ | 0.5633 | 307 | 9.4411 | - | - | - |
720
+ | 0.5651 | 308 | 9.1322 | - | - | - |
721
+ | 0.5670 | 309 | 9.3072 | - | - | - |
722
+ | 0.5688 | 310 | 8.4299 | - | - | - |
723
+ | 0.5706 | 311 | 8.9471 | - | - | - |
724
+ | 0.5725 | 312 | 8.5097 | - | - | - |
725
+ | 0.5743 | 313 | 9.1158 | - | - | - |
726
+ | 0.5761 | 314 | 9.0221 | - | - | - |
727
+ | 0.5780 | 315 | 9.5871 | - | - | - |
728
+ | 0.5798 | 316 | 9.3789 | - | - | - |
729
+ | 0.5817 | 317 | 9.1566 | - | - | - |
730
+ | 0.5835 | 318 | 9.0472 | - | - | - |
731
+ | 0.5853 | 319 | 8.947 | - | - | - |
732
+ | 0.5872 | 320 | 9.1791 | - | - | - |
733
+ | 0.5890 | 321 | 8.8764 | - | - | - |
734
+ | 0.5908 | 322 | 8.9794 | - | - | - |
735
+ | 0.5927 | 323 | 9.2044 | - | - | - |
736
+ | 0.5945 | 324 | 9.0374 | - | - | - |
737
+ | 0.5963 | 325 | 9.3389 | - | - | - |
738
+ | 0.5982 | 326 | 9.7387 | - | - | - |
739
+ | 0.6 | 327 | 9.4248 | - | - | - |
740
+ | 0.6018 | 328 | 9.4799 | - | - | - |
741
+ | 0.6037 | 329 | 8.9019 | - | - | - |
742
+ | 0.6055 | 330 | 9.113 | - | - | - |
743
+ | 0.6073 | 331 | 9.3148 | - | - | - |
744
+ | 0.6092 | 332 | 8.9871 | - | - | - |
745
+ | 0.6110 | 333 | 8.5404 | - | - | - |
746
+ | 0.6128 | 334 | 9.1587 | - | - | - |
747
+ | 0.6147 | 335 | 8.9698 | - | - | - |
748
+ | 0.6165 | 336 | 9.3393 | - | - | - |
749
+ | 0.6183 | 337 | 9.4845 | - | - | - |
750
+ | 0.6202 | 338 | 9.6075 | - | - | - |
751
+ | 0.6220 | 339 | 9.426 | - | - | - |
752
+ | 0.6239 | 340 | 9.0633 | - | - | - |
753
+ | 0.6257 | 341 | 9.1017 | - | - | - |
754
+ | 0.6275 | 342 | 9.2461 | - | - | - |
755
+ | 0.6294 | 343 | 9.065 | - | - | - |
756
+ | 0.6312 | 344 | 9.4668 | - | - | - |
757
+ | 0.6330 | 345 | 9.0267 | - | - | - |
758
+ | 0.6349 | 346 | 9.2938 | - | - | - |
759
+ | 0.6367 | 347 | 9.391 | - | - | - |
760
+ | 0.6385 | 348 | 9.2386 | - | - | - |
761
+ | 0.6404 | 349 | 9.5285 | - | - | - |
762
+ | 0.6422 | 350 | 9.5958 | - | - | - |
763
+ | 0.6440 | 351 | 9.157 | - | - | - |
764
+ | 0.6459 | 352 | 9.4166 | - | - | - |
765
+ | 0.6477 | 353 | 9.358 | - | - | - |
766
+ | 0.6495 | 354 | 9.4497 | - | - | - |
767
+ | 0.6514 | 355 | 9.407 | - | - | - |
768
+ | 0.6532 | 356 | 9.1505 | - | - | - |
769
+ | 0.6550 | 357 | 9.403 | - | - | - |
770
+ | 0.6569 | 358 | 9.1949 | - | - | - |
771
+ | 0.6587 | 359 | 8.7922 | - | - | - |
772
+ | 0.6606 | 360 | 8.883 | - | - | - |
773
+ | 0.6624 | 361 | 8.6828 | - | - | - |
774
+ | 0.6642 | 362 | 8.5654 | - | - | - |
775
+ | 0.6661 | 363 | 8.705 | - | - | - |
776
+ | 0.6679 | 364 | 8.8329 | - | - | - |
777
+ | 0.6697 | 365 | 9.1604 | - | - | - |
778
+ | 0.6716 | 366 | 9.1609 | - | - | - |
779
+ | 0.6734 | 367 | 9.4693 | - | - | - |
780
+ | 0.6752 | 368 | 9.1431 | - | - | - |
781
+ | 0.6771 | 369 | 8.7564 | - | - | - |
782
+ | 0.6789 | 370 | 9.1378 | - | - | - |
783
+ | 0.6807 | 371 | 8.8472 | - | - | - |
784
+ | 0.6826 | 372 | 8.9159 | - | - | - |
785
+ | 0.6844 | 373 | 8.9551 | - | - | - |
786
+ | 0.6862 | 374 | 9.2721 | - | - | - |
787
+ | 0.6881 | 375 | 8.7511 | - | - | - |
788
+ | 0.6899 | 376 | 9.1683 | - | - | - |
789
+ | 0.6917 | 377 | 8.8438 | - | - | - |
790
+ | 0.6936 | 378 | 8.6151 | - | - | - |
791
+ | 0.6954 | 379 | 8.7015 | - | - | - |
792
+ | 0.6972 | 380 | 7.6009 | - | - | - |
793
+ | 0.6991 | 381 | 7.3242 | - | - | - |
794
+ | 0.7009 | 382 | 7.4182 | - | - | - |
795
+ | 0.7028 | 383 | 7.2576 | - | - | - |
796
+ | 0.7046 | 384 | 7.0578 | - | - | - |
797
+ | 0.7064 | 385 | 6.0212 | - | - | - |
798
+ | 0.7083 | 386 | 5.9868 | - | - | - |
799
+ | 0.7101 | 387 | 6.033 | - | - | - |
800
+ | 0.7119 | 388 | 5.8085 | - | - | - |
801
+ | 0.7138 | 389 | 5.6002 | - | - | - |
802
+ | 0.7156 | 390 | 5.439 | - | - | - |
803
+ | 0.7174 | 391 | 5.1661 | - | - | - |
804
+ | 0.7193 | 392 | 5.1261 | - | - | - |
805
+ | 0.7211 | 393 | 5.5393 | - | - | - |
806
+ | 0.7229 | 394 | 4.8909 | - | - | - |
807
+ | 0.7248 | 395 | 5.2803 | - | - | - |
808
+ | 0.7266 | 396 | 5.1639 | - | - | - |
809
+ | 0.7284 | 397 | 4.7125 | - | - | - |
810
+ | 0.7303 | 398 | 4.842 | - | - | - |
811
+ | 0.7321 | 399 | 5.0971 | - | - | - |
812
+ | 0.7339 | 400 | 4.5101 | 5.0650 | 0.8590 | - |
813
+ | 0.7358 | 401 | 4.3422 | - | - | - |
814
+ | 0.7376 | 402 | 4.719 | - | - | - |
815
+ | 0.7394 | 403 | 4.1823 | - | - | - |
816
+ | 0.7413 | 404 | 3.7903 | - | - | - |
817
+ | 0.7431 | 405 | 3.886 | - | - | - |
818
+ | 0.7450 | 406 | 4.1115 | - | - | - |
819
+ | 0.7468 | 407 | 3.9201 | - | - | - |
820
+ | 0.7486 | 408 | 3.9291 | - | - | - |
821
+ | 0.7505 | 409 | 4.0412 | - | - | - |
822
+ | 0.7523 | 410 | 3.6614 | - | - | - |
823
+ | 0.7541 | 411 | 3.5718 | - | - | - |
824
+ | 0.7560 | 412 | 3.6689 | - | - | - |
825
+ | 0.7578 | 413 | 3.7457 | - | - | - |
826
+ | 0.7596 | 414 | 3.4272 | - | - | - |
827
+ | 0.7615 | 415 | 3.5112 | - | - | - |
828
+ | 0.7633 | 416 | 3.8348 | - | - | - |
829
+ | 0.7651 | 417 | 3.5177 | - | - | - |
830
+ | 0.7670 | 418 | 3.3441 | - | - | - |
831
+ | 0.7688 | 419 | 3.362 | - | - | - |
832
+ | 0.7706 | 420 | 3.4926 | - | - | - |
833
+ | 0.7725 | 421 | 3.4722 | - | - | - |
834
+ | 0.7743 | 422 | 2.8568 | - | - | - |
835
+ | 0.7761 | 423 | 3.3396 | - | - | - |
836
+ | 0.7780 | 424 | 2.972 | - | - | - |
837
+ | 0.7798 | 425 | 3.6889 | - | - | - |
838
+ | 0.7817 | 426 | 3.5154 | - | - | - |
839
+ | 0.7835 | 427 | 3.4098 | - | - | - |
840
+ | 0.7853 | 428 | 3.4569 | - | - | - |
841
+ | 0.7872 | 429 | 3.4916 | - | - | - |
842
+ | 0.7890 | 430 | 3.7394 | - | - | - |
843
+ | 0.7908 | 431 | 3.332 | - | - | - |
844
+ | 0.7927 | 432 | 3.3767 | - | - | - |
845
+ | 0.7945 | 433 | 3.1173 | - | - | - |
846
+ | 0.7963 | 434 | 3.2257 | - | - | - |
847
+ | 0.7982 | 435 | 3.3629 | - | - | - |
848
+ | 0.8 | 436 | 3.1992 | - | - | - |
849
+ | 0.8018 | 437 | 3.1252 | - | - | - |
850
+ | 0.8037 | 438 | 3.5155 | - | - | - |
851
+ | 0.8055 | 439 | 3.2583 | - | - | - |
852
+ | 0.8073 | 440 | 2.9001 | - | - | - |
853
+ | 0.8092 | 441 | 3.1542 | - | - | - |
854
+ | 0.8110 | 442 | 3.0473 | - | - | - |
855
+ | 0.8128 | 443 | 3.0446 | - | - | - |
856
+ | 0.8147 | 444 | 3.3807 | - | - | - |
857
+ | 0.8165 | 445 | 3.1246 | - | - | - |
858
+ | 0.8183 | 446 | 3.1922 | - | - | - |
859
+ | 0.8202 | 447 | 3.09 | - | - | - |
860
+ | 0.8220 | 448 | 3.4341 | - | - | - |
861
+ | 0.8239 | 449 | 3.0926 | - | - | - |
862
+ | 0.8257 | 450 | 2.9746 | - | - | - |
863
+ | 0.8275 | 451 | 3.1014 | - | - | - |
864
+ | 0.8294 | 452 | 3.2205 | - | - | - |
865
+ | 0.8312 | 453 | 3.1147 | - | - | - |
866
+ | 0.8330 | 454 | 2.9682 | - | - | - |
867
+ | 0.8349 | 455 | 3.1681 | - | - | - |
868
+ | 0.8367 | 456 | 2.9821 | - | - | - |
869
+ | 0.8385 | 457 | 2.8484 | - | - | - |
870
+ | 0.8404 | 458 | 3.0341 | - | - | - |
871
+ | 0.8422 | 459 | 3.0632 | - | - | - |
872
+ | 0.8440 | 460 | 3.2026 | - | - | - |
873
+ | 0.8459 | 461 | 3.132 | - | - | - |
874
+ | 0.8477 | 462 | 3.0209 | - | - | - |
875
+ | 0.8495 | 463 | 2.7183 | - | - | - |
876
+ | 0.8514 | 464 | 3.0257 | - | - | - |
877
+ | 0.8532 | 465 | 3.1462 | - | - | - |
878
+ | 0.8550 | 466 | 2.8747 | - | - | - |
879
+ | 0.8569 | 467 | 3.0932 | - | - | - |
880
+ | 0.8587 | 468 | 3.0097 | - | - | - |
881
+ | 0.8606 | 469 | 3.0956 | - | - | - |
882
+ | 0.8624 | 470 | 3.019 | - | - | - |
883
+ | 0.8642 | 471 | 3.1342 | - | - | - |
884
+ | 0.8661 | 472 | 2.688 | - | - | - |
885
+ | 0.8679 | 473 | 2.8892 | - | - | - |
886
+ | 0.8697 | 474 | 3.1589 | - | - | - |
887
+ | 0.8716 | 475 | 2.9274 | - | - | - |
888
+ | 0.8734 | 476 | 2.8554 | - | - | - |
889
+ | 0.8752 | 477 | 2.694 | - | - | - |
890
+ | 0.8771 | 478 | 2.7397 | - | - | - |
891
+ | 0.8789 | 479 | 2.6452 | - | - | - |
892
+ | 0.8807 | 480 | 3.0158 | - | - | - |
893
+ | 0.8826 | 481 | 3.0148 | - | - | - |
894
+ | 0.8844 | 482 | 2.5704 | - | - | - |
895
+ | 0.8862 | 483 | 2.6755 | - | - | - |
896
+ | 0.8881 | 484 | 2.7805 | - | - | - |
897
+ | 0.8899 | 485 | 2.8554 | - | - | - |
898
+ | 0.8917 | 486 | 2.6966 | - | - | - |
899
+ | 0.8936 | 487 | 2.8759 | - | - | - |
900
+ | 0.8954 | 488 | 2.8838 | - | - | - |
901
+ | 0.8972 | 489 | 2.7885 | - | - | - |
902
+ | 0.8991 | 490 | 2.7576 | - | - | - |
903
+ | 0.9009 | 491 | 2.9139 | - | - | - |
904
+ | 0.9028 | 492 | 2.6583 | - | - | - |
905
+ | 0.9046 | 493 | 2.9654 | - | - | - |
906
+ | 0.9064 | 494 | 2.551 | - | - | - |
907
+ | 0.9083 | 495 | 2.5596 | - | - | - |
908
+ | 0.9101 | 496 | 2.9595 | - | - | - |
909
+ | 0.9119 | 497 | 2.8677 | - | - | - |
910
+ | 0.9138 | 498 | 2.5793 | - | - | - |
911
+ | 0.9156 | 499 | 2.5415 | - | - | - |
912
+ | 0.9174 | 500 | 2.9738 | 4.8764 | 0.8651 | - |
913
+ | 0.9193 | 501 | 2.5838 | - | - | - |
914
+ | 0.9211 | 502 | 2.6544 | - | - | - |
915
+ | 0.9229 | 503 | 2.7046 | - | - | - |
916
+ | 0.9248 | 504 | 2.6339 | - | - | - |
917
+ | 0.9266 | 505 | 2.687 | - | - | - |
918
+ | 0.9284 | 506 | 2.7863 | - | - | - |
919
+ | 0.9303 | 507 | 2.7409 | - | - | - |
920
+ | 0.9321 | 508 | 2.656 | - | - | - |
921
+ | 0.9339 | 509 | 2.7456 | - | - | - |
922
+ | 0.9358 | 510 | 2.6589 | - | - | - |
923
+ | 0.9376 | 511 | 2.697 | - | - | - |
924
+ | 0.9394 | 512 | 2.6443 | - | - | - |
925
+ | 0.9413 | 513 | 2.7357 | - | - | - |
926
+ | 0.9431 | 514 | 2.969 | - | - | - |
927
+ | 0.9450 | 515 | 2.4175 | - | - | - |
928
+ | 0.9468 | 516 | 2.5424 | - | - | - |
929
+ | 0.9486 | 517 | 2.4773 | - | - | - |
930
+ | 0.9505 | 518 | 2.6269 | - | - | - |
931
+ | 0.9523 | 519 | 2.6288 | - | - | - |
932
+ | 0.9541 | 520 | 2.9471 | - | - | - |
933
+ | 0.9560 | 521 | 2.9775 | - | - | - |
934
+ | 0.9578 | 522 | 2.9949 | - | - | - |
935
+ | 0.9596 | 523 | 2.7084 | - | - | - |
936
+ | 0.9615 | 524 | 2.6431 | - | - | - |
937
+ | 0.9633 | 525 | 2.5849 | - | - | - |
938
+ | 0.9651 | 526 | 7.353 | - | - | - |
939
+ | 0.9670 | 527 | 9.1463 | - | - | - |
940
+ | 0.9688 | 528 | 10.9846 | - | - | - |
941
+ | 0.9706 | 529 | 10.6362 | - | - | - |
942
+ | 0.9725 | 530 | 10.0763 | - | - | - |
943
+ | 0.9743 | 531 | 9.7147 | - | - | - |
944
+ | 0.9761 | 532 | 9.3911 | - | - | - |
945
+ | 0.9780 | 533 | 9.3722 | - | - | - |
946
+ | 0.9798 | 534 | 10.794 | - | - | - |
947
+ | 0.9817 | 535 | 11.661 | - | - | - |
948
+ | 0.9835 | 536 | 11.4706 | - | - | - |
949
+ | 0.9853 | 537 | 12.0868 | - | - | - |
950
+ | 0.9872 | 538 | 12.0017 | - | - | - |
951
+ | 0.9890 | 539 | 11.7965 | - | - | - |
952
+ | 0.9908 | 540 | 12.5961 | - | - | - |
953
+ | 0.9927 | 541 | 9.6563 | - | - | - |
954
+ | 0.9945 | 542 | 11.5097 | - | - | - |
955
+ | 0.9963 | 543 | 12.0945 | - | - | - |
956
+ | 0.9982 | 544 | 10.7032 | - | - | - |
957
+ | 1.0 | 545 | 10.5622 | - | - | 0.8505 |

</details>

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.48.0.dev0
- PyTorch: 2.1.0+cu118
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config.json ADDED
@@ -0,0 +1,46 @@
{
  "_name_or_path": "estrogen/ModernBERT-base-sbert-initialized",
  "architectures": [
    "ModernBertModel"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 50281,
  "classifier_activation": "gelu",
  "classifier_bias": false,
  "classifier_dropout": 0.0,
  "classifier_pooling": "mean",
  "cls_token_id": 50281,
  "decoder_bias": true,
  "deterministic_flash_attn": false,
  "embedding_dropout": 0.0,
  "eos_token_id": 50282,
  "global_attn_every_n_layers": 3,
  "global_rope_theta": 160000.0,
  "gradient_checkpointing": false,
  "hidden_activation": "gelu",
  "hidden_size": 768,
  "initializer_cutoff_factor": 2.0,
  "initializer_range": 0.02,
  "intermediate_size": 1152,
  "layer_norm_eps": 1e-05,
  "local_attention": 128,
  "local_rope_theta": 10000.0,
  "max_position_embeddings": 8192,
  "mlp_bias": false,
  "mlp_dropout": 0.0,
  "model_type": "modernbert",
  "norm_bias": false,
  "norm_eps": 1e-05,
  "num_attention_heads": 12,
  "num_hidden_layers": 22,
  "pad_token_id": 50283,
  "position_embedding_type": "absolute",
  "reference_compile": true,
  "sep_token_id": 50282,
  "sparse_pred_ignore_index": -100,
  "sparse_prediction": false,
  "torch_dtype": "float32",
  "transformers_version": "4.48.0.dev0",
  "vocab_size": 50368
}
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.3.1",
    "transformers": "4.48.0.dev0",
    "pytorch": "2.1.0+cu118"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0414f69f9ea8e2a95b0391dd543ce718666e694203333399f249cf8f48b355f7
size 596070136
modules.json ADDED
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
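
modules.json wires the saved weights into a two-module pipeline: the ModernBERT transformer at the repository root followed by the pooling layer in `1_Pooling/`. A sketch of the equivalent manual composition (illustrative; loading by repository id as shown in the README is the normal route):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.models import Transformer, Pooling

word_embedding_model = Transformer("estrogen/ModernBERT-base-sbert-initialized", max_seq_length=8192)
pooling_model = Pooling(word_embedding_model.get_word_embedding_dimension(), pooling_mode="mean")
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
```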
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 8192,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
{
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": true,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
{
  "added_tokens_decoder": {
    "0": {
      "content": "|||IP_ADDRESS|||",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "1": {
      "content": "<|padding|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50254": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50255": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50256": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50257": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50258": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50259": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50260": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50261": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50262": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50263": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50264": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50265": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50266": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50267": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50268": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50269": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50270": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50271": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50272": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50273": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50274": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50275": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50276": {
      "content": " ",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50277": {
      "content": "|||EMAIL_ADDRESS|||",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50278": {
      "content": "|||PHONE_NUMBER|||",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50279": {
      "content": "<|endoftext|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50280": {
      "content": "[UNK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50281": {
      "content": "[CLS]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50282": {
      "content": "[SEP]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50283": {
      "content": "[PAD]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50284": {
      "content": "[MASK]",
      "lstrip": true,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "50285": {
      "content": "[unused0]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50286": {
      "content": "[unused1]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50287": {
      "content": "[unused2]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50288": {
      "content": "[unused3]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50289": {
      "content": "[unused4]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50290": {
      "content": "[unused5]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50291": {
      "content": "[unused6]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50292": {
      "content": "[unused7]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50293": {
      "content": "[unused8]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50294": {
      "content": "[unused9]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50295": {
      "content": "[unused10]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50296": {
      "content": "[unused11]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50297": {
      "content": "[unused12]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50298": {
      "content": "[unused13]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50299": {
      "content": "[unused14]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50300": {
      "content": "[unused15]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50301": {
      "content": "[unused16]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50302": {
      "content": "[unused17]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50303": {
      "content": "[unused18]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50304": {
      "content": "[unused19]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": false
    },
    "50305": {
      "content": "[unused20]",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ }
931
+ },
932
+ "clean_up_tokenization_spaces": true,
933
+ "cls_token": "[CLS]",
934
+ "extra_special_tokens": {},
935
+ "mask_token": "[MASK]",
936
+ "model_input_names": [
937
+ "input_ids",
938
+ "attention_mask"
939
+ ],
940
+ "model_max_length": 8192,
941
+ "pad_token": "[PAD]",
942
+ "sep_token": "[SEP]",
943
+ "tokenizer_class": "PreTrainedTokenizerFast",
944
+ "unk_token": "[UNK]"
945
+ }
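
For reference, a minimal sketch of how the settings in this `tokenizer_config.json` surface once the checkpoint is loaded with 🤗 Transformers. The repository id below is a placeholder, not a name taken from this commit; everything else only reads back values defined in the config above (the BERT-style special tokens and the 8192-token `model_max_length` inherited from ModernBERT).

```python
# Hedged sketch: load the exported fast tokenizer and verify the config values above.
from transformers import AutoTokenizer

# Placeholder repo id -- substitute the actual Hub path of this model.
tok = AutoTokenizer.from_pretrained("your-username/your-sentence-transformer")

# Special tokens declared in tokenizer_config.json should round-trip unchanged.
assert tok.cls_token == "[CLS]"
assert tok.sep_token == "[SEP]"
assert tok.pad_token == "[PAD]"
assert tok.mask_token == "[MASK]"
assert tok.unk_token == "[UNK]"

# model_max_length mirrors the 8192-token context window in the config.
print(tok.model_max_length)  # expected: 8192

# Inputs longer than the limit are truncated when encoding.
enc = tok(
    "A man in a black shirt is playing a guitar.",
    truncation=True,
    max_length=tok.model_max_length,
)
print(len(enc["input_ids"]))
```

Because `model_input_names` lists only `input_ids` and `attention_mask`, the tokenizer emits no `token_type_ids`, which matches what the ModernBERT-based SentenceTransformer expects at inference time.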