# SentenceTransformer
This is a sentence-transformers model trained on the parquet dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description

- Model Type: Sentence Transformer
- Maximum Sequence Length: 1024 tokens
- Output Dimensionality: 384 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: parquet
### Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
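The Pooling module above has `pooling_mode_mean_tokens: True`, i.e. the 384-dimensional sentence embedding is the mean of the token embeddings, ignoring padding positions. A minimal numpy sketch of that operation (the token embeddings here are random stand-ins, not real model output):

```python
import numpy as np

# Hypothetical token embeddings for one sentence: 4 token positions, 384 dims each.
token_embeddings = np.random.rand(4, 384)
# Attention mask: 1 for real tokens, 0 for padding.
attention_mask = np.array([1, 1, 1, 0])

# Mean pooling: average only over the non-padding token embeddings.
mask = attention_mask[:, None]                                      # shape (4, 1)
sentence_embedding = (token_embeddings * mask).sum(axis=0) / mask.sum()

print(sentence_embedding.shape)  # (384,)
```

With this mask, the result equals the plain mean of the first three token vectors; the padding row contributes nothing.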
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference:
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-TCE-Epoch-4")
# Run inference
sentences = [
    '[YEAR_RANGE] 2020-2024 [TEXT] Intraoperative Monitoring of the External Urethral Sphincter Reflex: A Novel Adjunct to Bulbocavernosus Reflex Neuromonitoring for Protecting the Sacral Neural Pathways Responsible for Urination, Defecation and Sexual Function.',
    '[YEAR_RANGE] 2020-2024 [TEXT] PURPOSE: Intraoperative bulbocavernosus reflex neuromonitoring has been utilized to protect bowel, bladder, and sexual function, providing a continuous functional assessment of the somatic sacral nervous system during surgeries where it is at risk. Bulbocavernosus reflex data may also provide additional functional insight, including an evaluation for spinal shock, distinguishing upper versus lower motor neuron injury (conus versus cauda syndromes) and prognosis for postoperative bowel and bladder function. Continuous intraoperative bulbocavernosus reflex monitoring has been utilized to provide the surgeon with an ongoing functional assessment of the anatomical elements involved in the S2-S4 mediated reflex arc including the conus, cauda equina and pudendal nerves. Intraoperative bulbocavernosus reflex monitoring typically includes the electrical activation of the dorsal nerves of the genitals to initiate the afferent component of the reflex, followed by recording the resulting muscle response using needle electromyography recordings from the external anal sphincter. METHODS: Herein we describe a complementary and novel technique that includes recording electromyography responses from the external urethral sphincter to monitor the external urethral sphincter reflex. Specialized foley catheters embedded with recording electrodes have recently become commercially available that provide the ability to perform intraoperative external urethral sphincter muscle recordings. RESULTS: We describe technical details and the potential utility of incorporating external urethral sphincter reflex recordings into existing sacral neuromonitoring paradigms to provide redundant yet complementary data streams. CONCLUSIONS: We present two illustrative neurosurgical oncology cases to demonstrate the utility of the external urethral sphincter reflex technique in the setting of the necessary surgical sacrifice of sacral nerve roots.',
    '[YEAR_RANGE] 2020-2024 [TEXT] Early menarche has been associated with adverse health outcomes, such as depressive symptoms. Discovering effect modifiers across these conditions in the pediatric population is a constant challenge. We tested whether movement behaviours modified the effect of the association between early menarche and depression symptoms among adolescents. This cross-sectional study included 2031 females aged 15-19 years across all Brazilian geographic regions. Data were collected using a self-administered questionnaire; 30.5% (n = 620) reported having experienced menarche before age 12 years (that is, early menarche). We used the Patient Health Questionnaire (PHQ-9) to evaluate depressive symptoms. Accruing any moderate-vigorous physical activity during leisure time, limited recreational screen time, and having good sleep quality were the exposures investigated. Adolescents who experienced early menarche and met one (B: -4.45, 95% CI: (-5.38, -3.51)), two (B: -6.07 (-7.02, -5.12)), or three (B: -6.49 (-7.76, -5.21)), and adolescents who experienced not early menarche and met one (B: -5.33 (-6.20; -4.46)), two (B: -6.12 (-6.99; -5.24)), or three (B: -6.27 (-7.30; -5.24)) of the movement behaviour targets had lower PHQ-9 scores for depression symptoms than adolescents who experienced early menarche and did not meet any of the movement behaviours. The disparities in depressive symptoms among the adolescents (early menarche versus not early menarche) who adhered to all three target behaviours were not statistically significant (B: 0.41 (-0.19; 1.01)). Adherence to movement behaviours modified the effect of the association between early menarche and depression symptoms.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
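Since the similarity function is cosine similarity, `model.similarity` with its default settings is equivalent to L2-normalizing the embeddings and taking a matrix product. A minimal numpy sketch on toy 4-dimensional vectors (stand-ins for the real `(3, 384)` output, so the model itself is not needed):

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity: normalize rows, then matrix product."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    normalized = embeddings / norms
    return normalized @ normalized.T

# Toy 3 x 4 embedding matrix standing in for the real (3, 384) output.
toy = np.array([[1.0, 0.0, 0.0, 0.0],
                [1.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0]])
sims = cosine_similarity_matrix(toy)
print(sims.shape)            # (3, 3)
print(round(sims[0, 1], 4))  # 0.7071, i.e. cos(45°)
```

Each diagonal entry is 1.0 (every vector has cosine similarity 1 with itself), which is why the similarity matrix for the three sentences above is symmetric with a unit diagonal.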
## Training Details

### Training Dataset

#### parquet

- Dataset: parquet
- Size: 26,147,930 training samples
- Columns: `anchor` and `positive`
- Approximate statistics based on the first 1000 samples:

  |         | anchor | positive |
  |:--------|:-------|:---------|
  | type    | string | string |
  | details | min: 28 tokens, mean: 57.85 tokens, max: 149 tokens | min: 43 tokens, mean: 319.08 tokens, max: 1024 tokens |
- Samples:

  | anchor | positive |
  |:-------|:---------|
  | [YEAR_RANGE] 1880-1884 [TEXT] ADDRESS OF COL. GARRICK MALLERY, U. S. ARMY. | [YEAR_RANGE] 1880-1884 [TEXT] It may be conceded that after man had all his present faculties, he did not choose between the adoption of voice and gesture, and never with those faculties, was in a state where the one was used, to the absolute exclusion of the other. The epoch, however, to which our speculations relate is that in which he had not reached the present symmetric development of his intellect and of his bodily organs, and the inquiry is: Which mode of communication was earliest adopted to his single wants and informed intelligence? With the voice he could imitate distinictively but few sounds of nature, while with gesture he could exhibit actions, motions, positions, forms, dimensions, directions and distances, with their derivations and analogues. It would seem from this unequal division of capacity that oral speech remained rudimentary long after gesture had become an efficient mode of communication. With due allowance for all purely imitative sounds, and for the spontaneous action of vocal organs under excitement, it appears that the connection between ideas and words is only to be explained by a compact between speaker and hearer which supposes the existence of a prior mode of communication. This was probably by gesture. At least we may accept it as a clew leading out of the labyrinth of philological confusion, and regulating the immemorial quest of man's primitive speech. |
  | [YEAR_RANGE] 1880-1884 [TEXT] How TO OBTAIN THE BRAIN OF THE CAT. | [YEAR_RANGE] 1880-1884 [TEXT] How to obtain the Brain of the Cat, (Wilder).-Correction: Page 158, second column, line 7, "grains," should be "grams;" page 159, near middle of 2nd column, "successily," should be "successively;" page 161, the number of Flower's paper is 3. |
  | [YEAR_RANGE] 1880-1884 [TEXT] DOLBEAR ON THE NATURE AND CONSTITUTION OF MATTER. | [YEAR_RANGE] 1880-1884 [TEXT] Mr. Dopp desires to make the following correction in his paper in the last issue: "In my article on page 200 of "Science", the expression and should have been and being the velocity of light. |
- Loss: `MultipleNegativesRankingLoss` with these parameters:

  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
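With `MultipleNegativesRankingLoss`, every other positive in the batch serves as an in-batch negative for a given anchor: cosine similarities between all anchors and all positives are scaled by 20.0 and scored with cross-entropy against the diagonal (anchor *i* matches positive *i*). A minimal numpy sketch of that computation, on toy embeddings rather than the trained model:

```python
import numpy as np

def mnrl_loss(anchors: np.ndarray, positives: np.ndarray, scale: float = 20.0) -> float:
    """Multiple negatives ranking loss with scaled cosine similarity (a sketch)."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)  # (batch, batch) matrix of scaled cos_sim values
    # Cross-entropy with labels on the diagonal: anchor i should rank positive i first.
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
loss_matched = mnrl_loss(anchors, anchors)                    # perfect pairs: near-zero loss
loss_random = mnrl_loss(anchors, rng.normal(size=(4, 8)))     # unrelated "positives"
print(loss_matched < loss_random)  # True
```

The `scale` of 20.0 sharpens the softmax so that the correct pair must clearly out-score the in-batch negatives, which is why well-matched pairs drive the loss toward zero.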
### Evaluation Dataset

#### parquet

- Dataset: parquet
- Size: 26,147,930 evaluation samples
- Columns: `anchor` and `positive`
- Approximate statistics based on the first 1000 samples:

  |         | anchor | positive |
  |:--------|:-------|:---------|
  | type    | string | string |
  | details | min: 27 tokens, mean: 43.78 tokens, max: 90 tokens | min: 28 tokens, mean: 315.03 tokens, max: 847 tokens |
- Samples:

  | anchor | positive |
  |:-------|:---------|
  | [YEAR_RANGE] 2020-2024 [TEXT] Solubility and thermodynamics of mesalazine in aqueous mixtures of poly ethylene glycol 200/600 at 293.2-313.2K. | [YEAR_RANGE] 2020-2024 [TEXT] In this study, the solubility of mesalazine was investigated in binary solvent mixtures of poly ethylene glycols 200/600 and water at temperatures ranging from 293.2K to 313.2K. The solubility of mesalazine was determined using a shake-flask method, and its concentrations were measured using a UV-Vis spectrophotometer. The obtained solubility data were analyzed using mathematical models including the van't Hoff, Jouyban-Acree, Jouyban-Acree-van't Hoff, mixture response surface, and modified Wilson models. The experimental data obtained for mesalazine dissolution encompassed various thermodynamic properties, including ΔG°, ΔH°, ΔS°, and TΔS°. These properties offer valuable insights into the energetic aspects of the dissolution process and were calculated based on the van't Hoff equation. |
  | [YEAR_RANGE] 2020-2024 [TEXT] Safety and efficacy of remimazolam versus propofol during EUS: a multicenter randomized controlled study. | [YEAR_RANGE] 2020-2024 [TEXT] BACKGROUND AND AIMS: Propofol, a widely used sedative in GI endoscopic procedures, is associated with cardiorespiratory suppression. Remimazolam is a novel ultrashort-acting benzodiazepine sedative with rapid onset and minimal cardiorespiratory depression. This study compared the safety and efficacy of remimazolam and propofol during EUS procedures. METHODS: A multicenter randomized controlled study was conducted between October 2022 and March 2023 in patients who underwent EUS procedures. Patients were randomly assigned to receive either remimazolam or propofol as a sedative agent. The primary endpoint was cardiorespiratory adverse events. |
  | [YEAR_RANGE] 2020-2024 [TEXT] Ultrasound-Guided Vs Non-Guided Prolotherapy for Internal Derangement of Temporomandibular Joint. A Randomized Clinical Trial. | [YEAR_RANGE] 2020-2024 [TEXT] OBJECTIVES: This randomized clinical trial study aims to compare ultrasound-guided versus non-guided Dextrose 10% injections in patients suffering from internal derangement in the temporomandibular joint (TMJ). MATERIAL AND METHODS: The study population included 22 patients and 43 TMJs suffering from unilateral or bilateral TMJ painful clicking, magnetic resonance imaging (MRI) proved disc displacement with reduction (DDWR), refractory to or failed conservative treatment. The patients were divided randomly into two groups (non-guided and ultrasound (US)-guided groups). The procedure involved injection of 2 mL solution of a mixture of 0.75 mL 0.9% normal saline solution, 0.3 mL 2% lidocaine and 0.75 mL dextrose 10% using a 25G needle in the joint and 1 mL intramuscular injection to the masseter muscle at the most tender point. The Visual Analogue Score (VAS) was used to compare joint pain intensity over four different periods, beginning with pre-injection, 1-, 2-, and 6-months postinjection. RESULTS: Twenty-two patients 5 males (n = 5/22, 22.7%) and 17 females (n = 17/22, 77.2%) were included in this study. The mean age was 27.3 ± 7.4 years (30.2 ± 7.0) for the non-guided group and 24.3 ± 6.9 for the US-guided group. The dextrose injection reduced intensity over time in both groups with statistically significant improvement (P value <.05) at 2 and 6 months in both groups. There was no statistically significant difference in VAS assessment between both groups. CONCLUSION: Intra-articular injection of dextrose 10% for patients with painful clicking and DDWR resulted in reduced pain intensity in both US-guided and non-guided groups with significant symptomatic improvement over time in both groups. US guidance allowed accurate anatomical localization and safe procedure with a single joint puncture. |
- Loss: `MultipleNegativesRankingLoss` with these parameters:

  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `learning_rate`: 2e-05
- `num_train_epochs`: 5
- `max_steps`: 970330
- `log_level`: info
- `fp16`: True
- `dataloader_num_workers`: 16
- `load_best_model_at_end`: True
- `resume_from_checkpoint`: True
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: 970330
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: info
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 16
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: True
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>
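The non-default hyperparameters above map onto a `SentenceTransformerTrainingArguments` configuration. A configuration sketch, assuming the Sentence Transformers v3+ trainer API; the `output_dir` is a placeholder, and this is not the actual training script used for this model:

```python
from sentence_transformers import SentenceTransformerTrainingArguments

# Sketch of the non-default hyperparameters listed above.
args = SentenceTransformerTrainingArguments(
    output_dir="output",          # placeholder, not taken from this card
    eval_strategy="steps",
    per_device_train_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=5,
    max_steps=970330,
    log_level="info",
    fp16=True,
    dataloader_num_workers=16,
    load_best_model_at_end=True,
    resume_from_checkpoint=True,
)
```

These arguments would then be passed to a `SentenceTransformerTrainer` together with the model, the train/eval datasets, and the `MultipleNegativesRankingLoss` described above.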
### Training Logs

| Epoch | Step | Training Loss | Validation Loss |
|:-----:|:----:|:-------------:|:---------------:|
0.0000 | 1 | 5.9163 | - |
0.0052 | 1000 | 0.6371 | - |
0.0103 | 2000 | 0.1741 | - |
0.0155 | 3000 | 0.1606 | - |
0.0206 | 4000 | 0.1496 | - |
0.0258 | 5000 | 0.111 | - |
0.0309 | 6000 | 0.1198 | - |
0.0361 | 7000 | 0.1047 | - |
0.0412 | 8000 | 0.1002 | - |
0.0464 | 9000 | 0.0991 | - |
0.0515 | 10000 | 0.1142 | - |
0.0567 | 11000 | 0.1027 | - |
0.0618 | 12000 | 0.0752 | - |
0.0670 | 13000 | 0.1036 | - |
0.0721 | 14000 | 0.1114 | - |
0.0773 | 15000 | 0.0701 | - |
0.0824 | 16000 | 0.1623 | - |
0.0876 | 17000 | 0.0727 | - |
0.0928 | 18000 | 0.1121 | - |
0.0979 | 19000 | 0.0684 | - |
0.1031 | 20000 | 0.1327 | - |
0.1082 | 21000 | 0.0755 | - |
0.1134 | 22000 | 0.1293 | - |
0.1185 | 23000 | 0.0661 | - |
0.1237 | 24000 | 0.0797 | - |
0.1288 | 25000 | 0.063 | - |
0.1340 | 26000 | 0.1324 | - |
0.1391 | 27000 | 0.0708 | - |
0.1443 | 28000 | 0.1386 | - |
0.1494 | 29000 | 0.0691 | - |
0.1546 | 30000 | 0.143 | - |
0.1597 | 31000 | 0.0644 | - |
0.1649 | 32000 | 0.1065 | - |
0.1700 | 33000 | 0.1089 | - |
0.1752 | 34000 | 0.0737 | - |
0.1804 | 35000 | 0.1431 | - |
0.1855 | 36000 | 0.069 | - |
0.1907 | 37000 | 0.0756 | - |
0.1958 | 38000 | 0.1252 | - |
0.2010 | 39000 | 0.0768 | - |
0.2061 | 40000 | 0.1255 | - |
0.2113 | 41000 | 0.0849 | - |
0.2164 | 42000 | 0.072 | - |
0.2216 | 43000 | 0.1171 | - |
0.2267 | 44000 | 0.072 | - |
0.2319 | 45000 | 0.0818 | - |
0.2370 | 46000 | 0.0988 | - |
0.2422 | 47000 | 0.066 | - |
0.2473 | 48000 | 0.0829 | - |
0.2525 | 49000 | 0.0907 | - |
0.2576 | 50000 | 0.075 | - |
0.2628 | 51000 | 0.0653 | - |
0.2679 | 52000 | 0.0667 | - |
0.2731 | 53000 | 0.0906 | - |
0.2783 | 54000 | 0.0803 | - |
0.2834 | 55000 | 0.0568 | - |
0.2886 | 56000 | 0.0665 | - |
0.2937 | 57000 | 0.0629 | - |
0.2989 | 58000 | 0.0665 | - |
0.3040 | 59000 | 0.0601 | - |
0.3092 | 60000 | 0.0761 | - |
0.3143 | 61000 | 0.0472 | - |
0.3195 | 62000 | 0.0527 | - |
0.3246 | 63000 | 0.0585 | - |
0.3298 | 64000 | 0.0699 | - |
0.3349 | 65000 | 0.0687 | - |
0.3401 | 66000 | 0.0541 | - |
0.3452 | 67000 | 0.0413 | - |
0.3504 | 68000 | 0.055 | - |
0.3555 | 69000 | 0.0706 | - |
0.3607 | 70000 | 0.0805 | - |
0.3659 | 71000 | 0.0884 | - |
0.3710 | 72000 | 0.0359 | - |
0.3762 | 73000 | 0.0443 | - |
0.3813 | 74000 | 0.0543 | - |
0.3865 | 75000 | 0.045 | - |
0.3916 | 76000 | 0.1031 | - |
0.3968 | 77000 | 0.0535 | - |
0.4019 | 78000 | 0.0661 | - |
0.4071 | 79000 | 0.0353 | - |
0.4122 | 80000 | 0.06 | - |
0.4174 | 81000 | 0.0743 | - |
0.4225 | 82000 | 0.0439 | - |
0.4277 | 83000 | 0.0484 | - |
0.4328 | 84000 | 0.0643 | - |
0.4380 | 85000 | 0.0308 | - |
0.4431 | 86000 | 0.0412 | - |
0.4483 | 87000 | 0.0464 | - |
0.4535 | 88000 | 0.0713 | - |
0.4586 | 89000 | 0.0575 | - |
0.4638 | 90000 | 0.0461 | - |
0.4689 | 91000 | 0.0358 | - |
0.4741 | 92000 | 0.0413 | - |
0.4792 | 93000 | 0.0481 | - |
0.4844 | 94000 | 0.0473 | - |
0.4895 | 95000 | 0.082 | - |
0.4947 | 96000 | 0.0313 | - |
0.4998 | 97000 | 0.0365 | - |
0.5050 | 98000 | 0.0338 | - |
0.5101 | 99000 | 0.0364 | - |
0.5153 | 100000 | 0.0288 | - |
0.5204 | 101000 | 0.0421 | - |
0.5256 | 102000 | 0.0444 | - |
0.5307 | 103000 | 0.0242 | - |
0.5359 | 104000 | 0.0318 | - |
0.5411 | 105000 | 0.0285 | - |
0.5462 | 106000 | 0.0231 | - |
0.5514 | 107000 | 0.0534 | - |
0.5565 | 108000 | 0.0469 | - |
0.5617 | 109000 | 0.031 | - |
0.5668 | 110000 | 0.0376 | - |
0.5720 | 111000 | 0.0403 | - |
0.5771 | 112000 | 0.0408 | - |
0.5823 | 113000 | 0.0284 | - |
0.5874 | 114000 | 0.0344 | - |
0.5926 | 115000 | 0.0469 | - |
0.5977 | 116000 | 0.0304 | - |
0.6029 | 117000 | 0.0676 | - |
0.6080 | 118000 | 0.0396 | - |
0.6132 | 119000 | 0.0337 | - |
0.6183 | 120000 | 0.039 | - |
0.6235 | 121000 | 0.0286 | - |
0.6286 | 122000 | 0.0404 | - |
0.6338 | 123000 | 0.0383 | - |
0.6390 | 124000 | 0.046 | - |
0.6441 | 125000 | 0.0403 | - |
0.6493 | 126000 | 0.0502 | - |
0.6544 | 127000 | 0.0424 | - |
0.6596 | 128000 | 0.0424 | - |
0.6647 | 129000 | 0.0338 | - |
0.6699 | 130000 | 0.0262 | - |
0.6750 | 131000 | 0.0203 | - |
0.6802 | 132000 | 0.0405 | - |
0.6853 | 133000 | 0.0374 | - |
0.6905 | 134000 | 0.0329 | - |
0.6956 | 135000 | 0.0287 | - |
0.7008 | 136000 | 0.0366 | - |
0.7059 | 137000 | 0.0344 | - |
0.7111 | 138000 | 0.0402 | - |
0.7162 | 139000 | 0.0331 | - |
0.7214 | 140000 | 0.0404 | - |
0.7266 | 141000 | 0.0433 | - |
0.7317 | 142000 | 0.032 | - |
0.7369 | 143000 | 0.0281 | - |
0.7420 | 144000 | 0.0265 | - |
0.7472 | 145000 | 0.0282 | - |
0.7523 | 146000 | 0.0233 | - |
0.7575 | 147000 | 0.0291 | - |
0.7626 | 148000 | 0.0358 | - |
0.7678 | 149000 | 0.0343 | - |
0.7729 | 150000 | 0.0292 | - |
0.7781 | 151000 | 0.0359 | - |
0.7832 | 152000 | 0.0361 | - |
0.7884 | 153000 | 0.0289 | - |
0.7935 | 154000 | 0.0374 | - |
0.7987 | 155000 | 0.0341 | - |
0.8038 | 156000 | 0.0353 | - |
0.8090 | 157000 | 0.033 | - |
0.8142 | 158000 | 0.0291 | - |
0.8193 | 159000 | 0.0362 | - |
0.8245 | 160000 | 0.0355 | - |
0.8296 | 161000 | 0.026 | - |
0.8348 | 162000 | 0.0237 | - |
0.8399 | 163000 | 0.0175 | - |
0.8451 | 164000 | 0.0219 | - |
0.8502 | 165000 | 0.0227 | - |
0.8554 | 166000 | 0.0177 | - |
0.8605 | 167000 | 0.0239 | - |
0.8657 | 168000 | 0.0223 | - |
0.8708 | 169000 | 0.0219 | - |
0.8760 | 170000 | 0.0248 | - |
0.8811 | 171000 | 0.0237 | - |
0.8863 | 172000 | 0.0262 | - |
0.8914 | 173000 | 0.026 | - |
0.8966 | 174000 | 0.0228 | - |
0.9018 | 175000 | 0.0259 | - |
0.9069 | 176000 | 0.0232 | - |
0.9121 | 177000 | 0.0268 | - |
0.9172 | 178000 | 0.0228 | - |
0.9224 | 179000 | 0.0198 | - |
0.9275 | 180000 | 0.0183 | - |
0.9327 | 181000 | 0.022 | - |
0.9378 | 182000 | 0.0227 | - |
0.9430 | 183000 | 0.021 | - |
0.9481 | 184000 | 0.0211 | - |
0.9533 | 185000 | 0.0216 | - |
0.9584 | 186000 | 0.0209 | - |
0.9636 | 187000 | 0.0191 | - |
0.9687 | 188000 | 0.0188 | - |
0.9739 | 189000 | 0.0203 | - |
0.9790 | 190000 | 0.0203 | - |
0.9842 | 191000 | 0.0313 | - |
0.9893 | 192000 | 0.0213 | - |
0.9945 | 193000 | 0.0164 | - |
0.9997 | 194000 | 0.0181 | - |
1.0000 | 194066 | - | 0.0006 |
1.0048 | 195000 | 0.2333 | - |
1.0100 | 196000 | 0.0787 | - |
1.0151 | 197000 | 0.0848 | - |
1.0203 | 198000 | 0.0813 | - |
1.0254 | 199000 | 0.0624 | - |
1.0306 | 200000 | 0.0691 | - |
1.0357 | 201000 | 0.0596 | - |
1.0409 | 202000 | 0.0575 | - |
1.0460 | 203000 | 0.0573 | - |
1.0512 | 204000 | 0.0684 | - |
1.0563 | 205000 | 0.0507 | - |
1.0615 | 206000 | 0.0548 | - |
1.0666 | 207000 | 0.0607 | - |
1.0718 | 208000 | 0.0713 | - |
1.0769 | 209000 | 0.0425 | - |
1.0821 | 210000 | 0.113 | - |
1.0873 | 211000 | 0.0432 | - |
1.0924 | 212000 | 0.0727 | - |
1.0976 | 213000 | 0.0431 | - |
1.1027 | 214000 | 0.0909 | - |
1.1079 | 215000 | 0.0453 | - |
1.1130 | 216000 | 0.087 | - |
1.1182 | 217000 | 0.0442 | - |
1.1233 | 218000 | 0.0503 | - |
1.1285 | 219000 | 0.0413 | - |
1.1336 | 220000 | 0.0899 | - |
1.1388 | 221000 | 0.0463 | - |
1.1439 | 222000 | 0.0956 | - |
1.1491 | 223000 | 0.0452 | - |
1.1542 | 224000 | 0.098 | - |
1.1594 | 225000 | 0.0426 | - |
1.1645 | 226000 | 0.0597 | - |
1.1697 | 227000 | 0.085 | - |
1.1749 | 228000 | 0.0487 | - |
1.1800 | 229000 | 0.0984 | - |
1.1852 | 230000 | 0.0465 | - |
1.1903 | 231000 | 0.0492 | - |
1.1955 | 232000 | 0.0864 | - |
1.2006 | 233000 | 0.0489 | - |
1.2058 | 234000 | 0.0855 | - |
1.2109 | 235000 | 0.0579 | - |
1.2161 | 236000 | 0.0455 | - |
1.2212 | 237000 | 0.0811 | - |
1.2264 | 238000 | 0.0488 | - |
1.2315 | 239000 | 0.0547 | - |
1.2367 | 240000 | 0.0691 | - |
1.2418 | 241000 | 0.0426 | - |
1.2470 | 242000 | 0.0528 | - |
1.2521 | 243000 | 0.0552 | - |
1.2573 | 244000 | 0.0607 | - |
1.2625 | 245000 | 0.0421 | - |
1.2676 | 246000 | 0.0434 | - |
1.2728 | 247000 | 0.0632 | - |
1.2779 | 248000 | 0.0546 | - |
1.2831 | 249000 | 0.0375 | - |
1.2882 | 250000 | 0.038 | - |
1.2934 | 251000 | 0.0471 | - |
1.2985 | 252000 | 0.0441 | - |
1.3037 | 253000 | 0.0383 | - |
1.3088 | 254000 | 0.0521 | - |
1.3140 | 255000 | 0.033 | - |
1.3191 | 256000 | 0.0339 | - |
1.3243 | 257000 | 0.0363 | - |
1.3294 | 258000 | 0.0429 | - |
1.3346 | 259000 | 0.0523 | - |
1.3397 | 260000 | 0.0353 | - |
1.3449 | 261000 | 0.0271 | - |
1.3500 | 262000 | 0.0364 | - |
1.3552 | 263000 | 0.0477 | - |
1.3604 | 264000 | 0.0532 | - |
1.3655 | 265000 | 0.0595 | - |
1.3707 | 266000 | 0.0237 | - |
1.3758 | 267000 | 0.0239 | - |
1.3810 | 268000 | 0.0389 | - |
1.3861 | 269000 | 0.0288 | - |
1.3913 | 270000 | 0.0728 | - |
1.3964 | 271000 | 0.0365 | - |
1.4016 | 272000 | 0.038 | - |
1.4067 | 273000 | 0.0285 | - |
1.4119 | 274000 | 0.0246 | - |
1.4170 | 275000 | 0.0692 | - |
1.4222 | 276000 | 0.0281 | - |
1.4273 | 277000 | 0.0322 | - |
1.4325 | 278000 | 0.0451 | - |
1.4376 | 279000 | 0.0202 | - |
1.4428 | 280000 | 0.0274 | - |
1.4480 | 281000 | 0.0254 | - |
1.4531 | 282000 | 0.0539 | - |
1.4583 | 283000 | 0.0407 | - |
1.4634 | 284000 | 0.0286 | - |
1.4686 | 285000 | 0.0241 | - |
1.4737 | 286000 | 0.029 | - |
1.4789 | 287000 | 0.0337 | - |
1.4840 | 288000 | 0.0317 | - |
1.4892 | 289000 | 0.0406 | - |
1.4943 | 290000 | 0.0377 | - |
1.4995 | 291000 | 0.0238 | - |
1.5046 | 292000 | 0.0226 | - |
1.5098 | 293000 | 0.0233 | - |
1.5149 | 294000 | 0.0186 | - |
1.5201 | 295000 | 0.0255 | - |
1.5252 | 296000 | 0.0305 | - |
1.5304 | 297000 | 0.0154 | - |
1.5356 | 298000 | 0.0214 | - |
1.5407 | 299000 | 0.0187 | - |
1.5459 | 300000 | 0.0143 | - |
1.5510 | 301000 | 0.03 | - |
1.5562 | 302000 | 0.0346 | - |
1.5613 | 303000 | 0.0209 | - |
1.5665 | 304000 | 0.0206 | - |
1.5716 | 305000 | 0.0239 | - |
1.5768 | 306000 | 0.0262 | - |
1.5819 | 307000 | 0.0179 | - |
1.5871 | 308000 | 0.0198 | - |
1.5922 | 309000 | 0.0288 | - |
1.5974 | 310000 | 0.0192 | - |
1.6025 | 311000 | 0.0435 | - |
1.6077 | 312000 | 0.0251 | - |
1.6128 | 313000 | 0.0205 | - |
1.6180 | 314000 | 0.0246 | - |
1.6232 | 315000 | 0.0176 | - |
1.6283 | 316000 | 0.026 | - |
1.6335 | 317000 | 0.025 | - |
1.6386 | 318000 | 0.029 | - |
1.6438 | 319000 | 0.0274 | - |
1.6489 | 320000 | 0.0343 | - |
1.6541 | 321000 | 0.028 | - |
1.6592 | 322000 | 0.0282 | - |
1.6644 | 323000 | 0.0239 | - |
1.6695 | 324000 | 0.017 | - |
1.6747 | 325000 | 0.0132 | - |
1.6798 | 326000 | 0.0252 | - |
1.6850 | 327000 | 0.0243 | - |
1.6901 | 328000 | 0.0232 | - |
1.6953 | 329000 | 0.0183 | - |
1.7004 | 330000 | 0.0244 | - |
1.7056 | 331000 | 0.0239 | - |
1.7107 | 332000 | 0.0277 | - |
1.7159 | 333000 | 0.0223 | - |
1.7211 | 334000 | 0.0252 | - |
1.7262 | 335000 | 0.0302 | - |
1.7314 | 336000 | 0.0224 | - |
1.7365 | 337000 | 0.0188 | - |
1.7417 | 338000 | 0.0174 | - |
1.7468 | 339000 | 0.0189 | - |
1.7520 | 340000 | 0.0152 | - |
1.7571 | 341000 | 0.0185 | - |
1.7623 | 342000 | 0.024 | - |
1.7674 | 343000 | 0.0249 | - |
1.7726 | 344000 | 0.0202 | - |
1.7777 | 345000 | 0.0248 | - |
1.7829 | 346000 | 0.0256 | - |
1.7880 | 347000 | 0.022 | - |
1.7932 | 348000 | 0.0271 | - |
1.7983 | 349000 | 0.024 | - |
1.8035 | 350000 | 0.0241 | - |
1.8087 | 351000 | 0.0243 | - |
1.8138 | 352000 | 0.018 | - |
1.8190 | 353000 | 0.0236 | - |
1.8241 | 354000 | 0.0237 | - |
1.8293 | 355000 | 0.0175 | - |
1.8344 | 356000 | 0.015 | - |
1.8396 | 357000 | 0.0111 | - |
1.8447 | 358000 | 0.014 | - |
1.8499 | 359000 | 0.0146 | - |
1.8550 | 360000 | 0.0108 | - |
1.8602 | 361000 | 0.0157 | - |
1.8653 | 362000 | 0.0142 | - |
1.8705 | 363000 | 0.0129 | - |
1.8756 | 364000 | 0.0168 | - |
1.8808 | 365000 | 0.0155 | - |
1.8859 | 366000 | 0.017 | - |
1.8911 | 367000 | 0.0164 | - |
1.8963 | 368000 | 0.0156 | - |
1.9014 | 369000 | 0.0168 | - |
1.9066 | 370000 | 0.015 | - |
1.9117 | 371000 | 0.018 | - |
1.9169 | 372000 | 0.0151 | - |
1.9220 | 373000 | 0.0132 | - |
1.9272 | 374000 | 0.0117 | - |
1.9323 | 375000 | 0.0142 | - |
1.9375 | 376000 | 0.0151 | - |
1.9426 | 377000 | 0.0142 | - |
1.9478 | 378000 | 0.0139 | - |
1.9529 | 379000 | 0.014 | - |
1.9581 | 380000 | 0.0141 | - |
1.9632 | 381000 | 0.012 | - |
1.9684 | 382000 | 0.0127 | - |
1.9735 | 383000 | 0.0131 | - |
1.9787 | 384000 | 0.0131 | - |
1.9839 | 385000 | 0.0158 | - |
1.9890 | 386000 | 0.0211 | - |
1.9942 | 387000 | 0.011 | - |
1.9993 | 388000 | 0.0118 | - |
2.0000 | 388132 | - | 0.0005 |
2.0045 | 389000 | 0.1757 | - |
2.0096 | 390000 | 0.0632 | - |
2.0148 | 391000 | 0.0611 | - |
2.0199 | 392000 | 0.074 | - |
2.0251 | 393000 | 0.0492 | - |
2.0302 | 394000 | 0.0558 | - |
2.0354 | 395000 | 0.0474 | - |
2.0405 | 396000 | 0.0451 | - |
2.0457 | 397000 | 0.0451 | - |
2.0508 | 398000 | 0.0569 | - |
2.0560 | 399000 | 0.0395 | - |
2.0611 | 400000 | 0.0433 | - |
2.0663 | 401000 | 0.0475 | - |
2.0714 | 402000 | 0.0511 | - |
2.0766 | 403000 | 0.0403 | - |
2.0818 | 404000 | 0.0957 | - |
2.0869 | 405000 | 0.0319 | - |
2.0921 | 406000 | 0.0601 | - |
2.0972 | 407000 | 0.0333 | - |
2.1024 | 408000 | 0.0763 | - |
2.1075 | 409000 | 0.0339 | - |
2.1127 | 410000 | 0.072 | - |
2.1178 | 411000 | 0.0343 | - |
2.1230 | 412000 | 0.0398 | - |
2.1281 | 413000 | 0.0312 | - |
2.1333 | 414000 | 0.0756 | - |
2.1384 | 415000 | 0.0367 | - |
2.1436 | 416000 | 0.0797 | - |
2.1487 | 417000 | 0.0359 | - |
2.1539 | 418000 | 0.0812 | - |
2.1590 | 419000 | 0.0343 | - |
2.1642 | 420000 | 0.0431 | - |
2.1694 | 421000 | 0.0754 | - |
2.1745 | 422000 | 0.0384 | - |
2.1797 | 423000 | 0.0673 | - |
2.1848 | 424000 | 0.0514 | - |
2.1900 | 425000 | 0.0406 | - |
2.1951 | 426000 | 0.0703 | - |
2.2003 | 427000 | 0.0404 | - |
2.2054 | 428000 | 0.0699 | - |
2.2106 | 429000 | 0.0456 | - |
2.2157 | 430000 | 0.0375 | - |
2.2209 | 431000 | 0.0657 | - |
2.2260 | 432000 | 0.0386 | - |
2.2312 | 433000 | 0.044 | - |
2.2363 | 434000 | 0.0572 | - |
2.2415 | 435000 | 0.0334 | - |
2.2466 | 436000 | 0.0403 | - |
2.2518 | 437000 | 0.0446 | - |
2.2570 | 438000 | 0.0496 | - |
2.2621 | 439000 | 0.0326 | - |
2.2673 | 440000 | 0.0345 | - |
2.2724 | 441000 | 0.0489 | - |
2.2776 | 442000 | 0.0437 | - |
2.2827 | 443000 | 0.0297 | - |
2.2879 | 444000 | 0.0303 | - |
2.2930 | 445000 | 0.0377 | - |
2.2982 | 446000 | 0.0334 | - |
2.3033 | 447000 | 0.0297 | - |
2.3085 | 448000 | 0.0402 | - |
2.3136 | 449000 | 0.027 | - |
2.3188 | 450000 | 0.0266 | - |
2.3239 | 451000 | 0.0275 | - |
2.3291 | 452000 | 0.0327 | - |
2.3342 | 453000 | 0.0446 | - |
2.3394 | 454000 | 0.0261 | - |
2.3446 | 455000 | 0.0202 | - |
2.3497 | 456000 | 0.0286 | - |
2.3549 | 457000 | 0.0369 | - |
2.3600 | 458000 | 0.0416 | - |
2.3652 | 459000 | 0.0478 | - |
2.3703 | 460000 | 0.0177 | - |
2.3755 | 461000 | 0.0178 | - |
2.3806 | 462000 | 0.0294 | - |
2.3858 | 463000 | 0.0229 | - |
2.3909 | 464000 | 0.0602 | - |
2.3961 | 465000 | 0.0274 | - |
2.4012 | 466000 | 0.0223 | - |
2.4064 | 467000 | 0.0296 | - |
2.4115 | 468000 | 0.0182 | - |
2.4167 | 469000 | 0.0567 | - |
2.4218 | 470000 | 0.0199 | - |
2.4270 | 471000 | 0.0246 | - |
2.4321 | 472000 | 0.0382 | - |
2.4373 | 473000 | 0.0151 | - |
2.4425 | 474000 | 0.021 | - |
2.4476 | 475000 | 0.013 | - |
2.4528 | 476000 | 0.0472 | - |
2.4579 | 477000 | 0.034 | - |
2.4631 | 478000 | 0.0219 | - |
2.4682 | 479000 | 0.0186 | - |
2.4734 | 480000 | 0.0221 | - |
2.4785 | 481000 | 0.0241 | - |
2.4837 | 482000 | 0.0244 | - |
2.4888 | 483000 | 0.0288 | - |
2.4940 | 484000 | 0.0364 | - |
2.4991 | 485000 | 0.0178 | - |
2.5043 | 486000 | 0.0148 | - |
2.5094 | 487000 | 0.0198 | - |
2.5146 | 488000 | 0.0136 | - |
2.5197 | 489000 | 0.0191 | - |
2.5249 | 490000 | 0.0235 | - |
2.5301 | 491000 | 0.0115 | - |
2.5352 | 492000 | 0.0161 | - |
2.5404 | 493000 | 0.0139 | - |
2.5455 | 494000 | 0.01 | - |
2.5507 | 495000 | 0.022 | - |
2.5558 | 496000 | 0.0257 | - |
2.5610 | 497000 | 0.0177 | - |
2.5661 | 498000 | 0.0144 | - |
2.5713 | 499000 | 0.0174 | - |
2.5764 | 500000 | 0.0192 | - |
2.5816 | 501000 | 0.0129 | - |
2.5867 | 502000 | 0.0147 | - |
2.5919 | 503000 | 0.0212 | - |
2.5970 | 504000 | 0.0128 | - |
2.6022 | 505000 | 0.0349 | - |
2.6073 | 506000 | 0.0194 | - |
2.6125 | 507000 | 0.013 | - |
2.6177 | 508000 | 0.0196 | - |
2.6228 | 509000 | 0.0131 | - |
2.6280 | 510000 | 0.0191 | - |
2.6331 | 511000 | 0.019 | - |
2.6383 | 512000 | 0.0224 | - |
2.6434 | 513000 | 0.0197 | - |
2.6486 | 514000 | 0.0279 | - |
2.6537 | 515000 | 0.0213 | - |
2.6589 | 516000 | 0.0216 | - |
2.6640 | 517000 | 0.0182 | - |
2.6692 | 518000 | 0.0126 | - |
2.6743 | 519000 | 0.0102 | - |
2.6795 | 520000 | 0.0185 | - |
2.6846 | 521000 | 0.0181 | - |
2.6898 | 522000 | 0.0187 | - |
2.6949 | 523000 | 0.0135 | - |
2.7001 | 524000 | 0.0192 | - |
2.7053 | 525000 | 0.0179 | - |
2.7104 | 526000 | 0.0222 | - |
2.7156 | 527000 | 0.0179 | - |
2.7207 | 528000 | 0.0174 | - |
2.7259 | 529000 | 0.0254 | - |
2.7310 | 530000 | 0.0176 | - |
2.7362 | 531000 | 0.0146 | - |
2.7413 | 532000 | 0.0126 | - |
2.7465 | 533000 | 0.0143 | - |
2.7516 | 534000 | 0.012 | - |
2.7568 | 535000 | 0.0131 | - |
2.7619 | 536000 | 0.018 | - |
2.7671 | 537000 | 0.0205 | - |
2.7722 | 538000 | 0.0162 | - |
2.7774 | 539000 | 0.0201 | - |
2.7825 | 540000 | 0.0209 | - |
2.7877 | 541000 | 0.0168 | - |
2.7928 | 542000 | 0.0222 | - |
2.7980 | 543000 | 0.0193 | - |
2.8032 | 544000 | 0.0194 | - |
2.8083 | 545000 | 0.0196 | - |
2.8135 | 546000 | 0.0126 | - |
2.8186 | 547000 | 0.0177 | - |
2.8238 | 548000 | 0.018 | - |
2.8289 | 549000 | 0.0135 | - |
2.8341 | 550000 | 0.0107 | - |
2.8392 | 551000 | 0.0081 | - |
2.8444 | 552000 | 0.0107 | - |
2.8495 | 553000 | 0.0101 | - |
2.8547 | 554000 | 0.0081 | - |
2.8598 | 555000 | 0.0111 | - |
2.8650 | 556000 | 0.0105 | - |
2.8701 | 557000 | 0.0096 | - |
2.8753 | 558000 | 0.0121 | - |
2.8804 | 559000 | 0.0117 | - |
2.8856 | 560000 | 0.0119 | - |
2.8908 | 561000 | 0.0121 | - |
2.8959 | 562000 | 0.0118 | - |
2.9011 | 563000 | 0.0121 | - |
2.9062 | 564000 | 0.0115 | - |
2.9114 | 565000 | 0.0131 | - |
2.9165 | 566000 | 0.0117 | - |
2.9217 | 567000 | 0.0097 | - |
2.9268 | 568000 | 0.0084 | - |
2.9320 | 569000 | 0.0099 | - |
2.9371 | 570000 | 0.0109 | - |
2.9423 | 571000 | 0.0111 | - |
2.9474 | 572000 | 0.01 | - |
2.9526 | 573000 | 0.0099 | - |
2.9577 | 574000 | 0.0107 | - |
2.9629 | 575000 | 0.0085 | - |
2.9680 | 576000 | 0.0097 | - |
2.9732 | 577000 | 0.0092 | - |
2.9784 | 578000 | 0.0101 | - |
2.9835 | 579000 | 0.0111 | - |
2.9887 | 580000 | 0.0176 | - |
2.9938 | 581000 | 0.0081 | - |
2.9990 | 582000 | 0.0087 | - |
3.0000 | 582198 | - | 0.0005 |
3.0041 | 583000 | 0.1382 | - |
3.0093 | 584000 | 0.0599 | - |
3.0144 | 585000 | 0.0522 | - |
3.0196 | 586000 | 0.0627 | - |
3.0247 | 587000 | 0.0426 | - |
3.0299 | 588000 | 0.0473 | - |
3.0350 | 589000 | 0.0391 | - |
3.0402 | 590000 | 0.038 | - |
3.0453 | 591000 | 0.0376 | - |
3.0505 | 592000 | 0.0479 | - |
3.0556 | 593000 | 0.033 | - |
3.0608 | 594000 | 0.0361 | - |
3.0660 | 595000 | 0.0393 | - |
3.0711 | 596000 | 0.042 | - |
3.0763 | 597000 | 0.0365 | - |
3.0814 | 598000 | 0.0835 | - |
3.0866 | 599000 | 0.0265 | - |
3.0917 | 600000 | 0.0525 | - |
3.0969 | 601000 | 0.0264 | - |
3.1020 | 602000 | 0.0682 | - |
3.1072 | 603000 | 0.0273 | - |
3.1123 | 604000 | 0.0619 | - |
3.1175 | 605000 | 0.0292 | - |
3.1226 | 606000 | 0.0326 | - |
3.1278 | 607000 | 0.0254 | - |
3.1329 | 608000 | 0.0655 | - |
3.1381 | 609000 | 0.0306 | - |
3.1432 | 610000 | 0.0692 | - |
3.1484 | 611000 | 0.03 | - |
3.1536 | 612000 | 0.0708 | - |
3.1587 | 613000 | 0.0276 | - |
3.1639 | 614000 | 0.0278 | - |
3.1690 | 615000 | 0.0759 | - |
3.1742 | 616000 | 0.0316 | - |
3.1793 | 617000 | 0.0574 | - |
3.1845 | 618000 | 0.0441 | - |
3.1896 | 619000 | 0.0325 | - |
3.1948 | 620000 | 0.0613 | - |
3.1999 | 621000 | 0.0337 | - |
3.2051 | 622000 | 0.059 | - |
3.2102 | 623000 | 0.0373 | - |
3.2154 | 624000 | 0.033 | - |
3.2205 | 625000 | 0.0553 | - |
3.2257 | 626000 | 0.0316 | - |
3.2308 | 627000 | 0.0372 | - |
3.2360 | 628000 | 0.0469 | - |
3.2411 | 629000 | 0.0291 | - |
3.2463 | 630000 | 0.0319 | - |
3.2515 | 631000 | 0.0371 | - |
3.2566 | 632000 | 0.0436 | - |
3.2618 | 633000 | 0.0258 | - |
3.2669 | 634000 | 0.0287 | - |
3.2721 | 635000 | 0.0397 | - |
3.2772 | 636000 | 0.0379 | - |
3.2824 | 637000 | 0.0249 | - |
3.2875 | 638000 | 0.025 | - |
3.2927 | 639000 | 0.031 | - |
3.2978 | 640000 | 0.028 | - |
3.3030 | 641000 | 0.0244 | - |
3.3081 | 642000 | 0.0266 | - |
3.3133 | 643000 | 0.0287 | - |
3.3184 | 644000 | 0.0216 | - |
3.3236 | 645000 | 0.0217 | - |
3.3287 | 646000 | 0.0268 | - |
3.3339 | 647000 | 0.0379 | - |
3.3391 | 648000 | 0.0209 | - |
3.3442 | 649000 | 0.016 | - |
3.3494 | 650000 | 0.0227 | - |
3.3545 | 651000 | 0.0312 | - |
3.3597 | 652000 | 0.0335 | - |
3.3648 | 653000 | 0.041 | - |
3.3700 | 654000 | 0.0145 | - |
3.3751 | 655000 | 0.0148 | - |
3.3803 | 656000 | 0.0232 | - |
3.3854 | 657000 | 0.0185 | - |
3.3906 | 658000 | 0.0524 | - |
3.3957 | 659000 | 0.0213 | - |
3.4009 | 660000 | 0.0169 | - |
3.4060 | 661000 | 0.0267 | - |
3.4112 | 662000 | 0.014 | - |
3.4163 | 663000 | 0.0492 | - |
3.4215 | 664000 | 0.0154 | - |
3.4267 | 665000 | 0.0191 | - |
3.4318 | 666000 | 0.0335 | - |
3.4370 | 667000 | 0.0112 | - |
3.4421 | 668000 | 0.0168 | - |
3.4473 | 669000 | 0.0104 | - |
3.4524 | 670000 | 0.036 | - |
3.4576 | 671000 | 0.0304 | - |
3.4627 | 672000 | 0.0192 | - |
3.4679 | 673000 | 0.0151 | - |
3.4730 | 674000 | 0.0178 | - |
3.4782 | 675000 | 0.0196 | - |
3.4833 | 676000 | 0.02 | - |
3.4885 | 677000 | 0.0243 | - |
3.4936 | 678000 | 0.0311 | - |
3.4988 | 679000 | 0.014 | - |
3.5039 | 680000 | 0.0121 | - |
3.5091 | 681000 | 0.016 | - |
3.5143 | 682000 | 0.0101 | - |
3.5194 | 683000 | 0.0152 | - |
3.5246 | 684000 | 0.0192 | - |
3.5297 | 685000 | 0.0094 | - |
3.5349 | 686000 | 0.0126 | - |
3.5400 | 687000 | 0.0101 | - |
3.5452 | 688000 | 0.0084 | - |
3.5503 | 689000 | 0.0167 | - |
3.5555 | 690000 | 0.0205 | - |
3.5606 | 691000 | 0.0153 | - |
3.5658 | 692000 | 0.011 | - |
3.5709 | 693000 | 0.0136 | - |
3.5761 | 694000 | 0.0148 | - |
3.5812 | 695000 | 0.0105 | - |
3.5864 | 696000 | 0.011 | - |
3.5915 | 697000 | 0.0166 | - |
3.5967 | 698000 | 0.0101 | - |
3.6018 | 699000 | 0.028 | - |
3.6070 | 700000 | 0.0154 | - |
3.6122 | 701000 | 0.0102 | - |
3.6173 | 702000 | 0.0163 | - |
3.6225 | 703000 | 0.0104 | - |
3.6276 | 704000 | 0.0144 | - |
3.6328 | 705000 | 0.016 | - |
3.6379 | 706000 | 0.0176 | - |
3.6431 | 707000 | 0.0161 | - |
3.6482 | 708000 | 0.0228 | - |
3.6534 | 709000 | 0.017 | - |
3.6585 | 710000 | 0.018 | - |
3.6637 | 711000 | 0.0155 | - |
3.6688 | 712000 | 0.01 | - |
3.6740 | 713000 | 0.0083 | - |
3.6791 | 714000 | 0.0144 | - |
3.6843 | 715000 | 0.0152 | - |
3.6894 | 716000 | 0.0152 | - |
3.6946 | 717000 | 0.0115 | - |
3.6998 | 718000 | 0.0145 | - |
3.7049 | 719000 | 0.0149 | - |
3.7101 | 720000 | 0.0187 | - |
3.7152 | 721000 | 0.0142 | - |
3.7204 | 722000 | 0.0142 | - |
3.7255 | 723000 | 0.0213 | - |
3.7307 | 724000 | 0.0144 | - |
3.7358 | 725000 | 0.0118 | - |
3.7410 | 726000 | 0.0104 | - |
3.7461 | 727000 | 0.0114 | - |
3.7513 | 728000 | 0.0103 | - |
3.7564 | 729000 | 0.0093 | - |
3.7616 | 730000 | 0.0142 | - |
3.7667 | 731000 | 0.0179 | - |
3.7719 | 732000 | 0.0141 | - |
3.7770 | 733000 | 0.0163 | - |
3.7822 | 734000 | 0.018 | - |
3.7874 | 735000 | 0.0147 | - |
3.7925 | 736000 | 0.0184 | - |
3.7977 | 737000 | 0.0163 | - |
3.8028 | 738000 | 0.0164 | - |
3.8080 | 739000 | 0.0171 | - |
3.8131 | 740000 | 0.0101 | - |
3.8183 | 741000 | 0.0142 | - |
3.8234 | 742000 | 0.0146 | - |
3.8286 | 743000 | 0.011 | - |
3.8337 | 744000 | 0.0088 | - |
3.8389 | 745000 | 0.0062 | - |
3.8440 | 746000 | 0.0085 | - |
3.8492 | 747000 | 0.0077 | - |
3.8543 | 748000 | 0.0061 | - |
3.8595 | 749000 | 0.0086 | - |
3.8646 | 750000 | 0.0081 | - |
3.8698 | 751000 | 0.0076 | - |
3.8750 | 752000 | 0.0096 | - |
3.8801 | 753000 | 0.0093 | - |
3.8853 | 754000 | 0.0091 | - |
3.8904 | 755000 | 0.0094 | - |
3.8956 | 756000 | 0.0096 | - |
3.9007 | 757000 | 0.0096 | - |
3.9059 | 758000 | 0.0091 | - |
3.9110 | 759000 | 0.0103 | - |
3.9162 | 760000 | 0.0098 | - |
3.9213 | 761000 | 0.0079 | - |
3.9265 | 762000 | 0.0064 | - |
3.9316 | 763000 | 0.0075 | - |
3.9368 | 764000 | 0.0083 | - |
3.9419 | 765000 | 0.0093 | - |
3.9471 | 766000 | 0.008 | - |
3.9522 | 767000 | 0.0079 | - |
3.9574 | 768000 | 0.0081 | - |
3.9625 | 769000 | 0.0068 | - |
3.9677 | 770000 | 0.0075 | - |
3.9729 | 771000 | 0.0072 | - |
3.9780 | 772000 | 0.0081 | - |
3.9832 | 773000 | 0.0081 | - |
3.9883 | 774000 | 0.0157 | - |
3.9935 | 775000 | 0.0067 | - |
3.9986 | 776000 | 0.0067 | - |
4.0000 | 776264 | - | 0.0005 |
Framework Versions
- Python: 3.12.2
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.5.0
- Accelerate: 1.0.1
- Datasets: 3.0.2
- Tokenizers: 0.19.1
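To reproduce this training environment, the versions listed above can be pinned at install time. A sketch (the PyTorch pip package is named `torch`; exact CUDA builds may differ from the original setup):

```shell
pip install "sentence-transformers==3.2.1" "transformers==4.44.2" \
    "torch==2.5.0" "accelerate==1.0.1" "datasets==3.0.2" "tokenizers==0.19.1"
```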
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
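The MultipleNegativesRankingLoss cited above scores each anchor against every positive in the same batch and treats the off-diagonal pairs as negatives. A minimal numpy sketch of the objective (an illustration only, not the Sentence Transformers implementation; the scale factor of 20 follows the library's documented default):

```python
import numpy as np

def multiple_negatives_ranking_loss(anchors, positives, scale=20.0):
    """Cross-entropy over in-batch cosine similarities: row i's target is column i."""
    # Normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)  # (batch, batch); diagonal holds the true pairs
    # Numerically stable log-softmax over each row
    m = scores.max(axis=1, keepdims=True)
    log_probs = scores - (m + np.log(np.exp(scores - m).sum(axis=1, keepdims=True)))
    # Loss is the mean negative log-probability of the correct (diagonal) pair
    return float(-np.mean(np.diag(log_probs)))
```

With matching anchor/positive pairs the diagonal dominates and the loss approaches zero; mismatched pairs drive it up, which is what the per-step training-loss column above tracks during optimization.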