SentenceTransformer

This is a sentence-transformers model trained on the parquet dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Maximum Sequence Length: 1024 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • parquet
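
A quick way to confirm these properties after loading the model (a minimal sketch; the Hub id is the one used in the Usage section below):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-TCE-Epoch-3")
print(model.max_seq_length)                       # 1024
print(model.get_sentence_embedding_dimension())   # 384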

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
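
The Pooling block applies mean pooling over token embeddings. For illustration only, the sketch below reproduces that pooling with plain transformers; it assumes the Hub repository exposes the underlying BertModel weights and tokenizer in the standard layout, and it is not the canonical implementation:

import torch
from transformers import AutoModel, AutoTokenizer

model_id = "pankajrajdeo/UMLS-Pubmed-TCE-Epoch-3"  # same id as in the Usage section
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = AutoModel.from_pretrained(model_id)      # loads the BertModel component

encoded = tokenizer(["example sentence"], padding=True, truncation=True,
                    max_length=1024, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**encoded).last_hidden_state   # (batch, seq_len, 384)

# Mean pooling: average the token embeddings, ignoring padding positions
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(sentence_embedding.shape)                                # torch.Size([1, 384])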

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-TCE-Epoch-3")
# Run inference
sentences = [
    '[YEAR_RANGE] 2020-2024 [TEXT] Intraoperative Monitoring of the External Urethral Sphincter Reflex: A Novel Adjunct to Bulbocavernosus Reflex Neuromonitoring for Protecting the Sacral Neural Pathways Responsible for Urination, Defecation and Sexual Function.',
    '[YEAR_RANGE] 2020-2024 [TEXT] PURPOSE: Intraoperative bulbocavernosus reflex neuromonitoring has been utilized to protect bowel, bladder, and sexual function, providing a continuous functional assessment of the somatic sacral nervous system during surgeries where it is at risk. Bulbocavernosus reflex data may also provide additional functional insight, including an evaluation for spinal shock, distinguishing upper versus lower motor neuron injury (conus versus cauda syndromes) and prognosis for postoperative bowel and bladder function. Continuous intraoperative bulbocavernosus reflex monitoring has been utilized to provide the surgeon with an ongoing functional assessment of the anatomical elements involved in the S2-S4 mediated reflex arc including the conus, cauda equina and pudendal nerves. Intraoperative bulbocavernosus reflex monitoring typically includes the electrical activation of the dorsal nerves of the genitals to initiate the afferent component of the reflex, followed by recording the resulting muscle response using needle electromyography recordings from the external anal sphincter. METHODS: Herein we describe a complementary and novel technique that includes recording electromyography responses from the external urethral sphincter to monitor the external urethral sphincter reflex. Specialized foley catheters embedded with recording electrodes have recently become commercially available that provide the ability to perform intraoperative external urethral sphincter muscle recordings. RESULTS: We describe technical details and the potential utility of incorporating external urethral sphincter reflex recordings into existing sacral neuromonitoring paradigms to provide redundant yet complementary data streams. CONCLUSIONS: We present two illustrative neurosurgical oncology cases to demonstrate the utility of the external urethral sphincter reflex technique in the setting of the necessary surgical sacrifice of sacral nerve roots.',
    '[YEAR_RANGE] 2020-2024 [TEXT] Early menarche has been associated with adverse health outcomes, such as depressive symptoms. Discovering effect modifiers across these conditions in the pediatric population is a constant challenge. We tested whether movement behaviours modified the effect of the association between early menarche and depression symptoms among adolescents. This cross-sectional study included 2031 females aged 15-19 years across all Brazilian geographic regions. Data were collected using a self-administered questionnaire; 30.5% (n = 620) reported having experienced menarche before age 12 years (that is, early menarche). We used the Patient Health Questionnaire (PHQ-9) to evaluate depressive symptoms. Accruing any moderate-vigorous physical activity during leisure time, limited recreational screen time, and having good sleep quality were the exposures investigated. Adolescents who experienced early menarche and met one (B: -4.45, 95% CI: (-5.38, -3.51)), two (B: -6.07 (-7.02, -5.12)), or three (B: -6.49 (-7.76, -5.21)), and adolescents who experienced not early menarche and met one (B: -5.33 (-6.20; -4.46)), two (B: -6.12 (-6.99; -5.24)), or three (B: -6.27 (-7.30; -5.24)) of the movement behaviour targets had lower PHQ-9 scores for depression symptoms than adolescents who experienced early menarche and did not meet any of the movement behaviours. The disparities in depressive symptoms among the adolescents (early menarche versus not early menarche) who adhered to all three target behaviours were not statistically significant (B: 0.41 (-0.19; 1.01)). Adherence to movement behaviours modified the effect of the association between early menarche and depression symptoms.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
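
Both the example sentences above and the training pairs below carry a "[YEAR_RANGE] ... [TEXT] ..." prefix, so queries and documents are most naturally formatted the same way. A small semantic-search sketch building on the loaded model (the query and corpus strings are illustrative, not taken from the dataset):

# Rank a tiny corpus against a query with the same encode + cosine-similarity workflow
query = "[YEAR_RANGE] 2020-2024 [TEXT] Intraoperative neuromonitoring of sacral reflexes."
corpus = [
    "[YEAR_RANGE] 2020-2024 [TEXT] Bulbocavernosus reflex monitoring protects bowel and bladder function during spine surgery.",
    "[YEAR_RANGE] 2020-2024 [TEXT] Movement behaviours modify the association between early menarche and depressive symptoms.",
]

query_embedding = model.encode([query])
corpus_embeddings = model.encode(corpus)
scores = model.similarity(query_embedding, corpus_embeddings)   # shape [1, 2]
best = int(scores.argmax())
print(corpus[best], float(scores[0, best]))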

Training Details

Training Dataset

parquet

  • Dataset: parquet
  • Size: 26,147,930 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor (string): min 28 tokens, mean 57.85 tokens, max 149 tokens
    • positive (string): min 43 tokens, mean 319.08 tokens, max 1024 tokens
  • Samples:
    • anchor: [YEAR_RANGE] 1880-1884 [TEXT] ADDRESS OF COL. GARRICK MALLERY, U. S. ARMY.
      positive: [YEAR_RANGE] 1880-1884 [TEXT] It may be conceded that after man had all his present faculties, he did not choose between the adoption of voice and gesture, and never with those faculties, was in a state where the one was used, to the absolute exclusion of the other. The epoch, however, to which our speculations relate is that in which he had not reached the present symmetric development of his intellect and of his bodily organs, and the inquiry is: Which mode of communication was earliest adopted to his single wants and informed intelligence? With the voice he could imitate distinictively but few sounds of nature, while with gesture he could exhibit actions, motions, positions, forms, dimensions, directions and distances, with their derivations and analogues. It would seem from this unequal division of capacity that oral speech remained rudimentary long after gesture had become an efficient mode of communication. With due allowance for all purely imitative sounds, and for the spontaneous action of vocal organs under excitement, it appears that the connection between ideas and words is only to be explained by a compact between speaker and hearer which supposes the existence of a prior mode of communication. This was probably by gesture. At least we may accept it as a clew leading out of the labyrinth of philological confusion, and regulating the immemorial quest of man's primitive speech.
    • anchor: [YEAR_RANGE] 1880-1884 [TEXT] How TO OBTAIN THE BRAIN OF THE CAT.
      positive: [YEAR_RANGE] 1880-1884 [TEXT] How to obtain the Brain of the Cat, (Wilder).-Correction: Page 158, second column, line 7, "grains," should be "grams;" page 159, near middle of 2nd column, "successily," should be "successively;" page 161, the number of Flower's paper is 3.
    • anchor: [YEAR_RANGE] 1880-1884 [TEXT] DOLBEAR ON THE NATURE AND CONSTITUTION OF MATTER.
      positive: [YEAR_RANGE] 1880-1884 [TEXT] Mr. Dopp desires to make the following correction in his paper in the last issue: "In my article on page 200 of "Science", the expression and should have been and being the velocity of light.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
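
MultipleNegativesRankingLoss treats the other positives in a batch as negatives for each anchor and scores pairs with cosine similarity scaled by 20 before the softmax cross-entropy. A minimal sketch of instantiating the loss with these parameters in sentence-transformers:

from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("pankajrajdeo/UMLS-Pubmed-TCE-Epoch-3")
# scale=20.0 multiplies the cosine similarities; util.cos_sim matches "similarity_fct" above
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)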
    

Evaluation Dataset

parquet

  • Dataset: parquet
  • Size: 26,147,930 evaluation samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor (string): min 27 tokens, mean 43.78 tokens, max 90 tokens
    • positive (string): min 28 tokens, mean 315.03 tokens, max 847 tokens
  • Samples:
    • anchor: [YEAR_RANGE] 2020-2024 [TEXT] Solubility and thermodynamics of mesalazine in aqueous mixtures of poly ethylene glycol 200/600 at 293.2-313.2K.
      positive: [YEAR_RANGE] 2020-2024 [TEXT] In this study, the solubility of mesalazine was investigated in binary solvent mixtures of poly ethylene glycols 200/600 and water at temperatures ranging from 293.2K to 313.2K. The solubility of mesalazine was determined using a shake-flask method, and its concentrations were measured using a UV-Vis spectrophotometer. The obtained solubility data were analyzed using mathematical models including the van't Hoff, Jouyban-Acree, Jouyban-Acree-van't Hoff, mixture response surface, and modified Wilson models. The experimental data obtained for mesalazine dissolution encompassed various thermodynamic properties, including ΔG°, ΔH°, ΔS°, and TΔS°. These properties offer valuable insights into the energetic aspects of the dissolution process and were calculated based on the van't Hoff equation.
    • anchor: [YEAR_RANGE] 2020-2024 [TEXT] Safety and efficacy of remimazolam versus propofol during EUS: a multicenter randomized controlled study.
      positive: [YEAR_RANGE] 2020-2024 [TEXT] BACKGROUND AND AIMS: Propofol, a widely used sedative in GI endoscopic procedures, is associated with cardiorespiratory suppression. Remimazolam is a novel ultrashort-acting benzodiazepine sedative with rapid onset and minimal cardiorespiratory depression. This study compared the safety and efficacy of remimazolam and propofol during EUS procedures. METHODS: A multicenter randomized controlled study was conducted between October 2022 and March 2023 in patients who underwent EUS procedures. Patients were randomly assigned to receive either remimazolam or propofol as a sedative agent. The primary endpoint was cardiorespiratory adverse events.
    • anchor: [YEAR_RANGE] 2020-2024 [TEXT] Ultrasound-Guided Vs Non-Guided Prolotherapy for Internal Derangement of Temporomandibular Joint. A Randomized Clinical Trial.
      positive: [YEAR_RANGE] 2020-2024 [TEXT] OBJECTIVES: This randomized clinical trial study aims to compare ultrasound-guided versus non-guided Dextrose 10% injections in patients suffering from internal derangement in the temporomandibular joint (TMJ). MATERIAL AND METHODS: The study population included 22 patients and 43 TMJs suffering from unilateral or bilateral TMJ painful clicking, magnetic resonance imaging (MRI) proved disc displacement with reduction (DDWR), refractory to or failed conservative treatment. The patients were divided randomly into two groups (non-guided and ultrasound (US)-guided groups). The procedure involved injection of 2 mL solution of a mixture of 0.75 mL 0.9% normal saline solution, 0.3 mL 2% lidocaine and 0.75 mL dextrose 10% using a 25G needle in the joint and 1 mL intramuscular injection to the masseter muscle at the most tender point. The Visual Analogue Score (VAS) was used to compare joint pain intensity over four different periods, beginning with pre-injection, 1-, 2-, and 6-months postinjection. RESULTS: Twenty-two patients 5 males (n = 5/22, 22.7%) and 17 females (n = 17/22, 77.2%) were included in this study. The mean age was 27.3 ± 7.4 years (30.2 ± 7.0) for the non-guided group and 24.3 ± 6.9 for the US-guided group. The dextrose injection reduced intensity over time in both groups with statistically significant improvement (P value <.05) at 2 and 6 months in both groups. There was no statistically significant difference in VAS assessment between both groups. CONCLUSION: Intra-articular injection of dextrose 10% for patients with painful clicking and DDWR resulted in reduced pain intensity in both US-guided and non-guided groups with significant symptomatic improvement over time in both groups. US guidance allowed accurate anatomical localization and safe procedure with a single joint puncture.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 5
  • max_steps: 970330
  • log_level: info
  • fp16: True
  • dataloader_num_workers: 16
  • load_best_model_at_end: True
  • resume_from_checkpoint: True
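
Wiring the non-default values above into a fine-tuning run might look roughly like the sketch below; the base checkpoint, parquet file paths, and output directory are placeholders, since the original training script is not included in this card:

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

model = SentenceTransformer("base-model-id")  # placeholder: the base checkpoint is not stated here
# Placeholder parquet files with "anchor" and "positive" string columns
train_dataset = load_dataset("parquet", data_files="train.parquet", split="train")
eval_dataset = load_dataset("parquet", data_files="eval.parquet", split="train")

args = SentenceTransformerTrainingArguments(
    output_dir="output",                  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=5,
    fp16=True,
    dataloader_num_workers=16,
    load_best_model_at_end=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=losses.MultipleNegativesRankingLoss(model, scale=20.0),
)
trainer.train()  # max_steps and resume_from_checkpoint are also set in the full configuration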

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: 970330
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: info
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 16
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: True
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss
0.0000 1 5.9163 -
0.0052 1000 0.6371 -
0.0103 2000 0.1741 -
0.0155 3000 0.1606 -
0.0206 4000 0.1496 -
0.0258 5000 0.111 -
0.0309 6000 0.1198 -
0.0361 7000 0.1047 -
0.0412 8000 0.1002 -
0.0464 9000 0.0991 -
0.0515 10000 0.1142 -
0.0567 11000 0.1027 -
0.0618 12000 0.0752 -
0.0670 13000 0.1036 -
0.0721 14000 0.1114 -
0.0773 15000 0.0701 -
0.0824 16000 0.1623 -
0.0876 17000 0.0727 -
0.0928 18000 0.1121 -
0.0979 19000 0.0684 -
0.1031 20000 0.1327 -
0.1082 21000 0.0755 -
0.1134 22000 0.1293 -
0.1185 23000 0.0661 -
0.1237 24000 0.0797 -
0.1288 25000 0.063 -
0.1340 26000 0.1324 -
0.1391 27000 0.0708 -
0.1443 28000 0.1386 -
0.1494 29000 0.0691 -
0.1546 30000 0.143 -
0.1597 31000 0.0644 -
0.1649 32000 0.1065 -
0.1700 33000 0.1089 -
0.1752 34000 0.0737 -
0.1804 35000 0.1431 -
0.1855 36000 0.069 -
0.1907 37000 0.0756 -
0.1958 38000 0.1252 -
0.2010 39000 0.0768 -
0.2061 40000 0.1255 -
0.2113 41000 0.0849 -
0.2164 42000 0.072 -
0.2216 43000 0.1171 -
0.2267 44000 0.072 -
0.2319 45000 0.0818 -
0.2370 46000 0.0988 -
0.2422 47000 0.066 -
0.2473 48000 0.0829 -
0.2525 49000 0.0907 -
0.2576 50000 0.075 -
0.2628 51000 0.0653 -
0.2679 52000 0.0667 -
0.2731 53000 0.0906 -
0.2783 54000 0.0803 -
0.2834 55000 0.0568 -
0.2886 56000 0.0665 -
0.2937 57000 0.0629 -
0.2989 58000 0.0665 -
0.3040 59000 0.0601 -
0.3092 60000 0.0761 -
0.3143 61000 0.0472 -
0.3195 62000 0.0527 -
0.3246 63000 0.0585 -
0.3298 64000 0.0699 -
0.3349 65000 0.0687 -
0.3401 66000 0.0541 -
0.3452 67000 0.0413 -
0.3504 68000 0.055 -
0.3555 69000 0.0706 -
0.3607 70000 0.0805 -
0.3659 71000 0.0884 -
0.3710 72000 0.0359 -
0.3762 73000 0.0443 -
0.3813 74000 0.0543 -
0.3865 75000 0.045 -
0.3916 76000 0.1031 -
0.3968 77000 0.0535 -
0.4019 78000 0.0661 -
0.4071 79000 0.0353 -
0.4122 80000 0.06 -
0.4174 81000 0.0743 -
0.4225 82000 0.0439 -
0.4277 83000 0.0484 -
0.4328 84000 0.0643 -
0.4380 85000 0.0308 -
0.4431 86000 0.0412 -
0.4483 87000 0.0464 -
0.4535 88000 0.0713 -
0.4586 89000 0.0575 -
0.4638 90000 0.0461 -
0.4689 91000 0.0358 -
0.4741 92000 0.0413 -
0.4792 93000 0.0481 -
0.4844 94000 0.0473 -
0.4895 95000 0.082 -
0.4947 96000 0.0313 -
0.4998 97000 0.0365 -
0.5050 98000 0.0338 -
0.5101 99000 0.0364 -
0.5153 100000 0.0288 -
0.5204 101000 0.0421 -
0.5256 102000 0.0444 -
0.5307 103000 0.0242 -
0.5359 104000 0.0318 -
0.5411 105000 0.0285 -
0.5462 106000 0.0231 -
0.5514 107000 0.0534 -
0.5565 108000 0.0469 -
0.5617 109000 0.031 -
0.5668 110000 0.0376 -
0.5720 111000 0.0403 -
0.5771 112000 0.0408 -
0.5823 113000 0.0284 -
0.5874 114000 0.0344 -
0.5926 115000 0.0469 -
0.5977 116000 0.0304 -
0.6029 117000 0.0676 -
0.6080 118000 0.0396 -
0.6132 119000 0.0337 -
0.6183 120000 0.039 -
0.6235 121000 0.0286 -
0.6286 122000 0.0404 -
0.6338 123000 0.0383 -
0.6390 124000 0.046 -
0.6441 125000 0.0403 -
0.6493 126000 0.0502 -
0.6544 127000 0.0424 -
0.6596 128000 0.0424 -
0.6647 129000 0.0338 -
0.6699 130000 0.0262 -
0.6750 131000 0.0203 -
0.6802 132000 0.0405 -
0.6853 133000 0.0374 -
0.6905 134000 0.0329 -
0.6956 135000 0.0287 -
0.7008 136000 0.0366 -
0.7059 137000 0.0344 -
0.7111 138000 0.0402 -
0.7162 139000 0.0331 -
0.7214 140000 0.0404 -
0.7266 141000 0.0433 -
0.7317 142000 0.032 -
0.7369 143000 0.0281 -
0.7420 144000 0.0265 -
0.7472 145000 0.0282 -
0.7523 146000 0.0233 -
0.7575 147000 0.0291 -
0.7626 148000 0.0358 -
0.7678 149000 0.0343 -
0.7729 150000 0.0292 -
0.7781 151000 0.0359 -
0.7832 152000 0.0361 -
0.7884 153000 0.0289 -
0.7935 154000 0.0374 -
0.7987 155000 0.0341 -
0.8038 156000 0.0353 -
0.8090 157000 0.033 -
0.8142 158000 0.0291 -
0.8193 159000 0.0362 -
0.8245 160000 0.0355 -
0.8296 161000 0.026 -
0.8348 162000 0.0237 -
0.8399 163000 0.0175 -
0.8451 164000 0.0219 -
0.8502 165000 0.0227 -
0.8554 166000 0.0177 -
0.8605 167000 0.0239 -
0.8657 168000 0.0223 -
0.8708 169000 0.0219 -
0.8760 170000 0.0248 -
0.8811 171000 0.0237 -
0.8863 172000 0.0262 -
0.8914 173000 0.026 -
0.8966 174000 0.0228 -
0.9018 175000 0.0259 -
0.9069 176000 0.0232 -
0.9121 177000 0.0268 -
0.9172 178000 0.0228 -
0.9224 179000 0.0198 -
0.9275 180000 0.0183 -
0.9327 181000 0.022 -
0.9378 182000 0.0227 -
0.9430 183000 0.021 -
0.9481 184000 0.0211 -
0.9533 185000 0.0216 -
0.9584 186000 0.0209 -
0.9636 187000 0.0191 -
0.9687 188000 0.0188 -
0.9739 189000 0.0203 -
0.9790 190000 0.0203 -
0.9842 191000 0.0313 -
0.9893 192000 0.0213 -
0.9945 193000 0.0164 -
0.9997 194000 0.0181 -
1.0000 194066 - 0.0006
1.0048 195000 0.2333 -
1.0100 196000 0.0787 -
1.0151 197000 0.0848 -
1.0203 198000 0.0813 -
1.0254 199000 0.0624 -
1.0306 200000 0.0691 -
1.0357 201000 0.0596 -
1.0409 202000 0.0575 -
1.0460 203000 0.0573 -
1.0512 204000 0.0684 -
1.0563 205000 0.0507 -
1.0615 206000 0.0548 -
1.0666 207000 0.0607 -
1.0718 208000 0.0713 -
1.0769 209000 0.0425 -
1.0821 210000 0.113 -
1.0873 211000 0.0432 -
1.0924 212000 0.0727 -
1.0976 213000 0.0431 -
1.1027 214000 0.0909 -
1.1079 215000 0.0453 -
1.1130 216000 0.087 -
1.1182 217000 0.0442 -
1.1233 218000 0.0503 -
1.1285 219000 0.0413 -
1.1336 220000 0.0899 -
1.1388 221000 0.0463 -
1.1439 222000 0.0956 -
1.1491 223000 0.0452 -
1.1542 224000 0.098 -
1.1594 225000 0.0426 -
1.1645 226000 0.0597 -
1.1697 227000 0.085 -
1.1749 228000 0.0487 -
1.1800 229000 0.0984 -
1.1852 230000 0.0465 -
1.1903 231000 0.0492 -
1.1955 232000 0.0864 -
1.2006 233000 0.0489 -
1.2058 234000 0.0855 -
1.2109 235000 0.0579 -
1.2161 236000 0.0455 -
1.2212 237000 0.0811 -
1.2264 238000 0.0488 -
1.2315 239000 0.0547 -
1.2367 240000 0.0691 -
1.2418 241000 0.0426 -
1.2470 242000 0.0528 -
1.2521 243000 0.0552 -
1.2573 244000 0.0607 -
1.2625 245000 0.0421 -
1.2676 246000 0.0434 -
1.2728 247000 0.0632 -
1.2779 248000 0.0546 -
1.2831 249000 0.0375 -
1.2882 250000 0.038 -
1.2934 251000 0.0471 -
1.2985 252000 0.0441 -
1.3037 253000 0.0383 -
1.3088 254000 0.0521 -
1.3140 255000 0.033 -
1.3191 256000 0.0339 -
1.3243 257000 0.0363 -
1.3294 258000 0.0429 -
1.3346 259000 0.0523 -
1.3397 260000 0.0353 -
1.3449 261000 0.0271 -
1.3500 262000 0.0364 -
1.3552 263000 0.0477 -
1.3604 264000 0.0532 -
1.3655 265000 0.0595 -
1.3707 266000 0.0237 -
1.3758 267000 0.0239 -
1.3810 268000 0.0389 -
1.3861 269000 0.0288 -
1.3913 270000 0.0728 -
1.3964 271000 0.0365 -
1.4016 272000 0.038 -
1.4067 273000 0.0285 -
1.4119 274000 0.0246 -
1.4170 275000 0.0692 -
1.4222 276000 0.0281 -
1.4273 277000 0.0322 -
1.4325 278000 0.0451 -
1.4376 279000 0.0202 -
1.4428 280000 0.0274 -
1.4480 281000 0.0254 -
1.4531 282000 0.0539 -
1.4583 283000 0.0407 -
1.4634 284000 0.0286 -
1.4686 285000 0.0241 -
1.4737 286000 0.029 -
1.4789 287000 0.0337 -
1.4840 288000 0.0317 -
1.4892 289000 0.0406 -
1.4943 290000 0.0377 -
1.4995 291000 0.0238 -
1.5046 292000 0.0226 -
1.5098 293000 0.0233 -
1.5149 294000 0.0186 -
1.5201 295000 0.0255 -
1.5252 296000 0.0305 -
1.5304 297000 0.0154 -
1.5356 298000 0.0214 -
1.5407 299000 0.0187 -
1.5459 300000 0.0143 -
1.5510 301000 0.03 -
1.5562 302000 0.0346 -
1.5613 303000 0.0209 -
1.5665 304000 0.0206 -
1.5716 305000 0.0239 -
1.5768 306000 0.0262 -
1.5819 307000 0.0179 -
1.5871 308000 0.0198 -
1.5922 309000 0.0288 -
1.5974 310000 0.0192 -
1.6025 311000 0.0435 -
1.6077 312000 0.0251 -
1.6128 313000 0.0205 -
1.6180 314000 0.0246 -
1.6232 315000 0.0176 -
1.6283 316000 0.026 -
1.6335 317000 0.025 -
1.6386 318000 0.029 -
1.6438 319000 0.0274 -
1.6489 320000 0.0343 -
1.6541 321000 0.028 -
1.6592 322000 0.0282 -
1.6644 323000 0.0239 -
1.6695 324000 0.017 -
1.6747 325000 0.0132 -
1.6798 326000 0.0252 -
1.6850 327000 0.0243 -
1.6901 328000 0.0232 -
1.6953 329000 0.0183 -
1.7004 330000 0.0244 -
1.7056 331000 0.0239 -
1.7107 332000 0.0277 -
1.7159 333000 0.0223 -
1.7211 334000 0.0252 -
1.7262 335000 0.0302 -
1.7314 336000 0.0224 -
1.7365 337000 0.0188 -
1.7417 338000 0.0174 -
1.7468 339000 0.0189 -
1.7520 340000 0.0152 -
1.7571 341000 0.0185 -
1.7623 342000 0.024 -
1.7674 343000 0.0249 -
1.7726 344000 0.0202 -
1.7777 345000 0.0248 -
1.7829 346000 0.0256 -
1.7880 347000 0.022 -
1.7932 348000 0.0271 -
1.7983 349000 0.024 -
1.8035 350000 0.0241 -
1.8087 351000 0.0243 -
1.8138 352000 0.018 -
1.8190 353000 0.0236 -
1.8241 354000 0.0237 -
1.8293 355000 0.0175 -
1.8344 356000 0.015 -
1.8396 357000 0.0111 -
1.8447 358000 0.014 -
1.8499 359000 0.0146 -
1.8550 360000 0.0108 -
1.8602 361000 0.0157 -
1.8653 362000 0.0142 -
1.8705 363000 0.0129 -
1.8756 364000 0.0168 -
1.8808 365000 0.0155 -
1.8859 366000 0.017 -
1.8911 367000 0.0164 -
1.8963 368000 0.0156 -
1.9014 369000 0.0168 -
1.9066 370000 0.015 -
1.9117 371000 0.018 -
1.9169 372000 0.0151 -
1.9220 373000 0.0132 -
1.9272 374000 0.0117 -
1.9323 375000 0.0142 -
1.9375 376000 0.0151 -
1.9426 377000 0.0142 -
1.9478 378000 0.0139 -
1.9529 379000 0.014 -
1.9581 380000 0.0141 -
1.9632 381000 0.012 -
1.9684 382000 0.0127 -
1.9735 383000 0.0131 -
1.9787 384000 0.0131 -
1.9839 385000 0.0158 -
1.9890 386000 0.0211 -
1.9942 387000 0.011 -
1.9993 388000 0.0118 -
2.0000 388132 - 0.0005
2.0045 389000 0.1757 -
2.0096 390000 0.0632 -
2.0148 391000 0.0611 -
2.0199 392000 0.074 -
2.0251 393000 0.0492 -
2.0302 394000 0.0558 -
2.0354 395000 0.0474 -
2.0405 396000 0.0451 -
2.0457 397000 0.0451 -
2.0508 398000 0.0569 -
2.0560 399000 0.0395 -
2.0611 400000 0.0433 -
2.0663 401000 0.0475 -
2.0714 402000 0.0511 -
2.0766 403000 0.0403 -
2.0818 404000 0.0957 -
2.0869 405000 0.0319 -
2.0921 406000 0.0601 -
2.0972 407000 0.0333 -
2.1024 408000 0.0763 -
2.1075 409000 0.0339 -
2.1127 410000 0.072 -
2.1178 411000 0.0343 -
2.1230 412000 0.0398 -
2.1281 413000 0.0312 -
2.1333 414000 0.0756 -
2.1384 415000 0.0367 -
2.1436 416000 0.0797 -
2.1487 417000 0.0359 -
2.1539 418000 0.0812 -
2.1590 419000 0.0343 -
2.1642 420000 0.0431 -
2.1694 421000 0.0754 -
2.1745 422000 0.0384 -
2.1797 423000 0.0673 -
2.1848 424000 0.0514 -
2.1900 425000 0.0406 -
2.1951 426000 0.0703 -
2.2003 427000 0.0404 -
2.2054 428000 0.0699 -
2.2106 429000 0.0456 -
2.2157 430000 0.0375 -
2.2209 431000 0.0657 -
2.2260 432000 0.0386 -
2.2312 433000 0.044 -
2.2363 434000 0.0572 -
2.2415 435000 0.0334 -
2.2466 436000 0.0403 -
2.2518 437000 0.0446 -
2.2570 438000 0.0496 -
2.2621 439000 0.0326 -
2.2673 440000 0.0345 -
2.2724 441000 0.0489 -
2.2776 442000 0.0437 -
2.2827 443000 0.0297 -
2.2879 444000 0.0303 -
2.2930 445000 0.0377 -
2.2982 446000 0.0334 -
2.3033 447000 0.0297 -
2.3085 448000 0.0402 -
2.3136 449000 0.027 -
2.3188 450000 0.0266 -
2.3239 451000 0.0275 -
2.3291 452000 0.0327 -
2.3342 453000 0.0446 -
2.3394 454000 0.0261 -
2.3446 455000 0.0202 -
2.3497 456000 0.0286 -
2.3549 457000 0.0369 -
2.3600 458000 0.0416 -
2.3652 459000 0.0478 -
2.3703 460000 0.0177 -
2.3755 461000 0.0178 -
2.3806 462000 0.0294 -
2.3858 463000 0.0229 -
2.3909 464000 0.0602 -
2.3961 465000 0.0274 -
2.4012 466000 0.0223 -
2.4064 467000 0.0296 -
2.4115 468000 0.0182 -
2.4167 469000 0.0567 -
2.4218 470000 0.0199 -
2.4270 471000 0.0246 -
2.4321 472000 0.0382 -
2.4373 473000 0.0151 -
2.4425 474000 0.021 -
2.4476 475000 0.013 -
2.4528 476000 0.0472 -
2.4579 477000 0.034 -
2.4631 478000 0.0219 -
2.4682 479000 0.0186 -
2.4734 480000 0.0221 -
2.4785 481000 0.0241 -
2.4837 482000 0.0244 -
2.4888 483000 0.0288 -
2.4940 484000 0.0364 -
2.4991 485000 0.0178 -
2.5043 486000 0.0148 -
2.5094 487000 0.0198 -
2.5146 488000 0.0136 -
2.5197 489000 0.0191 -
2.5249 490000 0.0235 -
2.5301 491000 0.0115 -
2.5352 492000 0.0161 -
2.5404 493000 0.0139 -
2.5455 494000 0.01 -
2.5507 495000 0.022 -
2.5558 496000 0.0257 -
2.5610 497000 0.0177 -
2.5661 498000 0.0144 -
2.5713 499000 0.0174 -
2.5764 500000 0.0192 -
2.5816 501000 0.0129 -
2.5867 502000 0.0147 -
2.5919 503000 0.0212 -
2.5970 504000 0.0128 -
2.6022 505000 0.0349 -
2.6073 506000 0.0194 -
2.6125 507000 0.013 -
2.6177 508000 0.0196 -
2.6228 509000 0.0131 -
2.6280 510000 0.0191 -
2.6331 511000 0.019 -
2.6383 512000 0.0224 -
2.6434 513000 0.0197 -
2.6486 514000 0.0279 -
2.6537 515000 0.0213 -
2.6589 516000 0.0216 -
2.6640 517000 0.0182 -
2.6692 518000 0.0126 -
2.6743 519000 0.0102 -
2.6795 520000 0.0185 -
2.6846 521000 0.0181 -
2.6898 522000 0.0187 -
2.6949 523000 0.0135 -
2.7001 524000 0.0192 -
2.7053 525000 0.0179 -
2.7104 526000 0.0222 -
2.7156 527000 0.0179 -
2.7207 528000 0.0174 -
2.7259 529000 0.0254 -
2.7310 530000 0.0176 -
2.7362 531000 0.0146 -
2.7413 532000 0.0126 -
2.7465 533000 0.0143 -
2.7516 534000 0.012 -
2.7568 535000 0.0131 -
2.7619 536000 0.018 -
2.7671 537000 0.0205 -
2.7722 538000 0.0162 -
2.7774 539000 0.0201 -
2.7825 540000 0.0209 -
2.7877 541000 0.0168 -
2.7928 542000 0.0222 -
2.7980 543000 0.0193 -
2.8032 544000 0.0194 -
2.8083 545000 0.0196 -
2.8135 546000 0.0126 -
2.8186 547000 0.0177 -
2.8238 548000 0.018 -
2.8289 549000 0.0135 -
2.8341 550000 0.0107 -
2.8392 551000 0.0081 -
2.8444 552000 0.0107 -
2.8495 553000 0.0101 -
2.8547 554000 0.0081 -
2.8598 555000 0.0111 -
2.8650 556000 0.0105 -
2.8701 557000 0.0096 -
2.8753 558000 0.0121 -
2.8804 559000 0.0117 -
2.8856 560000 0.0119 -
2.8908 561000 0.0121 -
2.8959 562000 0.0118 -
2.9011 563000 0.0121 -
2.9062 564000 0.0115 -
2.9114 565000 0.0131 -
2.9165 566000 0.0117 -
2.9217 567000 0.0097 -
2.9268 568000 0.0084 -
2.9320 569000 0.0099 -
2.9371 570000 0.0109 -
2.9423 571000 0.0111 -
2.9474 572000 0.01 -
2.9526 573000 0.0099 -
2.9577 574000 0.0107 -
2.9629 575000 0.0085 -
2.9680 576000 0.0097 -
2.9732 577000 0.0092 -
2.9784 578000 0.0101 -
2.9835 579000 0.0111 -
2.9887 580000 0.0176 -
2.9938 581000 0.0081 -
2.9990 582000 0.0087 -
3.0000 582198 - 0.0005

Framework Versions

  • Python: 3.12.2
  • Sentence Transformers: 3.2.1
  • Transformers: 4.44.2
  • PyTorch: 2.5.0
  • Accelerate: 1.0.1
  • Datasets: 3.0.2
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}