---
language:
  - en
license: apache-2.0
library_name: sentence-transformers
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:4247
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
datasets: []
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
widget:
  - source_sentence: >-
      Perry syndrome is a familial parkinsonism associated with central
      hypoventilation, mental depression, and weight loss.
    sentences:
      - List features of the Perry syndrome.
      - Which is the main abnormality that arises with Sox9 locus duplication?
      - Was modafinil tested for schizophrenia treatment?
  - source_sentence: >-
      Yes. HDAC1 is required for GATA-1 transcription activity, global chromatin
      occupancy and hematopoiesis.
    sentences:
      - Is HDAC1 required for GATA-1 transcriptional activity?
      - Which cells are affected in radiation-induced leukemias?
      - Is phospholamban phosphorylated by Protein kinase A?
  - source_sentence: >-
      Long noncoding RNAs (lncRNAs) constitute the majority of transcripts in
      the mammalian genomes, and yet, their functions remain largely unknown. As
      part of the FANTOM6 project, the expression of 285 lncRNAs was
      systematically knocked down in human dermal fibroblasts. Cellular growth,
      morphological changes, and transcriptomic responses were quantified using
      Capped Analysis of Gene Expression (CAGE).The functional annotation of the
      mammalian genome 6 (FANTOM6) project aims to systematically map all human
      long noncoding RNAs (lncRNAs) in a gene-dependent manner through dedicated
      efforts from national and international teams
    sentences:
      - What delivery system is used for the Fluzone Intradermal vaccine?
      - What is dovitinib?
      - >-
        Which class of genomic elements was assessed as part of the FANTOM6
        project?
  - source_sentence: ' The proband had normal molecular analysis of the glypican 6 gene (GPC6), which was recently reported as a candidate for autosomal recessive omodysplasiaThe proband had normal molecular analysis of the glypican 6 gene (GPC6), which was recently reported as a candidate for autosomal recessive omodysplasiaThe glypican 6 gene (GPC6), which was recently reported as a candidate for autosomal recessive omodysplasia.Omodysplasia is a rare autosomal recessive disorder with a frequency of 1 in 50,000 newborn, and is associated with mutations in the GPC6 gene on chromosome 13.'
    sentences:
      - >-
        What is the effect of ivabradine in heart failure with preserved
        ejection fraction?
      - >-
        What rare disease is associated with a mutation in the GPC6 gene on
        chromosome 13?
      - What is the effect of rHDL-apoE3 on endothelial cell migration?
  - source_sentence: >-
      Yes, numerous whole exome sequencing studies of ALzheimer patients have
      been conducted.
    sentences:
      - >-
        Is muscle regeneration possible in mdx mice with the use of induced
        mesenchymal stem cells?
      - Has whole exome sequencing been performed in Alzheimer patients?
      - >-
        How is connected "isolated Non-compaction cardiomyopathy" with dilated
        cardiomyopathy?
pipeline_tag: sentence-similarity
model-index:
  - name: BGE base BioASQ Matryoshka
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.8516949152542372
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.940677966101695
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9576271186440678
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.961864406779661
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.8516949152542372
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.31355932203389825
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.19152542372881357
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09618644067796611
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.8516949152542372
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.940677966101695
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9576271186440678
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.961864406779661
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9149563623470877
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8990348399246703
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8999167242053622
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.8516949152542372
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9449152542372882
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9555084745762712
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9597457627118644
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.8516949152542372
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.3149717514124293
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.19110169491525428
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09597457627118645
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.8516949152542372
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9449152542372882
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9555084745762712
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9597457627118644
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9136223756024043
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8979166666666664
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8990624087448101
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.8389830508474576
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.934322033898305
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9470338983050848
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9597457627118644
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.8389830508474576
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.3114406779661017
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.189406779661017
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09597457627118645
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.8389830508474576
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.934322033898305
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9470338983050848
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9597457627118644
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9053426368336166
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8872721616895344
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8879933659912613
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.8241525423728814
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9110169491525424
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9322033898305084
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9470338983050848
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.8241525423728814
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.30367231638418074
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1864406779661017
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09470338983050848
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.8241525423728814
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9110169491525424
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9322033898305084
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9470338983050848
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8905411432220106
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8719422585418346
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8732028981082185
            name: Cosine Map@100
---

BGE base BioASQ Matryoshka

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
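
This architecture corresponds to CLS-token pooling over a BERT encoder, followed by L2 normalization. As a reference, here is a minimal sketch of equivalent manual encoding with the transformers library; it assumes the tokenizer and encoder weights load directly from this repository:

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("pavanmantha/bge-base-en-bioembed")
encoder = AutoModel.from_pretrained("pavanmantha/bge-base-en-bioembed")

batch = tokenizer(
    ["Is siltuximab effective for Castleman disease?"],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    output = encoder(**batch)
# CLS-token pooling (position 0), then L2 normalization, mirroring the
# Pooling and Normalize modules shown above
embedding = torch.nn.functional.normalize(output.last_hidden_state[:, 0], p=2, dim=1)
print(embedding.shape)
# torch.Size([1, 768])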

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("pavanmantha/bge-base-en-bioembed")
# Run inference
sentences = [
    'Yes, numerous whole exome sequencing studies of ALzheimer patients have been conducted.',
    'Has whole exome sequencing been performed in Alzheimer patients?',
    'How is connected "isolated Non-compaction cardiomyopathy" with dilated cardiomyopathy?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
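
Because the model was trained with MatryoshkaLoss over dimensions 768, 512, 256, and 128, its embeddings can also be truncated to a smaller dimension with only a modest drop in retrieval quality (see the per-dimension metrics below). A minimal sketch using the truncate_dim argument available in recent Sentence Transformers releases:

from sentence_transformers import SentenceTransformer

# Truncate output embeddings to the first 256 dimensions
model = SentenceTransformer("pavanmantha/bge-base-en-bioembed", truncate_dim=256)

embeddings = model.encode([
    "Has whole exome sequencing been performed in Alzheimer patients?",
])
print(embeddings.shape)
# [1, 256]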

Evaluation

Metrics

Information Retrieval (dim_768)

Metric Value
cosine_accuracy@1 0.8517
cosine_accuracy@3 0.9407
cosine_accuracy@5 0.9576
cosine_accuracy@10 0.9619
cosine_precision@1 0.8517
cosine_precision@3 0.3136
cosine_precision@5 0.1915
cosine_precision@10 0.0962
cosine_recall@1 0.8517
cosine_recall@3 0.9407
cosine_recall@5 0.9576
cosine_recall@10 0.9619
cosine_ndcg@10 0.915
cosine_mrr@10 0.899
cosine_map@100 0.8999

Information Retrieval (dim_512)

Metric Value
cosine_accuracy@1 0.8517
cosine_accuracy@3 0.9449
cosine_accuracy@5 0.9555
cosine_accuracy@10 0.9597
cosine_precision@1 0.8517
cosine_precision@3 0.315
cosine_precision@5 0.1911
cosine_precision@10 0.096
cosine_recall@1 0.8517
cosine_recall@3 0.9449
cosine_recall@5 0.9555
cosine_recall@10 0.9597
cosine_ndcg@10 0.9136
cosine_mrr@10 0.8979
cosine_map@100 0.8991

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.839
cosine_accuracy@3 0.9343
cosine_accuracy@5 0.947
cosine_accuracy@10 0.9597
cosine_precision@1 0.839
cosine_precision@3 0.3114
cosine_precision@5 0.1894
cosine_precision@10 0.096
cosine_recall@1 0.839
cosine_recall@3 0.9343
cosine_recall@5 0.947
cosine_recall@10 0.9597
cosine_ndcg@10 0.9053
cosine_mrr@10 0.8873
cosine_map@100 0.888

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.8242
cosine_accuracy@3 0.911
cosine_accuracy@5 0.9322
cosine_accuracy@10 0.947
cosine_precision@1 0.8242
cosine_precision@3 0.3037
cosine_precision@5 0.1864
cosine_precision@10 0.0947
cosine_recall@1 0.8242
cosine_recall@3 0.911
cosine_recall@5 0.9322
cosine_recall@10 0.947
cosine_ndcg@10 0.8905
cosine_mrr@10 0.8719
cosine_map@100 0.8732
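
The tables above report retrieval quality at embedding dimensions 768, 512, 256, and 128, respectively. A hedged sketch of how such numbers can be reproduced with Sentence Transformers' InformationRetrievalEvaluator; the queries, corpus, and relevant_docs mappings are placeholders you would build from the held-out evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Evaluate at a given Matryoshka dimension by truncating the embeddings
model = SentenceTransformer("pavanmantha/bge-base-en-bioembed", truncate_dim=768)

# Placeholder data: query id -> question, doc id -> passage, query id -> relevant doc ids
queries = {"q1": "Is HDAC1 required for GATA-1 transcriptional activity?"}
corpus = {"d1": "Yes. HDAC1 is required for GATA-1 transcription activity, global chromatin occupancy and hematopoiesis."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="dim_768")
results = evaluator(model)
print(results)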

Training Details

Training Dataset

Unnamed Dataset

  • Size: 4,247 training samples
  • Columns: positive and anchor
  • Approximate statistics based on the first 1000 samples:
    • positive: string, min: 4 tokens, mean: 103.25 tokens, max: 512 tokens
    • anchor: string, min: 6 tokens, mean: 15.94 tokens, max: 49 tokens
  • Samples:
    • positive: Yes, saracatinib is being studied as a treatment against Alzheimer's Disease. A clinical Phase Ib study has been completed, and a clinical Phase IIa study is ongoing.
      anchor: Was saracatinib being considered as a treatment for Alzheimer's disease in November 2017?
    • positive: TREM2 variants have been found to be associated with early as well as with late onset Alzheimer's disease.
      anchor: Is TREM2 associated with Alzheimer's disease in humans?
    • positive: Yes, siltuximab, a chimeric human-mouse monoclonal antibody to IL6, is approved for the treatment of patients with multicentric Castleman disease who are human immunodeficiency virus negative and human herpesvirus-8 negative.
      anchor: Is siltuximab effective for Castleman disease?
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
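
This configuration wraps MultipleNegativesRankingLoss (in-batch negatives) inside MatryoshkaLoss, so the ranking objective is applied at each of the four embedding dimensions. A minimal sketch of constructing an equivalent loss; the variable names are illustrative:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")
# In-batch negatives ranking loss, applied at every Matryoshka dimension
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[768, 512, 256, 128])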
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 2e-05
  • num_train_epochs: 4
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • fp16: True
  • tf32: False
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
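
A hedged sketch of how these non-default values map onto SentenceTransformerTrainingArguments in Sentence Transformers 3.x; output_dir and save_strategy are assumptions, not values from this card:

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-bioasq-matryoshka",  # illustrative; not stated in the card
    num_train_epochs=4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    tf32=False,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed, so load_best_model_at_end can pair eval/save
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)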

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: False
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
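
Putting the pieces together, a minimal end-to-end training sketch that reuses the model, loss, args, and evaluator objects from the sketches above; train_dataset is a placeholder for the 4,247-sample positive/anchor dataset:

from sentence_transformers import SentenceTransformerTrainer

trainer = SentenceTransformerTrainer(
    model=model,                  # SentenceTransformer initialized from BAAI/bge-base-en-v1.5
    args=args,                    # SentenceTransformerTrainingArguments from the sketch above
    train_dataset=train_dataset,  # placeholder: a Dataset with "positive" and "anchor" columns
    loss=loss,                    # MatryoshkaLoss wrapping MultipleNegativesRankingLoss
    evaluator=evaluator,          # InformationRetrievalEvaluator from the Evaluation section
)
trainer.train()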

Training Logs

Epoch Step Training Loss dim_128_cosine_map@100 dim_256_cosine_map@100 dim_512_cosine_map@100 dim_768_cosine_map@100
0.9624 8 - 0.8794 0.8937 0.9044 0.9018
1.2030 10 1.1405 - - - -
1.9248 16 - 0.8739 0.8866 0.8998 0.8984
2.4060 20 0.4328 - - - -
2.8872 24 - 0.8732 0.8876 0.8987 0.8998
3.6090 30 0.312 - - - -
3.8496 32 - 0.8732 0.8880 0.8991 0.8999
  • The final row (epoch 3.8496, step 32) denotes the saved checkpoint; its cosine_map@100 values match the metrics reported above.

Framework Versions

  • Python: 3.10.13
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.2
  • Accelerate: 0.31.0
  • Datasets: 2.19.1
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning}, 
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}