---
datasets: []
language: []
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:14593
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
widget:
  - source_sentence: >-
      Macro ingredients needed to cook Poha: Orange Carrot, French Bean, Fresh
      Green Pea, Medium Poha, Red Onion, Curry Leaf, Green Chili Pepper
    sentences:
      - Can you list recipes that contain canned chickpea and canned black bean?
      - What are the leading macro ingredients in Pigeon Pea Curry (Toor Dal)?
      - What macro ingredients form the base of Poha?
  - source_sentence: >-
      I do have some good recommendations for you! Here are few good
      alternatives to kashmiri pulao:

      Kashmiri Dum Aloo, Shivani's Kashmiri Dum Aloo, Chicken Pulao, Chicken
      Rezala, Chicken Kheema Masala, Hyderabadi Chicken Masala, Masala Khichdi,
      Lentils and Rice (Dal Chawal), Homestyle Vegetable Pulao
    sentences:
      - What recipes are comparable to kashmiri pulao in flavor profile?
      - >-
        Can you give me step-by-step instructions to cook Hariyali Chicken
        Curry?
      - >-
        What are some recipes that utilize baking soda and olive oil
        effectively?
  - source_sentence: 'Garnishing tip for Yellow Rice: Sprinkle with chopped cilantro.'
    sentences:
      - How can I make Yellow Rice look appealing with garnishes?
      - Describe General Tso's Tofu for me.
      - What are the best garnishing tips for Paneer Tikka Masala?
  - source_sentence: >-
      Recipes that can be made using green chili pepper and grated coconut:
      Kerala Mix Vegetables (Aviyal), Carrot Poriyal, Cauliflower Poriyal,
      Beetroot Poriyal, Maithilee's Fish Curry, Mix Vegetable Poriyal, Ivy Gourd
      Curry (Tindora Masala), Spiced Indian Moth Beans (Matki Usal), Fish Curry,
      Andhra Garlic Chicken
    sentences:
      - What are the culinary uses of ground pork and chayote?
      - What are the dishes prepared using green cardamom and clove?
      - >-
        Can you suggest recipes that include green chili pepper and grated
        coconut?
  - source_sentence: >-
      Recipes that can be made using red onion and paprika: Breakfast Potatoes
      with Sausage, Peri Peri Chicken Pasta, Scrambled Egg Curry, Chili Mac &
      Cheese, Tomato Chicken Curry
    sentences:
      - >-
        Are there dishes that closely resemble spiced potatoes & fenugreek (aloo
        methi)?
      - >-
        What recipes incorporate black pepper and habanero chili in their
        ingredients?
      - What are some ways to use red onion and paprika in recipes?
model-index:
  - name: SentenceTransformer
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 384
          type: dim_384
        metrics:
          - type: cosine_accuracy@1
            value: 0.9704069050554871
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9926017262638718
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.998766954377312
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9993834771886559
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.9704069050554871
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.33086724208795726
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1997533908754624
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09993834771886559
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.9704069050554871
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9926017262638718
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.998766954377312
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9993834771886559
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9865445143406266
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.9822089131583582
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.9822089131583582
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.9728729963008631
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9932182490752158
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.998766954377312
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9993834771886559
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.9728729963008631
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.3310727496917386
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1997533908754624
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09993834771886559
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.9728729963008631
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9932182490752158
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.998766954377312
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9993834771886559
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9875922381599775
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.9836107685984382
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.9836107685984381
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.9722564734895192
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9944512946979038
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9993834771886559
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9993834771886559
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.9722564734895192
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.33148376489930126
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.19987669543773118
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09993834771886559
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.9722564734895192
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9944512946979038
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9993834771886559
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9993834771886559
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9873346466071089
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.9832511302918208
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.9832511302918209
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.9704069050554871
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9944512946979038
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9993834771886559
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9993834771886559
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.9704069050554871
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.33148376489930126
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.19987669543773118
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09993834771886559
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.9704069050554871
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9944512946979038
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9993834771886559
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9993834771886559
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9867057287670639
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.9823982737361283
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.9823982737361281
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 32
          type: dim_32
        metrics:
          - type: cosine_accuracy@1
            value: 0.971023427866831
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9950678175092479
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9993834771886559
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9993834771886559
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.971023427866831
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.3316892725030826
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.19987669543773118
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09993834771886559
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.971023427866831
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9950678175092479
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9993834771886559
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9993834771886559
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9872988931953259
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.9831689272503082
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.9831689272503081
            name: Cosine Map@100
---

SentenceTransformer

This is a sentence-transformers model fine-tuned on 14,593 recipe-related (positive, anchor) text pairs (see Training Details below). It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
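
These properties can be checked programmatically once the model is loaded (see Usage below); a minimal sketch, assuming only the model ID shown on this card:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Adi-0-0-Gupta/Embedding-v1")
print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 384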

Model Sources

  • Model repository: https://huggingface.co/Adi-0-0-Gupta/Embedding-v1
  • Sentence Transformers documentation: https://www.sbert.net

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Adi-0-0-Gupta/Embedding-v1")
# Run inference
sentences = [
    'Recipes that can be made using red onion and paprika: Breakfast Potatoes with Sausage, Peri Peri Chicken Pasta, Scrambled Egg Curry, Chili Mac & Cheese, Tomato Chicken Curry',
    'What are some ways to use red onion and paprika in recipes?',
    'Are there dishes that closely resemble spiced potatoes & fenugreek (aloo methi)?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
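
Because the model was trained with MatryoshkaLoss over the dimensions 384, 256, 128, 64 and 32, embeddings can also be truncated to a smaller size with only a minor drop in retrieval quality (see the evaluation tables below). A minimal sketch using the truncate_dim argument of Sentence Transformers; 64 is just an example dimension:

from sentence_transformers import SentenceTransformer

# Load the model so that embeddings are truncated to one of the Matryoshka dimensions
model = SentenceTransformer("Adi-0-0-Gupta/Embedding-v1", truncate_dim=64)

embeddings = model.encode([
    "What macro ingredients form the base of Poha?",
    "How can I make Yellow Rice look appealing with garnishes?",
])
print(embeddings.shape)
# (2, 64)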

Evaluation

Metrics

The retrieval metrics below were computed at each of the five Matryoshka embedding dimensions (384, 256, 128, 64 and 32); the tables that follow report one dimension each.

Information Retrieval (dim_384)

Metric Value
cosine_accuracy@1 0.9704
cosine_accuracy@3 0.9926
cosine_accuracy@5 0.9988
cosine_accuracy@10 0.9994
cosine_precision@1 0.9704
cosine_precision@3 0.3309
cosine_precision@5 0.1998
cosine_precision@10 0.0999
cosine_recall@1 0.9704
cosine_recall@3 0.9926
cosine_recall@5 0.9988
cosine_recall@10 0.9994
cosine_ndcg@10 0.9865
cosine_mrr@10 0.9822
cosine_map@100 0.9822

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.9729
cosine_accuracy@3 0.9932
cosine_accuracy@5 0.9988
cosine_accuracy@10 0.9994
cosine_precision@1 0.9729
cosine_precision@3 0.3311
cosine_precision@5 0.1998
cosine_precision@10 0.0999
cosine_recall@1 0.9729
cosine_recall@3 0.9932
cosine_recall@5 0.9988
cosine_recall@10 0.9994
cosine_ndcg@10 0.9876
cosine_mrr@10 0.9836
cosine_map@100 0.9836

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.9723
cosine_accuracy@3 0.9945
cosine_accuracy@5 0.9994
cosine_accuracy@10 0.9994
cosine_precision@1 0.9723
cosine_precision@3 0.3315
cosine_precision@5 0.1999
cosine_precision@10 0.0999
cosine_recall@1 0.9723
cosine_recall@3 0.9945
cosine_recall@5 0.9994
cosine_recall@10 0.9994
cosine_ndcg@10 0.9873
cosine_mrr@10 0.9833
cosine_map@100 0.9833

Information Retrieval (dim_64)

Metric Value
cosine_accuracy@1 0.9704
cosine_accuracy@3 0.9945
cosine_accuracy@5 0.9994
cosine_accuracy@10 0.9994
cosine_precision@1 0.9704
cosine_precision@3 0.3315
cosine_precision@5 0.1999
cosine_precision@10 0.0999
cosine_recall@1 0.9704
cosine_recall@3 0.9945
cosine_recall@5 0.9994
cosine_recall@10 0.9994
cosine_ndcg@10 0.9867
cosine_mrr@10 0.9824
cosine_map@100 0.9824

Information Retrieval (dim_32)

Metric Value
cosine_accuracy@1 0.971
cosine_accuracy@3 0.9951
cosine_accuracy@5 0.9994
cosine_accuracy@10 0.9994
cosine_precision@1 0.971
cosine_precision@3 0.3317
cosine_precision@5 0.1999
cosine_precision@10 0.0999
cosine_recall@1 0.971
cosine_recall@3 0.9951
cosine_recall@5 0.9994
cosine_recall@10 0.9994
cosine_ndcg@10 0.9873
cosine_mrr@10 0.9832
cosine_map@100 0.9832
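
Results of this kind can be reproduced with the InformationRetrievalEvaluator from Sentence Transformers, run once per truncation dimension. A minimal sketch; the queries, corpus and relevant_docs dictionaries are small placeholders you would replace with a real evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Adi-0-0-Gupta/Embedding-v1")

# Placeholder evaluation data: query id -> text, document id -> text,
# and query id -> set of relevant document ids
queries = {"q1": "What macro ingredients form the base of Poha?"}
corpus = {"d1": "Macro ingredients needed to cook Poha: Orange Carrot, French Bean, Fresh Green Pea, Medium Poha, Red Onion, Curry Leaf, Green Chili Pepper"}
relevant_docs = {"q1": {"d1"}}

for dim in [384, 256, 128, 64, 32]:
    evaluator = InformationRetrievalEvaluator(
        queries=queries,
        corpus=corpus,
        relevant_docs=relevant_docs,
        name=f"dim_{dim}",
        truncate_dim=dim,
    )
    print(evaluator(model))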

Training Details

Training Dataset

Unnamed Dataset

  • Size: 14,593 training samples
  • Columns: positive and anchor
  • Approximate statistics based on the first 1000 samples:
    • positive: string; min 11 tokens, mean 53.46 tokens, max 512 tokens
    • anchor: string; min 7 tokens, mean 15.83 tokens, max 32 tokens
  • Samples:
    • positive: Calories information of Hyderabadi Chicken Masala, based on different serving sizes: Serving 1 - 345 calories, Serving 2 - 580 calories, Serving 3 - 1220 calories, Serving 4 - 1450 calories
      anchor: What’s the calorie content of Hyderabadi Chicken Masala?
    • positive: Recipes that can be made using dried herb mix and onion powder: Chorizo Queso Soup, Cheesy Chicken & Broccoli
      anchor: What are some food items made using dried herb mix and onion powder?
    • positive: Recipes that can be made using roasted semolina/bombay rava and saffron: Rashmi's Kesari Bath, Pineapple Kesari Bath
      anchor: What recipes have roasted semolina/bombay rava and saffron in them?
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            384,
            256,
            128,
            64,
            32
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
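
In code, this configuration corresponds to wrapping MultipleNegativesRankingLoss in MatryoshkaLoss with equal weight per dimension; a minimal sketch, assuming a loaded SentenceTransformer:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("Adi-0-0-Gupta/Embedding-v1")

# Inner ranking loss over (anchor, positive) pairs with in-batch negatives
inner_loss = MultipleNegativesRankingLoss(model)

# Apply the same loss at every Matryoshka dimension; weights default to 1 for each
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[384, 256, 128, 64, 32])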
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • gradient_accumulation_steps: 16
  • learning_rate: 1e-05
  • num_train_epochs: 20
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
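
In Sentence Transformers 3.x these settings map onto SentenceTransformerTrainingArguments; a minimal sketch (output_dir is a placeholder, and save_strategy="epoch" is an assumption added so that load_best_model_at_end can match the epoch-level evaluation):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/embedding-v1",     # placeholder path
    eval_strategy="epoch",
    save_strategy="epoch",                # assumed; required by load_best_model_at_end
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=16,
    learning_rate=1e-5,
    num_train_epochs=20,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)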

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • learning_rate: 1e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 20
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
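
Putting the pieces together, a hedged end-to-end sketch of the training loop with SentenceTransformerTrainer; the one-row dataset is a placeholder for the unpublished 14,593-pair training set, and the arguments are abbreviated to the essentials listed above:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("Adi-0-0-Gupta/Embedding-v1")
loss = MatryoshkaLoss(model, MultipleNegativesRankingLoss(model),
                      matryoshka_dims=[384, 256, 128, 64, 32])

# Placeholder training data with the column layout described above (positive, anchor)
train_dataset = Dataset.from_dict({
    "positive": ["Recipes that can be made using dried herb mix and onion powder: Chorizo Queso Soup, Cheesy Chicken & Broccoli"],
    "anchor": ["What are some food items made using dried herb mix and onion powder?"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="output/embedding-v1",   # placeholder path
    num_train_epochs=20,
    per_device_train_batch_size=32,
    gradient_accumulation_steps=16,
    learning_rate=1e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
trainer.train()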

Training Logs

Epoch Step Training Loss dim_128_cosine_map@100 dim_256_cosine_map@100 dim_32_cosine_map@100 dim_384_cosine_map@100 dim_64_cosine_map@100
0.3501 10 0.0066 - - - - -
0.7002 20 0.0056 - - - - -
0.9803 28 - 0.9746 0.9771 0.9776 0.9758 0.9763
1.0503 30 0.0057 - - - - -
1.4004 40 0.0048 - - - - -
1.7505 50 0.0039 - - - - -
1.9956 57 - 0.9783 0.9787 0.9815 0.9788 0.9793
2.1007 60 0.0046 - - - - -
2.4508 70 0.0035 - - - - -
2.8009 80 0.0028 - - - - -
2.9759 85 - 0.9818 0.9811 0.9836 0.9803 0.9823
3.1510 90 0.0036 - - - - -
3.5011 100 0.0033 - - - - -
3.8512 110 0.0026 - - - - -
3.9912 114 - 0.9814 0.9818 0.9844 0.9814 0.9821
4.2013 120 0.0025 - - - - -
4.5514 130 0.003 - - - - -
4.9015 140 0.0027 - - - - -
4.9716 142 - 0.9825 0.9819 0.9844 0.9823 0.9825
5.2516 150 0.0024 - - - - -
5.6018 160 0.0023 - - - - -
5.9519 170 0.0024 - - - - -
5.9869 171 - 0.9831 0.9826 0.9846 0.9818 0.9831
6.3020 180 0.0025 - - - - -
6.6521 190 0.0025 - - - - -
6.9672 199 - 0.9830 0.9825 0.9844 0.9823 0.9831
7.0022 200 0.0019 - - - - -
7.3523 210 0.0022 - - - - -
7.7024 220 0.0026 - - - - -
7.9825 228 - 0.9828 0.9825 0.9836 0.9821 0.9821
8.0525 230 0.0022 - - - - -
8.4026 240 0.0021 - - - - -
8.7527 250 0.0021 - - - - -
8.9978 257 - 0.9827 0.9826 0.9848 0.9827 0.9827
9.1028 260 0.0025 - - - - -
9.4530 270 0.0022 - - - - -
9.8031 280 0.0019 - - - - -
9.9781 285 - 0.9832 0.9833 0.9858 0.9825 0.9834
10.1532 290 0.0021 - - - - -
10.5033 300 0.0019 - - - - -
10.8534 310 0.0024 - - - - -
10.9934 314 - 0.9830 0.9827 0.9850 0.9825 0.9829
11.2035 320 0.0017 - - - - -
11.5536 330 0.0017 - - - - -
11.9037 340 0.0018 - - - - -
11.9737 342 - 0.9827 0.9835 0.9841 0.9826 0.9827
12.2538 350 0.0018 - - - - -
12.6039 360 0.0018 - - - - -
12.9540 370 0.0023 - - - - -
12.9891 371 - 0.9828 0.9834 0.9832 0.9826 0.9823
13.3042 380 0.0017 - - - - -
13.6543 390 0.0018 - - - - -
13.9694 399 - 0.9830 0.9831 0.9838 0.9820 0.9826
14.0044 400 0.0016 - - - - -
14.3545 410 0.0018 - - - - -
14.7046 420 0.0018 - - - - -
14.9847 428 - 0.9827 0.9825 0.9832 0.9816 0.9826
15.0547 430 0.0018 - - - - -
15.4048 440 0.0015 - - - - -
15.7549 450 0.0017 - - - - -
16.0 457 - 0.9833 0.9836 0.9832 0.9822 0.9824

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.2+cu121
  • Accelerate: 0.31.0
  • Datasets: 2.19.1
  • Tokenizers: 0.19.1
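
To approximately reproduce this environment, the versions above can be pinned at install time; a minimal sketch (the exact PyTorch/CUDA build will depend on your machine):

pip install sentence-transformers==3.0.1 transformers==4.41.2 accelerate==0.31.0 datasets==2.19.1 tokenizers==0.19.1 torch==2.1.2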

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning}, 
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}