---
base_model: sentence-transformers/all-MiniLM-L12-v2
language:
  - en
library_name: sentence-transformers
license: apache-2.0
metrics:
  - pearson_cosine
  - spearman_cosine
  - pearson_manhattan
  - spearman_manhattan
  - pearson_euclidean
  - spearman_euclidean
  - pearson_dot
  - spearman_dot
  - pearson_max
  - spearman_max
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:510287
  - loss:CoSENTLoss
widget:
  - source_sentence: bag
    sentences:
      - bag
      - summer colors bag
      - carry all bag
  - source_sentence: bean bag
    sentences:
      - bag
      - havan bag
      - black yellow shoes
  - source_sentence: pyramid shaped cushion mattress
    sentences:
      - dress
      - silver bag
      - women shoes
  - source_sentence: handcrafted rug
    sentences:
      - amaga  cross bag - white
      - handcrafted boots
      - polyester top
  - source_sentence: bean bag
    sentences:
      - bag
      - v-neck dress
      - bag
model-index:
  - name: all-MiniLM-L12-v2-pair_score
    results:
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: sts dev
          type: sts-dev
        metrics:
          - type: pearson_cosine
            value: -0.10403022864037037
            name: Pearson Cosine
          - type: spearman_cosine
            value: -0.1437799564130218
            name: Spearman Cosine
          - type: pearson_manhattan
            value: -0.10847915569723102
            name: Pearson Manhattan
          - type: spearman_manhattan
            value: -0.14274368509273366
            name: Spearman Manhattan
          - type: pearson_euclidean
            value: -0.11064121359722408
            name: Pearson Euclidean
          - type: spearman_euclidean
            value: -0.14377964610318103
            name: Spearman Euclidean
          - type: pearson_dot
            value: -0.10403015819885228
            name: Pearson Dot
          - type: spearman_dot
            value: -0.14377961300118045
            name: Spearman Dot
          - type: pearson_max
            value: -0.10403015819885228
            name: Pearson Max
          - type: spearman_max
            value: -0.14274368509273366
            name: Spearman Max
---

all-MiniLM-L12-v2-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L12-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L12-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0
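
Both limits can be read off the loaded model; a minimal sketch, using the same placeholder Hub id as the Usage section below:

from sentence_transformers import SentenceTransformer

# Placeholder id; substitute this model's actual Hub id.
model = SentenceTransformer("sentence_transformers_model_id")
print(model.max_seq_length)                      # 128
print(model.get_sentence_embedding_dimension())  # 384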

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
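
For illustration, an equivalent module stack can be assembled by hand with sentence_transformers.models; this is a sketch of the same encoder, mean-pooling, and normalization pipeline, not the recommended way to load the trained weights:

from sentence_transformers import SentenceTransformer, models

# Mirror the stack above: BERT encoder -> mean pooling -> L2 normalization.
word_embedding = models.Transformer(
    "sentence-transformers/all-MiniLM-L12-v2", max_seq_length=128
)
pooling = models.Pooling(
    word_embedding.get_word_embedding_dimension(),  # 384
    pooling_mode="mean",
)
model = SentenceTransformer(modules=[word_embedding, pooling, models.Normalize()])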

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")  # replace with this model's Hub id
# Run inference
sentences = [
    'bean bag',
    'bag',
    'v-neck dress',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
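
Because the final Normalize module L2-normalizes every embedding, cosine similarity here is equivalent to a dot product. Below is a hypothetical retrieval sketch over a small catalogue (the item strings are illustrative and `model` is the instance loaded above):

# Hypothetical example: rank catalogue items against a free-text query.
query_emb = model.encode(["red leather bag"])
catalogue = ["bag", "v-neck dress", "handcrafted rug", "women shoes"]
catalogue_emb = model.encode(catalogue)

scores = model.similarity(query_emb, catalogue_emb)  # cosine scores, shape [1, 4]
best = int(scores.argmax())
print(catalogue[best], float(scores[0, best]))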

Evaluation

Metrics

Semantic Similarity

Metric              Value
pearson_cosine      -0.1040
spearman_cosine     -0.1438
pearson_manhattan   -0.1085
spearman_manhattan  -0.1427
pearson_euclidean   -0.1106
spearman_euclidean  -0.1438
pearson_dot         -0.1040
spearman_dot        -0.1438
pearson_max         -0.1040
spearman_max        -0.1427
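
These metric names match what Sentence Transformers' EmbeddingSimilarityEvaluator reports. A sketch of running such an evaluation, with placeholder sentence pairs and gold scores (the actual sts-dev split is not bundled with this card):

from sentence_transformers import SimilarityFunction
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

# Placeholder pairs and gold scores; substitute the real sts-dev data.
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["bean bag", "handcrafted rug"],
    sentences2=["bag", "handcrafted boots"],
    scores=[0.8, 0.4],
    main_similarity=SimilarityFunction.COSINE,
    name="sts-dev",
)
results = evaluator(model)  # dict of Pearson/Spearman correlations per similarity
print(results)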

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • learning_rate: 2e-05
  • num_train_epochs: 4
  • warmup_ratio: 0.1
  • fp16: True
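
The sketch below shows how these values map onto a Sentence Transformers v3 training run with the CoSENTLoss named in the model tags; the output directory and the tiny datasets are placeholders standing in for the real 510,287-pair data:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")
loss = CoSENTLoss(model)  # ranks cosine similarities against gold pair scores

# Placeholder datasets: two text columns plus a float "score" column.
train_dataset = Dataset.from_dict(
    {"sentence1": ["bean bag"], "sentence2": ["bag"], "score": [0.8]}
)
eval_dataset = Dataset.from_dict(
    {"sentence1": ["handcrafted rug"], "sentence2": ["handcrafted boots"], "score": [0.4]}
)

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    learning_rate=2e-5,
    num_train_epochs=4,
    warmup_ratio=0.1,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()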

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch   Step   Training Loss   Validation Loss   sts-dev_spearman_cosine
0 0 - - -0.1438
0.0063 100 11.9171 - -
0.0125 200 11.0074 - -
0.0188 300 10.1073 - -
0.0251 400 8.6232 - -
0.0314 500 7.5947 7.2720 -
0.0376 600 6.3883 - -
0.0439 700 5.6165 - -
0.0502 800 4.8254 - -
0.0564 900 4.5595 - -
0.0627 1000 4.2965 4.1720 -
0.0690 1100 4.063 - -
0.0752 1200 4.0861 - -
0.0815 1300 3.9703 - -
0.0878 1400 3.8222 - -
0.0941 1500 3.927 3.6404 -
0.1003 1600 3.6892 - -
0.1066 1700 3.9166 - -
0.1129 1800 3.7162 - -
0.1191 1900 3.4866 - -
0.1254 2000 3.5202 3.4226 -
0.1317 2100 3.6876 - -
0.1380 2200 3.4884 - -
0.1442 2300 3.4407 - -
0.1505 2400 3.2658 - -
0.1568 2500 3.2973 3.0777 -
0.1630 2600 3.2087 - -
0.1693 2700 3.4316 - -
0.1756 2800 3.3372 - -
0.1819 2900 3.161 - -
0.1881 3000 3.0232 2.8805 -
0.1944 3100 3.2897 - -
0.2007 3200 3.2576 - -
0.2069 3300 2.7636 - -
0.2132 3400 3.1788 - -
0.2195 3500 2.6269 2.6237 -
0.2257 3600 2.9352 - -
0.2320 3700 2.847 - -
0.2383 3800 2.8001 - -
0.2446 3900 2.6048 - -
0.2508 4000 2.5976 2.5250 -
0.2571 4100 2.5211 - -
0.2634 4200 2.7812 - -
0.2696 4300 2.6822 - -
0.2759 4400 2.4779 - -
0.2822 4500 2.6242 2.6365 -
0.2885 4600 2.5655 - -
0.2947 4700 2.9998 - -
0.3010 4800 2.679 - -
0.3073 4900 2.5719 - -
0.3135 5000 2.6913 2.6934 -
0.3198 5100 2.8346 - -
0.3261 5200 2.7453 - -
0.3324 5300 2.4492 - -
0.3386 5400 2.9389 - -
0.3449 5500 2.6002 2.6182 -
0.3512 5600 2.2592 - -
0.3574 5700 2.3822 - -
0.3637 5800 2.4771 - -
0.3700 5900 3.5914 - -
0.3762 6000 2.3525 2.5605 -
0.3825 6100 2.2667 - -
0.3888 6200 2.4671 - -
0.3951 6300 2.6816 - -
0.4013 6400 2.2303 - -
0.4076 6500 2.3153 2.4245 -
0.4139 6600 2.7969 - -
0.4201 6700 2.61 - -
0.4264 6800 2.5267 - -
0.4327 6900 2.532 - -
0.4390 7000 2.6088 2.4666 -
0.4452 7100 1.848 - -
0.4515 7200 2.1369 - -
0.4578 7300 2.185 - -
0.4640 7400 2.0279 - -
0.4703 7500 2.5593 2.3958 -
0.4766 7600 2.339 - -
0.4828 7700 2.2122 - -
0.4891 7800 2.7878 - -
0.4954 7900 2.3005 - -
0.5017 8000 2.2922 2.5408 -
0.5079 8100 2.3731 - -
0.5142 8200 2.1879 - -
0.5205 8300 2.1598 - -
0.5267 8400 2.2292 - -
0.5330 8500 1.958 2.0935 -
0.5393 8600 2.1152 - -
0.5456 8700 1.9725 - -
0.5518 8800 2.1106 - -
0.5581 8900 2.06 - -
0.5644 9000 1.7624 2.1509 -
0.5706 9100 2.3793 - -
0.5769 9200 1.9322 - -
0.5832 9300 1.8355 - -
0.5895 9400 2.1425 - -
0.5957 9500 2.2191 1.9984 -
0.6020 9600 2.3245 - -
0.6083 9700 2.1206 - -
0.6145 9800 2.0957 - -
0.6208 9900 2.5276 - -
0.6271 10000 1.5383 1.9509 -
0.6333 10100 2.111 - -
0.6396 10200 1.893 - -
0.6459 10300 1.8961 - -
0.6522 10400 1.6599 - -
0.6584 10500 2.3409 1.8286 -
0.6647 10600 1.9741 - -
0.6710 10700 2.0438 - -
0.6772 10800 1.814 - -
0.6835 10900 2.1819 - -
0.6898 11000 1.8547 1.9461 -
0.6961 11100 2.5979 - -
0.7023 11200 1.9309 - -
0.7086 11300 1.6247 - -
0.7149 11400 2.1107 - -
0.7211 11500 2.1264 1.8004 -
0.7274 11600 1.7397 - -
0.7337 11700 1.9569 - -
0.7400 11800 1.4769 - -
0.7462 11900 1.6222 - -
0.7525 12000 1.5354 1.6811 -
0.7588 12100 2.2645 - -
0.7650 12200 1.8662 - -
0.7713 12300 1.5327 - -
0.7776 12400 1.9501 - -
0.7838 12500 2.0923 1.6134 -
0.7901 12600 1.8887 - -
0.7964 12700 1.7207 - -
0.8027 12800 1.8589 - -
0.8089 12900 1.7602 - -
0.8152 13000 2.2405 1.5030 -
0.8215 13100 1.6249 - -
0.8277 13200 1.6814 - -
0.8340 13300 1.4072 - -
0.8403 13400 1.6286 - -
0.8466 13500 2.2081 1.6078 -
0.8528 13600 1.7387 - -
0.8591 13700 1.5268 - -
0.8654 13800 1.5693 - -
0.8716 13900 1.2473 - -
0.8779 14000 1.361 1.7168 -
0.8842 14100 1.5246 - -
0.8904 14200 1.7266 - -
0.8967 14300 0.9221 - -
0.9030 14400 1.6397 - -
0.9093 14500 1.3139 1.5492 -
0.9155 14600 1.7942 - -
0.9218 14700 1.5206 - -
0.9281 14800 1.5868 - -
0.9343 14900 1.2131 - -
0.9406 15000 1.8765 1.4192 -
0.9469 15100 1.624 - -
0.9532 15200 1.4692 - -
0.9594 15300 1.5426 - -
0.9657 15400 1.3668 - -
0.9720 15500 1.3951 1.6835 -
0.9782 15600 1.1567 - -
0.9845 15700 1.8634 - -
0.9908 15800 1.641 - -
0.9971 15900 1.6458 - -
1.0033 16000 1.1369 1.5746 -
1.0096 16100 1.1913 - -
1.0159 16200 1.5563 - -
1.0221 16300 1.4081 - -
1.0284 16400 1.8157 - -
1.0347 16500 1.6405 1.5235 -
1.0409 16600 0.9207 - -
1.0472 16700 1.4301 - -
1.0535 16800 1.4566 - -
1.0598 16900 1.5397 - -
1.0660 17000 1.3417 1.3883 -
1.0723 17100 0.9769 - -
1.0786 17200 1.3734 - -
1.0848 17300 1.0874 - -
1.0911 17400 1.2601 - -
1.0974 17500 1.4799 1.4361 -
1.1037 17600 1.1086 - -
1.1099 17700 1.3731 - -
1.1162 17800 1.0515 - -
1.1225 17900 1.7916 - -
1.1287 18000 1.7606 1.3792 -
1.1350 18100 1.3844 - -
1.1413 18200 1.3567 - -
1.1476 18300 1.4322 - -
1.1538 18400 1.9509 - -
1.1601 18500 1.0303 1.3425 -
1.1664 18600 1.6484 - -
1.1726 18700 1.1177 - -
1.1789 18800 1.0295 - -
1.1852 18900 1.4364 - -
1.1914 19000 1.1954 1.3385 -
1.1977 19100 1.1944 - -
1.2040 19200 0.9109 - -
1.2103 19300 1.4191 - -
1.2165 19400 1.5755 - -
1.2228 19500 1.0958 1.2872 -
1.2291 19600 0.9054 - -
1.2353 19700 1.0892 - -
1.2416 19800 1.4455 - -
1.2479 19900 1.3273 - -
1.2542 20000 1.6442 1.2880 -
1.2604 20100 1.1901 - -
1.2667 20200 0.9871 - -
1.2730 20300 1.6448 - -
1.2792 20400 1.1899 - -
1.2855 20500 1.3454 1.3303 -
1.2918 20600 1.4376 - -
1.2980 20700 1.0356 - -
1.3043 20800 1.7588 - -
1.3106 20900 1.0993 - -
1.3169 21000 1.3673 1.2607 -
1.3231 21100 1.3326 - -
1.3294 21200 1.3618 - -
1.3357 21300 1.3123 - -
1.3419 21400 0.9771 - -
1.3482 21500 1.1626 1.2873 -
1.3545 21600 1.41 - -
1.3608 21700 1.6998 - -
1.3670 21800 0.8335 - -
1.3733 21900 1.579 - -
1.3796 22000 1.6073 1.2164 -
1.3858 22100 1.0534 - -
1.3921 22200 1.0045 - -
1.3984 22300 1.4195 - -
1.4047 22400 1.4409 - -
1.4109 22500 1.3942 1.2018 -
1.4172 22600 1.6013 - -
1.4235 22700 1.139 - -
1.4297 22800 0.7062 - -
1.4360 22900 1.1948 - -
1.4423 23000 1.6784 1.1736 -
1.4485 23100 1.1618 - -
1.4548 23200 0.827 - -
1.4611 23300 1.0041 - -
1.4674 23400 0.7447 - -
1.4736 23500 1.1531 1.0797 -
1.4799 23600 1.0904 - -
1.4862 23700 1.0648 - -
1.4924 23800 1.1863 - -
1.4987 23900 0.893 - -
1.5050 24000 1.2528 1.0737 -
1.5113 24100 0.9333 - -
1.5175 24200 1.3404 - -
1.5238 24300 0.8959 - -
1.5301 24400 0.6898 - -
1.5363 24500 0.9896 1.1813 -
1.5426 24600 0.7928 - -
1.5489 24700 1.4153 - -
1.5552 24800 1.2393 - -
1.5614 24900 0.744 - -
1.5677 25000 0.7545 1.0823 -
1.5740 25100 1.1936 - -
1.5802 25200 0.8755 - -
1.5865 25300 1.063 - -
1.5928 25400 0.8634 - -
1.5990 25500 1.2905 1.0718 -
1.6053 25600 1.0906 - -
1.6116 25700 1.1594 - -
1.6179 25800 1.108 - -
1.6241 25900 1.2538 - -
1.6304 26000 1.3377 1.1370 -
1.6367 26100 0.8156 - -
1.6429 26200 0.9753 - -
1.6492 26300 1.0909 - -
1.6555 26400 1.0029 - -
1.6618 26500 0.6841 1.0385 -
1.6680 26600 1.1673 - -
1.6743 26700 1.3606 - -
1.6806 26800 0.4306 - -
1.6868 26900 1.0989 - -
1.6931 27000 1.3283 1.0136 -
1.6994 27100 1.0206 - -
1.7056 27200 0.6866 - -
1.7119 27300 0.9168 - -
1.7182 27400 0.9472 - -
1.7245 27500 0.7866 1.0890 -
1.7307 27600 1.481 - -
1.7370 27700 1.0311 - -
1.7433 27800 1.3346 - -
1.7495 27900 0.8331 - -
1.7558 28000 1.3056 0.9919 -
1.7621 28100 0.9692 - -
1.7684 28200 0.9337 - -
1.7746 28300 1.1588 - -
1.7809 28400 1.0859 - -
1.7872 28500 0.9939 1.0109 -
1.7934 28600 1.4019 - -
1.7997 28700 0.9404 - -
1.8060 28800 0.7085 - -
1.8123 28900 1.1423 - -
1.8185 29000 0.8389 0.9510 -
1.8248 29100 1.3947 - -
1.8311 29200 0.8909 - -
1.8373 29300 1.3824 - -
1.8436 29400 0.6364 - -
1.8499 29500 1.2197 0.9501 -
1.8561 29600 0.6353 - -
1.8624 29700 1.3453 - -
1.8687 29800 1.1069 - -
1.8750 29900 0.9873 - -
1.8812 30000 0.9291 1.0391 -
1.8875 30100 1.3971 - -
1.8938 30200 1.0569 - -
1.9000 30300 0.6731 - -
1.9063 30400 1.0216 - -
1.9126 30500 1.295 0.9819 -
1.9189 30600 1.1641 - -
1.9251 30700 0.9199 - -
1.9314 30800 0.9774 - -
1.9377 30900 0.8242 - -
1.9439 31000 1.4039 0.9666 -
1.9502 31100 0.7112 - -
1.9565 31200 0.846 - -
1.9628 31300 1.0952 - -
1.9690 31400 1.0372 - -
1.9753 31500 0.9585 0.8983 -
1.9816 31600 1.1527 - -
1.9878 31700 0.7675 - -
1.9941 31800 0.8359 - -
2.0004 31900 1.1224 - -
2.0066 32000 1.3421 0.9575 -
2.0129 32100 0.9171 - -
2.0192 32200 0.5865 - -
2.0255 32300 0.9239 - -
2.0317 32400 0.7426 - -
2.0380 32500 0.8965 0.9158 -
2.0443 32600 0.6605 - -
2.0505 32700 0.8507 - -
2.0568 32800 0.7288 - -
2.0631 32900 0.6888 - -
2.0694 33000 0.8745 0.9568 -
2.0756 33100 0.7972 - -
2.0819 33200 0.6211 - -
2.0882 33300 1.0126 - -
2.0944 33400 0.8268 - -
2.1007 33500 0.9723 0.8551 -
2.1070 33600 0.6366 - -
2.1133 33700 0.6773 - -
2.1195 33800 0.7676 - -
2.1258 33900 0.9192 - -
2.1321 34000 0.7054 0.8941 -
2.1383 34100 0.7349 - -
2.1446 34200 0.6288 - -
2.1509 34300 0.799 - -
2.1571 34400 0.7492 - -
2.1634 34500 1.0967 0.8746 -
2.1697 34600 0.7628 - -
2.1760 34700 0.7697 - -
2.1822 34800 0.7458 - -
2.1885 34900 0.7868 - -
2.1948 35000 0.9526 0.8620 -
2.2010 35100 0.6087 - -
2.2073 35200 0.8602 - -
2.2136 35300 0.8906 - -
2.2199 35400 0.6012 - -
2.2261 35500 0.9625 0.9094 -
2.2324 35600 0.8622 - -
2.2387 35700 0.9015 - -
2.2449 35800 1.0395 - -
2.2512 35900 0.5582 - -
2.2575 36000 0.7266 0.8666 -
2.2637 36100 0.6806 - -
2.2700 36200 0.9246 - -
2.2763 36300 0.7452 - -
2.2826 36400 0.7886 - -
2.2888 36500 0.9288 0.8529 -
2.2951 36600 1.2166 - -
2.3014 36700 0.9566 - -
2.3076 36800 0.7842 - -
2.3139 36900 0.6815 - -
2.3202 37000 0.78 0.8212 -
2.3265 37100 0.8306 - -
2.3327 37200 0.8073 - -
2.3390 37300 0.7565 - -
2.3453 37400 0.8478 - -
2.3515 37500 1.0159 0.8735 -
2.3578 37600 0.8126 - -
2.3641 37700 0.751 - -
2.3704 37800 0.7185 - -
2.3766 37900 0.7429 - -
2.3829 38000 0.7149 0.7997 -
2.3892 38100 0.6867 - -
2.3954 38200 0.608 - -
2.4017 38300 0.5687 - -
2.4080 38400 0.6623 - -
2.4142 38500 0.7751 0.7834 -
2.4205 38600 0.6537 - -
2.4268 38700 0.7121 - -
2.4331 38800 0.7864 - -
2.4393 38900 0.296 - -
2.4456 39000 0.4544 0.8051 -
2.4519 39100 0.4543 - -
2.4581 39200 0.9965 - -
2.4644 39300 0.4595 - -
2.4707 39400 0.7557 - -
2.4770 39500 0.6006 0.8437 -
2.4832 39600 0.695 - -
2.4895 39700 0.6292 - -
2.4958 39800 0.7392 - -
2.5020 39900 0.6547 - -
2.5083 40000 0.739 0.8443 -
2.5146 40100 0.5618 - -
2.5209 40200 0.861 - -
2.5271 40300 0.7318 - -
2.5334 40400 0.9021 - -
2.5397 40500 0.7329 0.8595 -
2.5459 40600 0.9691 - -
2.5522 40700 1.0524 - -
2.5585 40800 0.4546 - -
2.5647 40900 0.8917 - -
2.5710 41000 0.6644 0.8664 -
2.5773 41100 0.5167 - -
2.5836 41200 0.6499 - -
2.5898 41300 0.8096 - -
2.5961 41400 0.7269 - -
2.6024 41500 0.8561 0.8173 -
2.6086 41600 0.761 - -
2.6149 41700 1.0167 - -
2.6212 41800 0.763 - -
2.6275 41900 0.6659 - -
2.6337 42000 0.7299 0.8343 -
2.6400 42100 0.7045 - -
2.6463 42200 0.9054 - -
2.6525 42300 0.3002 - -
2.6588 42400 0.7728 - -
2.6651 42500 0.8214 0.8112 -
2.6713 42600 0.6762 - -
2.6776 42700 0.8863 - -
2.6839 42800 0.7438 - -
2.6902 42900 0.5968 - -
2.6964 43000 0.5292 0.7920 -
2.7027 43100 0.429 - -
2.7090 43200 0.6001 - -
2.7152 43300 0.7253 - -
2.7215 43400 0.9268 - -
2.7278 43500 0.9536 0.8434 -
2.7341 43600 0.6164 - -
2.7403 43700 0.8411 - -
2.7466 43800 1.0441 - -
2.7529 43900 0.6473 - -
2.7591 44000 0.8697 0.8089 -
2.7654 44100 0.7743 - -
2.7717 44200 0.9118 - -
2.7780 44300 0.7464 - -
2.7842 44400 0.7195 - -
2.7905 44500 0.9814 0.8122 -
2.7968 44600 0.5812 - -
2.8030 44700 0.5095 - -
2.8093 44800 0.7771 - -
2.8156 44900 0.6714 - -
2.8218 45000 0.5836 0.7786 -
2.8281 45100 1.0708 - -
2.8344 45200 0.576 - -
2.8407 45300 0.9657 - -
2.8469 45400 0.8103 - -
2.8532 45500 0.4644 0.7895 -
2.8595 45600 0.7485 - -
2.8657 45700 0.9843 - -
2.8720 45800 0.8462 - -
2.8783 45900 0.9025 - -
2.8846 46000 0.7014 0.8031 -
2.8908 46100 0.5638 - -
2.8971 46200 0.6016 - -
2.9034 46300 0.7257 - -
2.9096 46400 1.1182 - -
2.9159 46500 1.0352 0.8031 -
2.9222 46600 0.8413 - -
2.9285 46700 0.7341 - -
2.9347 46800 0.7115 - -
2.9410 46900 0.9124 - -
2.9473 47000 0.7988 0.7591 -
2.9535 47100 0.8373 - -
2.9598 47200 0.8587 - -
2.9661 47300 0.4961 - -
2.9723 47400 0.7349 - -
2.9786 47500 0.5285 0.7255 -
2.9849 47600 0.3715 - -
2.9912 47700 0.811 - -
2.9974 47800 0.6716 - -
3.0037 47900 0.4408 - -
3.0100 48000 0.7449 0.7503 -
3.0162 48100 0.4491 - -
3.0225 48200 0.5995 - -
3.0288 48300 0.6073 - -
3.0351 48400 0.5753 - -
3.0413 48500 0.6204 0.7650 -
3.0476 48600 0.9864 - -
3.0539 48700 0.6648 - -
3.0601 48800 0.4662 - -
3.0664 48900 0.5638 - -
3.0727 49000 0.6692 0.7381 -
3.0789 49100 0.6403 - -
3.0852 49200 0.5042 - -
3.0915 49300 0.4447 - -
3.0978 49400 0.5983 - -
3.1040 49500 0.6961 0.7289 -
3.1103 49600 0.8092 - -
3.1166 49700 0.4172 - -
3.1228 49800 0.6542 - -
3.1291 49900 0.8016 - -
3.1354 50000 0.3927 0.7370 -
3.1417 50100 0.4724 - -
3.1479 50200 0.46 - -
3.1542 50300 0.4258 - -
3.1605 50400 0.5053 - -
3.1667 50500 0.3406 0.7210 -
3.1730 50600 0.6276 - -
3.1793 50700 0.5913 - -
3.1856 50800 0.3902 - -
3.1918 50900 0.5063 - -
3.1981 51000 0.7909 0.7442 -
3.2044 51100 0.5071 - -
3.2106 51200 0.5611 - -
3.2169 51300 0.545 - -
3.2232 51400 0.4359 - -
3.2294 51500 0.5249 0.7148 -
3.2357 51600 0.6759 - -
3.2420 51700 0.5458 - -
3.2483 51800 0.5195 - -
3.2545 51900 0.292 - -
3.2608 52000 0.4826 0.7129 -
3.2671 52100 0.2496 - -
3.2733 52200 0.6702 - -
3.2796 52300 0.3192 - -
3.2859 52400 0.66 - -
3.2922 52500 0.6472 0.7146 -
3.2984 52600 0.4482 - -
3.3047 52700 0.6618 - -
3.3110 52800 0.4424 - -
3.3172 52900 0.6157 - -
3.3235 53000 0.5087 0.7036 -
3.3298 53100 0.5148 - -
3.3361 53200 0.386 - -
3.3423 53300 0.3552 - -
3.3486 53400 0.5609 - -
3.3549 53500 0.3549 0.7148 -
3.3611 53600 0.3099 - -
3.3674 53700 0.2903 - -
3.3737 53800 0.7385 - -
3.3799 53900 0.7025 - -
3.3862 54000 0.5625 0.7014 -
3.3925 54100 0.7545 - -
3.3988 54200 0.4371 - -
3.4050 54300 0.4588 - -
3.4113 54400 0.4973 - -
3.4176 54500 0.4534 0.7010 -
3.4238 54600 0.6761 - -
3.4301 54700 0.6559 - -
3.4364 54800 0.6087 - -
3.4427 54900 0.601 - -
3.4489 55000 0.4894 0.6706 -
3.4552 55100 0.6524 - -
3.4615 55200 0.8268 - -
3.4677 55300 0.1795 - -
3.4740 55400 0.5667 - -
3.4803 55500 0.4185 0.6823 -
3.4865 55600 0.615 - -
3.4928 55700 0.6231 - -
3.4991 55800 0.3809 - -
3.5054 55900 0.6747 - -
3.5116 56000 0.6484 0.6736 -
3.5179 56100 0.6208 - -
3.5242 56200 0.2345 - -
3.5304 56300 0.4494 - -
3.5367 56400 0.327 - -
3.5430 56500 0.5614 0.6762 -
3.5493 56600 0.8796 - -
3.5555 56700 0.6068 - -
3.5618 56800 0.4918 - -
3.5681 56900 0.7352 - -
3.5743 57000 0.4149 0.6881 -
3.5806 57100 0.3746 - -
3.5869 57200 0.7055 - -
3.5932 57300 0.5557 - -
3.5994 57400 0.7734 - -
3.6057 57500 0.5263 0.6800 -
3.6120 57600 0.4527 - -
3.6182 57700 0.8339 - -
3.6245 57800 0.7004 - -
3.6308 57900 0.5068 - -
3.6370 58000 0.6601 0.6667 -
3.6433 58100 0.8452 - -
3.6496 58200 0.2345 - -
3.6559 58300 0.6034 - -
3.6621 58400 0.8962 - -
3.6684 58500 0.5844 0.6755 -
3.6747 58600 0.6827 - -
3.6809 58700 0.4087 - -
3.6872 58800 0.6221 - -
3.6935 58900 0.777 - -
3.6998 59000 0.572 0.6737 -
3.7060 59100 0.5479 - -
3.7123 59200 0.5078 - -
3.7186 59300 0.6982 - -
3.7248 59400 0.2223 - -
3.7311 59500 0.5361 0.6709 -
3.7374 59600 0.6072 - -
3.7437 59700 0.35 - -
3.7499 59800 0.8802 - -
3.7562 59900 0.6216 - -
3.7625 60000 0.2514 0.6836 -
3.7687 60100 0.6285 - -
3.7750 60200 0.9845 - -
3.7813 60300 0.5355 - -
3.7875 60400 0.495 - -
3.7938 60500 0.6905 0.6725 -
3.8001 60600 0.563 - -
3.8064 60700 0.6067 - -
3.8126 60800 0.7585 - -
3.8189 60900 0.4283 - -
3.8252 61000 0.4758 0.6600 -
3.8314 61100 0.5462 - -
3.8377 61200 0.649 - -
3.8440 61300 0.5576 - -
3.8503 61400 0.6717 - -
3.8565 61500 0.2951 0.6613 -
3.8628 61600 0.457 - -
3.8691 61700 0.473 - -
3.8753 61800 0.5181 - -
3.8816 61900 0.4581 - -
3.8879 62000 0.6875 0.6669 -
3.8941 62100 0.3821 - -
3.9004 62200 0.5039 - -
3.9067 62300 0.6809 - -
3.9130 62400 0.3591 - -
3.9192 62500 0.6695 0.6654 -
3.9255 62600 0.5352 - -
3.9318 62700 0.8635 - -
3.9380 62800 0.73 - -
3.9443 62900 0.4138 - -
3.9506 63000 0.3704 0.6620 -
3.9569 63100 0.4831 - -
3.9631 63200 0.5405 - -
3.9694 63300 0.6123 - -
3.9757 63400 0.5167 - -
3.9819 63500 0.6967 0.6613 -
3.9882 63600 0.338 - -
3.9945 63700 0.515 - -

Framework Versions

  • Python: 3.8.10
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.1
  • PyTorch: 2.4.0+cu121
  • Accelerate: 0.34.2
  • Datasets: 3.0.1
  • Tokenizers: 0.20.0
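
To approximate this environment, the listed package versions can be pinned directly (a sketch; the torch 2.4.0+cu121 build additionally needs the matching CUDA wheel index for your platform):

pip install sentence-transformers==3.1.1 transformers==4.45.1 accelerate==0.34.2 datasets==3.0.1 tokenizers==0.20.0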

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}