
SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the en-pt-br, en-es and en-pt datasets. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Datasets: en-pt-br, en-es, en-pt
  • Languages: en, multilingual, ar, bg, ca, cs, da, de, el, es, et, fa, fi, fr, gl, gu, he, hi, hr, hu, hy, id, it, ja, ka, ko, ku, lt, lv, mk, mn, mr, ms, my, nb, nl, pl, pt, ro, ru, sk, sl, sq, sr, sv, th, tr, uk, ur, vi, zh

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
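
Because the pipeline ends with a Normalize() module, every embedding is L2-normalized, so cosine similarity and dot product produce the same rankings. A minimal sanity-check sketch (the expected values follow directly from the configuration above):

from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")

print(model.max_seq_length)                      # 256
print(model.get_sentence_embedding_dimension())  # 384

# The trailing Normalize() module makes every embedding unit-length,
# so cosine similarity reduces to a plain dot product.
embedding = model.encode("A quick sanity check.")
print(np.linalg.norm(embedding))  # ~1.0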

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")
# Run inference
sentences = [
    "So what's the problem, why has this chasm opened up, and what can we do to fix it?",
    'Então qual é o problema? Por que se abriu este abismo, e o que podemos fazer para o resolver?',
    'O que o design e a construção oferecem ao ensino público',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
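
Beyond pairwise similarity, the same embeddings support cross-lingual semantic search. A minimal sketch using the util.semantic_search helper (the corpus sentences below are illustrative, not taken from the training data):

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")

# Mixed-language corpus; the query is in English
corpus = [
    "El tiempo está precioso hoy.",
    "Ele dirigiu até o estádio.",
    "The new library opens next week.",
]
query = "How is the weather today?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Top-2 nearest corpus sentences by cosine similarity
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 4))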

Evaluation

Metrics

Knowledge Distillation

  • Datasets: en-pt-br, en-es and en-pt
  • Evaluated with MSEEvaluator
| Metric       | en-pt-br | en-es | en-pt   |
|:-------------|:---------|:------|:--------|
| negative_mse | -0.0991  | -0.11 | -0.1143 |
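
The negative MSE compares the student's embeddings of the translated sentences with the teacher's embeddings of the English sentences. A minimal sketch of how such an evaluation can be run with MSEEvaluator; the teacher shown here is an assumption (this card does not name the distillation teacher), so substitute whichever model produced the label vectors:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import MSEEvaluator

student = SentenceTransformer("jvanhoof/all-MiniLM-L6-multilingual-v2-en-es-pt-pt-br")
teacher = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # assumed teacher

english = ["Thank you so much, Chris."]
non_english = ["Muito obrigado, Chris."]

mse_evaluator = MSEEvaluator(
    source_sentences=english,      # encoded by the teacher
    target_sentences=non_english,  # encoded by the student under evaluation
    teacher_model=teacher,
    name="en-pt-br",
)
print(mse_evaluator(student))  # dict with the negative MSE for this pair (compare the table above)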

Translation

| Metric           | en-pt-br | en-es  | en-pt  |
|:-----------------|:---------|:-------|:-------|
| src2trg_accuracy | 0.9859   | 0.9114 | 0.8921 |
| trg2src_accuracy | 0.9798   | 0.9046 | 0.8818 |
| mean_accuracy    | 0.9829   | 0.908  | 0.887  |
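
These accuracies measure, for each sentence pair, whether the translation is the nearest neighbour of its source by cosine similarity (and vice versa), which corresponds to sentence-transformers' TranslationEvaluator. Continuing the sketch above (reusing student, english, and non_english):

from sentence_transformers.evaluation import TranslationEvaluator

translation_evaluator = TranslationEvaluator(
    source_sentences=english,
    target_sentences=non_english,
    name="en-pt-br",
)
# Reports src2trg_accuracy, trg2src_accuracy, and mean_accuracy
print(translation_evaluator(student))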

Semantic Similarity

  • Dataset: sts17-es-en-test

| Metric          | Value  |
|:----------------|:-------|
| pearson_cosine  | 0.7504 |
| spearman_cosine | 0.7603 |

Training Details

Training Datasets

en-pt-br

  • Dataset: en-pt-br at 0c70bc6
  • Size: 405,807 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    |         | english                                            | non_english                                        | label              |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|:-------------------|
    | type    | string                                             | string                                             | list               |
    | details | min: 4 tokens, mean: 24.11 tokens, max: 256 tokens | min: 6 tokens, mean: 37.01 tokens, max: 256 tokens | size: 384 elements |

  • Samples:

    | english | non_english | label |
    |:--------|:------------|:------|
    | And then there are certain conceptual things that can also benefit from hand calculating, but I think they're relatively small in number. | E também existem alguns aspectos conceituais que também podem se beneficiar do cálculo manual, mas eu acho que eles são relativamente poucos. | [-0.0019007200608029962, 0.0689753070473671, -0.00522591220214963, 0.020715437829494476, -0.07340357452630997, ...] |
    | One thing I often ask about is ancient Greek and how this relates. | Uma coisa sobre a qual eu pergunto com frequencia é grego antigo e como ele se relaciona a isto. | [0.06295035779476166, 0.07436762005090714, 0.012160283513367176, 0.016489440575242043, -0.04803427681326866, ...] |
    | See, the thing we're doing right now is we're forcing people to learn mathematics. | Vejam, o que estamos fazendo agora, é que estamos forçando as pessoas a aprender matemática. | [0.020892487838864326, 0.04348783195018768, 0.04366326704621315, 0.006932021584361792, -0.014990451745688915, ...] |
  • Loss: MSELoss

en-es

  • Dataset: en-es
  • Size: 3,439,042 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    |         | english                                            | non_english                                        | label              |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|:-------------------|
    | type    | string                                             | string                                             | list               |
    | details | min: 4 tokens, mean: 24.16 tokens, max: 256 tokens | min: 5 tokens, mean: 35.26 tokens, max: 256 tokens | size: 384 elements |

  • Samples:

    | english | non_english | label |
    |:--------|:------------|:------|
    | And then there are certain conceptual things that can also benefit from hand calculating, but I think they're relatively small in number. | Y luego hay ciertas aspectos conceptuales que pueden beneficiarse del cálculo a mano pero creo que son relativamente pocos. | [-0.0019007298396900296, 0.06897532939910889, -0.005225935019552708, 0.020715486258268356, -0.07340355962514877, ...] |
    | One thing I often ask about is ancient Greek and how this relates. | Algo que pregunto a menudo es sobre el griego antiguo y cómo se relaciona. | [0.06295035779476166, 0.07436762005090714, 0.012160283513367176, 0.016489440575242043, -0.04803427681326866, ...] |
    | See, the thing we're doing right now is we're forcing people to learn mathematics. | Vean, lo que estamos haciendo ahora es forzar a la gente a aprender matemáticas. | [0.020892487838864326, 0.04348784685134888, 0.043663300573825836, 0.0069320122711360455, -0.014990522526204586, ...] |
  • Loss: MSELoss

en-pt

  • Dataset: en-pt
  • Size: 3,186,095 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    |         | english                                            | non_english                                        | label              |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|:-------------------|
    | type    | string                                             | string                                             | list               |
    | details | min: 4 tokens, mean: 23.63 tokens, max: 256 tokens | min: 5 tokens, mean: 35.37 tokens, max: 256 tokens | size: 384 elements |

  • Samples:

    | english | non_english | label |
    |:--------|:------------|:------|
    | And the country that does this first will, in my view, leapfrog others in achieving a new economy even, an improved economy, an improved outlook. | E o país que fizer isto primeiro vai, na minha opinião, ultrapassar outros em alcançar uma nova economia até uma economia melhorada, uma visão melhorada. | [-0.048315733671188354, 0.006750611122697592, 0.04261479899287224, -0.0639658197760582, 0.036691851913928986, ...] |
    | In fact, I even talk about us moving from what we often call now the "knowledge economy" to what we might call a "computational knowledge economy," where high-level math is integral to what everyone does in the way that knowledge currently is. | De facto, eu até falo de mudarmos do que chamamos hoje a economia do conhecimento para o que poderemos chamar a economia do conhecimento computacional, onde a matemática de alto nível está integrada no que toda a gente faz da forma que o conhecimento actualmente está. | [0.07536645978689194, 0.016234878450632095, 0.018208693712949753, 0.012537049129605293, -0.016377247869968414, ...] |
    | We can engage so many more students with this, and they can have a better time doing it. | Podemos cativar tantos mais estudantes com isto, e eles podem divertir-se mais a fazê-lo. | [0.046284060925245285, 0.034320130944252014, 0.05807732418179512, -0.059097982943058014, 0.01139863021671772, ...] |
  • Loss: MSELoss
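
All three training datasets follow the same knowledge-distillation recipe: the label column holds a precomputed 384-dimensional teacher embedding of the English sentence, and MSELoss pulls the student's embeddings of both the English and the non-English sentence towards that target. A minimal sketch under the assumption that the teacher is the English base model (the card itself only ships the precomputed label vectors):

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MSELoss

teacher = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # assumed teacher
student = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # model being finetuned

english = ["Thank you so much, Chris."]
non_english = ["Muito obrigado, Chris."]
labels = teacher.encode(english)  # regression targets: one 384-dim vector per pair

train_dataset = Dataset.from_dict({
    "english": english,
    "non_english": non_english,
    "label": [vec.tolist() for vec in labels],
})

# MSELoss pushes the student embeddings of every input column towards the label
loss = MSELoss(student)
trainer = SentenceTransformerTrainer(model=student, train_dataset=train_dataset, loss=loss)
trainer.train()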

Evaluation Datasets

en-pt-br

  • Dataset: en-pt-br at 0c70bc6
  • Size: 992 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 992 samples:

    |         | english                                            | non_english                                        | label              |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|:-------------------|
    | type    | string                                             | string                                             | list               |
    | details | min: 4 tokens, mean: 24.47 tokens, max: 191 tokens | min: 5 tokens, mean: 39.01 tokens, max: 256 tokens | size: 384 elements |

  • Samples:

    | english | non_english | label |
    |:--------|:------------|:------|
    | Thank you so much, Chris. | Muito obrigado, Chris. | [0.026920655742287636, 0.053147971630096436, 0.14048898220062256, -0.10380183160305023, -0.041187822818756104, ...] |
    | And it's truly a great honor to have the opportunity to come to this stage twice; I'm extremely grateful. | É realmente uma grande honra ter a oportunidade de estar neste palco pela segunda vez. Estou muito agradecido. | [0.024387279525399208, 0.0950012058019638, 0.12180330604314804, -0.07149265706539154, -0.018444526940584183, ...] |
    | I have been blown away by this conference, and I want to thank all of you for the many nice comments about what I had to say the other night. | Eu fui muito aplaudido por esta conferência e quero agradecer a todos pelos muitos comentários delicados sobre o que eu tinha a dizer naquela noite. | [0.015005475841462612, 0.014678296633064747, 0.1311199963092804, 0.03133270516991615, 0.06942538917064667, ...] |
  • Loss: MSELoss

en-es

  • Dataset: en-es
  • Size: 9,990 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    |         | english                                            | non_english                                        | label              |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|:-------------------|
    | type    | string                                             | string                                             | list               |
    | details | min: 4 tokens, mean: 24.52 tokens, max: 191 tokens | min: 4 tokens, mean: 36.77 tokens, max: 252 tokens | size: 384 elements |

  • Samples:

    | english | non_english | label |
    |:--------|:------------|:------|
    | Thank you so much, Chris. | Muchas gracias Chris. | [0.026920655742287636, 0.053147971630096436, 0.14048898220062256, -0.10380183160305023, -0.041187822818756104, ...] |
    | And it's truly a great honor to have the opportunity to come to this stage twice; I'm extremely grateful. | Y es en verdad un gran honor tener la oportunidad de venir a este escenario por segunda vez. Estoy extremadamente agradecido. | [0.024387288838624954, 0.09500124305486679, 0.12180333584547043, -0.07149265706539154, -0.018444539979100227, ...] |
    | I have been blown away by this conference, and I want to thank all of you for the many nice comments about what I had to say the other night. | He quedado conmovido por esta conferencia, y deseo agradecer a todos ustedes sus amables comentarios acerca de lo que tenía que decir la otra noche. | [0.015005475841462612, 0.014678296633064747, 0.1311199963092804, 0.03133270516991615, 0.06942538917064667, ...] |
  • Loss: MSELoss

en-pt

  • Dataset: en-pt
  • Size: 9,992 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:

    |         | english                                            | non_english                                        | label              |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|:-------------------|
    | type    | string                                             | string                                             | list               |
    | details | min: 4 tokens, mean: 24.01 tokens, max: 191 tokens | min: 5 tokens, mean: 37.14 tokens, max: 256 tokens | size: 384 elements |

  • Samples:

    | english | non_english | label |
    |:--------|:------------|:------|
    | Thank you so much, Chris. | Muito obrigado, Chris. | [0.02692059800028801, 0.053147926926612854, 0.14048898220062256, -0.10380185395479202, -0.041187841445207596, ...] |
    | And it's truly a great honor to have the opportunity to come to this stage twice; I'm extremely grateful. | É realmente uma grande honra ter a oportunidade de pisar este palco pela segunda vez. Estou muito agradecido. | [0.024387234821915627, 0.09500119835138321, 0.12180334329605103, -0.07149267196655273, -0.018444577232003212, ...] |
    | I have been blown away by this conference, and I want to thank all of you for the many nice comments about what I had to say the other night. | Fiquei muito impressionado com esta conferência e quero agradecer a todos os imensos comentários simpáticos sobre o que eu tinha a dizer naquela noite. | [0.015005475841462612, 0.014678296633064747, 0.1311199963092804, 0.03133270516991615, 0.06942538917064667, ...] |
  • Loss: MSELoss

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • gradient_accumulation_steps: 16
  • num_train_epochs: 6
  • warmup_ratio: 0.15
  • bf16: True
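
For reference, a minimal sketch of how these non-default values map onto SentenceTransformerTrainingArguments (the output_dir is a placeholder; all other arguments keep their defaults, as listed in the expanded section below):

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output/all-MiniLM-L6-multilingual-v2",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=16,
    num_train_epochs=6,
    warmup_ratio=0.15,
    bf16=True,
)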

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 6
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.15
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss en-pt-br loss en-es loss en-pt loss en-pt-br_negative_mse en-pt-br_mean_accuracy en-es_negative_mse en-es_mean_accuracy sts17-es-en-test_spearman_cosine en-pt_negative_mse en-pt_mean_accuracy
0.0146 100 0.0033 - - - - - - - - - -
0.0291 200 0.0031 - - - - - - - - - -
0.0437 300 0.0028 - - - - - - - - - -
0.0583 400 0.0027 - - - - - - - - - -
0.0728 500 0.0026 - - - - - - - - - -
0.0874 600 0.0025 - - - - - - - - - -
0.1019 700 0.0024 - - - - - - - - - -
0.1165 800 0.0024 - - - - - - - - - -
0.1311 900 0.0023 - - - - - - - - - -
0.1456 1000 0.0022 - - - - - - - - - -
0.1602 1100 0.0022 - - - - - - - - - -
0.1748 1200 0.0021 - - - - - - - - - -
0.1893 1300 0.0021 - - - - - - - - - -
0.2039 1400 0.0021 - - - - - - - - - -
0.2185 1500 0.002 - - - - - - - - - -
0.2330 1600 0.002 - - - - - - - - - -
0.2476 1700 0.0019 - - - - - - - - - -
0.2622 1800 0.0019 - - - - - - - - - -
0.2767 1900 0.0019 - - - - - - - - - -
0.2913 2000 0.0018 0.0017 0.0017 0.0017 -0.2294 0.7319 -0.2300 0.6270 0.2838 -0.2356 0.5901
0.3058 2100 0.0018 - - - - - - - - - -
0.3204 2200 0.0018 - - - - - - - - - -
0.3350 2300 0.0017 - - - - - - - - - -
0.3495 2400 0.0017 - - - - - - - - - -
0.3641 2500 0.0017 - - - - - - - - - -
0.3787 2600 0.0017 - - - - - - - - - -
0.3932 2700 0.0017 - - - - - - - - - -
0.4078 2800 0.0016 - - - - - - - - - -
0.4224 2900 0.0016 - - - - - - - - - -
0.4369 3000 0.0016 - - - - - - - - - -
0.4515 3100 0.0016 - - - - - - - - - -
0.4660 3200 0.0015 - - - - - - - - - -
0.4806 3300 0.0015 - - - - - - - - - -
0.4952 3400 0.0015 - - - - - - - - - -
0.5097 3500 0.0015 - - - - - - - - - -
0.5243 3600 0.0015 - - - - - - - - - -
0.5389 3700 0.0015 - - - - - - - - - -
0.5534 3800 0.0014 - - - - - - - - - -
0.5680 3900 0.0014 - - - - - - - - - -
0.5826 4000 0.0014 0.0012 0.0013 0.0013 -0.1733 0.9214 -0.1805 0.8074 0.4249 -0.1861 0.7836
0.5971 4100 0.0014 - - - - - - - - - -
0.6117 4200 0.0014 - - - - - - - - - -
0.6263 4300 0.0014 - - - - - - - - - -
0.6408 4400 0.0014 - - - - - - - - - -
0.6554 4500 0.0013 - - - - - - - - - -
0.6699 4600 0.0013 - - - - - - - - - -
0.6845 4700 0.0013 - - - - - - - - - -
0.6991 4800 0.0013 - - - - - - - - - -
0.7136 4900 0.0013 - - - - - - - - - -
0.7282 5000 0.0013 - - - - - - - - - -
0.7428 5100 0.0013 - - - - - - - - - -
0.7573 5200 0.0013 - - - - - - - - - -
0.7719 5300 0.0013 - - - - - - - - - -
0.7865 5400 0.0013 - - - - - - - - - -
0.8010 5500 0.0012 - - - - - - - - - -
0.8156 5600 0.0012 - - - - - - - - - -
0.8301 5700 0.0012 - - - - - - - - - -
0.8447 5800 0.0012 - - - - - - - - - -
0.8593 5900 0.0012 - - - - - - - - - -
0.8738 6000 0.0012 0.0010 0.0010 0.0011 -0.1443 0.9617 -0.1538 0.8627 0.5948 -0.1587 0.8420
0.8884 6100 0.0012 - - - - - - - - - -
0.9030 6200 0.0012 - - - - - - - - - -
0.9175 6300 0.0012 - - - - - - - - - -
0.9321 6400 0.0012 - - - - - - - - - -
0.9467 6500 0.0012 - - - - - - - - - -
0.9612 6600 0.0012 - - - - - - - - - -
0.9758 6700 0.0012 - - - - - - - - - -
0.9904 6800 0.0011 - - - - - - - - - -
1.0049 6900 0.0011 - - - - - - - - - -
1.0195 7000 0.0011 - - - - - - - - - -
1.0340 7100 0.0011 - - - - - - - - - -
1.0486 7200 0.0011 - - - - - - - - - -
1.0632 7300 0.0011 - - - - - - - - - -
1.0777 7400 0.0011 - - - - - - - - - -
1.0923 7500 0.0011 - - - - - - - - - -
1.1069 7600 0.0011 - - - - - - - - - -
1.1214 7700 0.0011 - - - - - - - - - -
1.1360 7800 0.0011 - - - - - - - - - -
1.1506 7900 0.0011 - - - - - - - - - -
1.1651 8000 0.0011 0.0009 0.0009 0.0010 -0.1298 0.9713 -0.1396 0.8814 0.6558 -0.1442 0.8606
1.1797 8100 0.0011 - - - - - - - - - -
1.1942 8200 0.0011 - - - - - - - - - -
1.2088 8300 0.0011 - - - - - - - - - -
1.2234 8400 0.0011 - - - - - - - - - -
1.2379 8500 0.0011 - - - - - - - - - -
1.2525 8600 0.0011 - - - - - - - - - -
1.2671 8700 0.0011 - - - - - - - - - -
1.2816 8800 0.0011 - - - - - - - - - -
1.2962 8900 0.0011 - - - - - - - - - -
1.3108 9000 0.001 - - - - - - - - - -
1.3253 9100 0.001 - - - - - - - - - -
1.3399 9200 0.001 - - - - - - - - - -
1.3545 9300 0.001 - - - - - - - - - -
1.3690 9400 0.001 - - - - - - - - - -
1.3836 9500 0.001 - - - - - - - - - -
1.3981 9600 0.001 - - - - - - - - - -
1.4127 9700 0.001 - - - - - - - - - -
1.4273 9800 0.001 - - - - - - - - - -
1.4418 9900 0.001 - - - - - - - - - -
1.4564 10000 0.001 0.0008 0.0009 0.0009 -0.1218 0.9733 -0.1318 0.8898 0.6796 -0.1361 0.8698
1.4710 10100 0.001 - - - - - - - - - -
1.4855 10200 0.001 - - - - - - - - - -
1.5001 10300 0.001 - - - - - - - - - -
1.5147 10400 0.001 - - - - - - - - - -
1.5292 10500 0.001 - - - - - - - - - -
1.5438 10600 0.001 - - - - - - - - - -
1.5583 10700 0.001 - - - - - - - - - -
1.5729 10800 0.001 - - - - - - - - - -
1.5875 10900 0.001 - - - - - - - - - -
1.6020 11000 0.001 - - - - - - - - - -
1.6166 11100 0.001 - - - - - - - - - -
1.6312 11200 0.001 - - - - - - - - - -
1.6457 11300 0.001 - - - - - - - - - -
1.6603 11400 0.001 - - - - - - - - - -
1.6749 11500 0.001 - - - - - - - - - -
1.6894 11600 0.001 - - - - - - - - - -
1.7040 11700 0.001 - - - - - - - - - -
1.7186 11800 0.001 - - - - - - - - - -
1.7331 11900 0.001 - - - - - - - - - -
1.7477 12000 0.001 0.0008 0.0008 0.0009 -0.1165 0.9763 -0.1265 0.8942 0.7026 -0.1309 0.8737
1.7622 12100 0.001 - - - - - - - - - -
1.7768 12200 0.001 - - - - - - - - - -
1.7914 12300 0.001 - - - - - - - - - -
1.8059 12400 0.001 - - - - - - - - - -
1.8205 12500 0.001 - - - - - - - - - -
1.8351 12600 0.001 - - - - - - - - - -
1.8496 12700 0.001 - - - - - - - - - -
1.8642 12800 0.001 - - - - - - - - - -
1.8788 12900 0.001 - - - - - - - - - -
1.8933 13000 0.001 - - - - - - - - - -
1.9079 13100 0.001 - - - - - - - - - -
1.9224 13200 0.001 - - - - - - - - - -
1.9370 13300 0.001 - - - - - - - - - -
1.9516 13400 0.001 - - - - - - - - - -
1.9661 13500 0.001 - - - - - - - - - -
1.9807 13600 0.001 - - - - - - - - - -
1.9953 13700 0.0009 - - - - - - - - - -
2.0098 13800 0.0009 - - - - - - - - - -
2.0244 13900 0.001 - - - - - - - - - -
2.0390 14000 0.0009 0.0007 0.0008 0.0008 -0.1125 0.9788 -0.1230 0.8989 0.7158 -0.1273 0.8771
2.0535 14100 0.0009 - - - - - - - - - -
2.0681 14200 0.0009 - - - - - - - - - -
2.0827 14300 0.0009 - - - - - - - - - -
2.0972 14400 0.0009 - - - - - - - - - -
2.1118 14500 0.0009 - - - - - - - - - -
2.1263 14600 0.0009 - - - - - - - - - -
2.1409 14700 0.0009 - - - - - - - - - -
2.1555 14800 0.0009 - - - - - - - - - -
2.1700 14900 0.0009 - - - - - - - - - -
2.1846 15000 0.0009 - - - - - - - - - -
2.1992 15100 0.0009 - - - - - - - - - -
2.2137 15200 0.0009 - - - - - - - - - -
2.2283 15300 0.0009 - - - - - - - - - -
2.2429 15400 0.0009 - - - - - - - - - -
2.2574 15500 0.0009 - - - - - - - - - -
2.2720 15600 0.0009 - - - - - - - - - -
2.2865 15700 0.0009 - - - - - - - - - -
2.3011 15800 0.0009 - - - - - - - - - -
2.3157 15900 0.0009 - - - - - - - - - -
2.3302 16000 0.0009 0.0007 0.0008 0.0008 -0.1100 0.9793 -0.1203 0.9016 0.7288 -0.1246 0.8797
2.3448 16100 0.0009 - - - - - - - - - -
2.3594 16200 0.0009 - - - - - - - - - -
2.3739 16300 0.0009 - - - - - - - - - -
2.3885 16400 0.0009 - - - - - - - - - -
2.4031 16500 0.0009 - - - - - - - - - -
2.4176 16600 0.0009 - - - - - - - - - -
2.4322 16700 0.0009 - - - - - - - - - -
2.4468 16800 0.0009 - - - - - - - - - -
2.4613 16900 0.0009 - - - - - - - - - -
2.4759 17000 0.0009 - - - - - - - - - -
2.4904 17100 0.0009 - - - - - - - - - -
2.5050 17200 0.0009 - - - - - - - - - -
2.5196 17300 0.0009 - - - - - - - - - -
2.5341 17400 0.0009 - - - - - - - - - -
2.5487 17500 0.0009 - - - - - - - - - -
2.5633 17600 0.0009 - - - - - - - - - -
2.5778 17700 0.0009 - - - - - - - - - -
2.5924 17800 0.0009 - - - - - - - - - -
2.6070 17900 0.0009 - - - - - - - - - -
2.6215 18000 0.0009 0.0007 0.0008 0.0008 -0.1077 0.9798 -0.1182 0.9034 0.7356 -0.1224 0.8820
2.6361 18100 0.0009 - - - - - - - - - -
2.6506 18200 0.0009 - - - - - - - - - -
2.6652 18300 0.0009 - - - - - - - - - -
2.6798 18400 0.0009 - - - - - - - - - -
2.6943 18500 0.0009 - - - - - - - - - -
2.7089 18600 0.0009 - - - - - - - - - -
2.7235 18700 0.0009 - - - - - - - - - -
2.7380 18800 0.0009 - - - - - - - - - -
2.7526 18900 0.0009 - - - - - - - - - -
2.7672 19000 0.0009 - - - - - - - - - -
2.7817 19100 0.0009 - - - - - - - - - -
2.7963 19200 0.0009 - - - - - - - - - -
2.8109 19300 0.0009 - - - - - - - - - -
2.8254 19400 0.0009 - - - - - - - - - -
2.8400 19500 0.0009 - - - - - - - - - -
2.8545 19600 0.0009 - - - - - - - - - -
2.8691 19700 0.0009 - - - - - - - - - -
2.8837 19800 0.0009 - - - - - - - - - -
2.8982 19900 0.0009 - - - - - - - - - -
2.9128 20000 0.0009 0.0007 0.0008 0.0008 -0.1058 0.9808 -0.1165 0.9047 0.7414 -0.1208 0.8830
2.9274 20100 0.0009 - - - - - - - - - -
2.9419 20200 0.0009 - - - - - - - - - -
2.9565 20300 0.0009 - - - - - - - - - -
2.9711 20400 0.0009 - - - - - - - - - -
2.9856 20500 0.0009 - - - - - - - - - -
3.0002 20600 0.0009 - - - - - - - - - -
3.0147 20700 0.0009 - - - - - - - - - -
3.0293 20800 0.0009 - - - - - - - - - -
3.0439 20900 0.0009 - - - - - - - - - -
3.0584 21000 0.0009 - - - - - - - - - -
3.0730 21100 0.0009 - - - - - - - - - -
3.0876 21200 0.0009 - - - - - - - - - -
3.1021 21300 0.0009 - - - - - - - - - -
3.1167 21400 0.0009 - - - - - - - - - -
3.1313 21500 0.0009 - - - - - - - - - -
3.1458 21600 0.0009 - - - - - - - - - -
3.1604 21700 0.0009 - - - - - - - - - -
3.1749 21800 0.0009 - - - - - - - - - -
3.1895 21900 0.0009 - - - - - - - - - -
3.2041 22000 0.0009 0.0007 0.0008 0.0008 -0.1045 0.9803 -0.1152 0.9054 0.7456 -0.1194 0.8843
3.2186 22100 0.0009 - - - - - - - - - -
3.2332 22200 0.0009 - - - - - - - - - -
3.2478 22300 0.0009 - - - - - - - - - -
3.2623 22400 0.0009 - - - - - - - - - -
3.2769 22500 0.0009 - - - - - - - - - -
3.2915 22600 0.0009 - - - - - - - - - -
3.3060 22700 0.0009 - - - - - - - - - -
3.3206 22800 0.0009 - - - - - - - - - -
3.3352 22900 0.0009 - - - - - - - - - -
3.3497 23000 0.0009 - - - - - - - - - -
3.3643 23100 0.0009 - - - - - - - - - -
3.3788 23200 0.0009 - - - - - - - - - -
3.3934 23300 0.0009 - - - - - - - - - -
3.4080 23400 0.0009 - - - - - - - - - -
3.4225 23500 0.0009 - - - - - - - - - -
3.4371 23600 0.0009 - - - - - - - - - -
3.4517 23700 0.0009 - - - - - - - - - -
3.4662 23800 0.0009 - - - - - - - - - -
3.4808 23900 0.0009 - - - - - - - - - -
3.4954 24000 0.0009 0.0007 0.0007 0.0008 -0.1031 0.9819 -0.1141 0.9057 0.7522 -0.1183 0.8850
3.5099 24100 0.0009 - - - - - - - - - -
3.5245 24200 0.0009 - - - - - - - - - -
3.5390 24300 0.0009 - - - - - - - - - -
3.5536 24400 0.0009 - - - - - - - - - -
3.5682 24500 0.0009 - - - - - - - - - -
3.5827 24600 0.0009 - - - - - - - - - -
3.5973 24700 0.0009 - - - - - - - - - -
3.6119 24800 0.0009 - - - - - - - - - -
3.6264 24900 0.0009 - - - - - - - - - -
3.6410 25000 0.0009 - - - - - - - - - -
3.6556 25100 0.0009 - - - - - - - - - -
3.6701 25200 0.0009 - - - - - - - - - -
3.6847 25300 0.0009 - - - - - - - - - -
3.6993 25400 0.0009 - - - - - - - - - -
3.7138 25500 0.0009 - - - - - - - - - -
3.7284 25600 0.0009 - - - - - - - - - -
3.7429 25700 0.0009 - - - - - - - - - -
3.7575 25800 0.0009 - - - - - - - - - -
3.7721 25900 0.0009 - - - - - - - - - -
3.7866 26000 0.0009 0.0007 0.0007 0.0008 -0.1023 0.9824 -0.1130 0.9070 0.7516 -0.1173 0.8856
3.8012 26100 0.0009 - - - - - - - - - -
3.8158 26200 0.0009 - - - - - - - - - -
3.8303 26300 0.0009 - - - - - - - - - -
3.8449 26400 0.0009 - - - - - - - - - -
3.8595 26500 0.0009 - - - - - - - - - -
3.8740 26600 0.0009 - - - - - - - - - -
3.8886 26700 0.0009 - - - - - - - - - -
3.9031 26800 0.0009 - - - - - - - - - -
3.9177 26900 0.0009 - - - - - - - - - -
3.9323 27000 0.0009 - - - - - - - - - -
3.9468 27100 0.0009 - - - - - - - - - -
3.9614 27200 0.0009 - - - - - - - - - -
3.9760 27300 0.0009 - - - - - - - - - -
3.9905 27400 0.0009 - - - - - - - - - -
4.0051 27500 0.0009 - - - - - - - - - -
4.0197 27600 0.0009 - - - - - - - - - -
4.0342 27700 0.0009 - - - - - - - - - -
4.0488 27800 0.0009 - - - - - - - - - -
4.0634 27900 0.0009 - - - - - - - - - -
4.0779 28000 0.0009 0.0007 0.0007 0.0008 -0.1015 0.9834 -0.1123 0.9072 0.7539 -0.1165 0.8861
4.0925 28100 0.0009 - - - - - - - - - -
4.1070 28200 0.0009 - - - - - - - - - -
4.1216 28300 0.0009 - - - - - - - - - -
4.1362 28400 0.0009 - - - - - - - - - -
4.1507 28500 0.0009 - - - - - - - - - -
4.1653 28600 0.0009 - - - - - - - - - -
4.1799 28700 0.0009 - - - - - - - - - -
4.1944 28800 0.0009 - - - - - - - - - -
4.2090 28900 0.0009 - - - - - - - - - -
4.2236 29000 0.0009 - - - - - - - - - -
4.2381 29100 0.0009 - - - - - - - - - -
4.2527 29200 0.0009 - - - - - - - - - -
4.2672 29300 0.0009 - - - - - - - - - -
4.2818 29400 0.0009 - - - - - - - - - -
4.2964 29500 0.0009 - - - - - - - - - -
4.3109 29600 0.0009 - - - - - - - - - -
4.3255 29700 0.0009 - - - - - - - - - -
4.3401 29800 0.0009 - - - - - - - - - -
4.3546 29900 0.0009 - - - - - - - - - -
4.3692 30000 0.0009 0.0007 0.0007 0.0007 -0.1007 0.9829 -0.1117 0.9073 0.7578 -0.1159 0.8862
4.3838 30100 0.0009 - - - - - - - - - -
4.3983 30200 0.0009 - - - - - - - - - -
4.4129 30300 0.0009 - - - - - - - - - -
4.4275 30400 0.0009 - - - - - - - - - -
4.4420 30500 0.0009 - - - - - - - - - -
4.4566 30600 0.0009 - - - - - - - - - -
4.4711 30700 0.0009 - - - - - - - - - -
4.4857 30800 0.0009 - - - - - - - - - -
4.5003 30900 0.0009 - - - - - - - - - -
4.5148 31000 0.0009 - - - - - - - - - -
4.5294 31100 0.0009 - - - - - - - - - -
4.5440 31200 0.0009 - - - - - - - - - -
4.5585 31300 0.0009 - - - - - - - - - -
4.5731 31400 0.0009 - - - - - - - - - -
4.5877 31500 0.0009 - - - - - - - - - -
4.6022 31600 0.0008 - - - - - - - - - -
4.6168 31700 0.0009 - - - - - - - - - -
4.6313 31800 0.0008 - - - - - - - - - -
4.6459 31900 0.0008 - - - - - - - - - -
4.6605 32000 0.0008 0.0007 0.0007 0.0007 -0.1003 0.9834 -0.1112 0.9076 0.7577 -0.1154 0.8868
4.6750 32100 0.0009 - - - - - - - - - -
4.6896 32200 0.0009 - - - - - - - - - -
4.7042 32300 0.0008 - - - - - - - - - -
4.7187 32400 0.0009 - - - - - - - - - -
4.7333 32500 0.0008 - - - - - - - - - -
4.7479 32600 0.0008 - - - - - - - - - -
4.7624 32700 0.0008 - - - - - - - - - -
4.7770 32800 0.0008 - - - - - - - - - -
4.7916 32900 0.0008 - - - - - - - - - -
4.8061 33000 0.0008 - - - - - - - - - -
4.8207 33100 0.0008 - - - - - - - - - -
4.8352 33200 0.0008 - - - - - - - - - -
4.8498 33300 0.0008 - - - - - - - - - -
4.8644 33400 0.0008 - - - - - - - - - -
4.8789 33500 0.0008 - - - - - - - - - -
4.8935 33600 0.0008 - - - - - - - - - -
4.9081 33700 0.0008 - - - - - - - - - -
4.9226 33800 0.0008 - - - - - - - - - -
4.9372 33900 0.0008 - - - - - - - - - -
4.9518 34000 0.0008 0.0007 0.0007 0.0007 -0.0999 0.9834 -0.1107 0.9078 0.7570 -0.1150 0.8871
4.9663 34100 0.0008 - - - - - - - - - -
4.9809 34200 0.0008 - - - - - - - - - -
4.9954 34300 0.0008 - - - - - - - - - -
5.0100 34400 0.0008 - - - - - - - - - -
5.0246 34500 0.0008 - - - - - - - - - -
5.0391 34600 0.0008 - - - - - - - - - -
5.0537 34700 0.0008 - - - - - - - - - -
5.0683 34800 0.0008 - - - - - - - - - -
5.0828 34900 0.0008 - - - - - - - - - -
5.0974 35000 0.0008 - - - - - - - - - -
5.1120 35100 0.0008 - - - - - - - - - -
5.1265 35200 0.0008 - - - - - - - - - -
5.1411 35300 0.0008 - - - - - - - - - -
5.1557 35400 0.0008 - - - - - - - - - -
5.1702 35500 0.0008 - - - - - - - - - -
5.1848 35600 0.0008 - - - - - - - - - -
5.1993 35700 0.0008 - - - - - - - - - -
5.2139 35800 0.0008 - - - - - - - - - -
5.2285 35900 0.0008 - - - - - - - - - -
5.2430 36000 0.0008 0.0007 0.0007 0.0007 -0.0995 0.9824 -0.1104 0.9085 0.7588 -0.1147 0.8865
5.2576 36100 0.0008 - - - - - - - - - -
5.2722 36200 0.0008 - - - - - - - - - -
5.2867 36300 0.0008 - - - - - - - - - -
5.3013 36400 0.0008 - - - - - - - - - -
5.3159 36500 0.0008 - - - - - - - - - -
5.3304 36600 0.0008 - - - - - - - - - -
5.3450 36700 0.0008 - - - - - - - - - -
5.3595 36800 0.0008 - - - - - - - - - -
5.3741 36900 0.0008 - - - - - - - - - -
5.3887 37000 0.0008 - - - - - - - - - -
5.4032 37100 0.0008 - - - - - - - - - -
5.4178 37200 0.0008 - - - - - - - - - -
5.4324 37300 0.0008 - - - - - - - - - -
5.4469 37400 0.0008 - - - - - - - - - -
5.4615 37500 0.0008 - - - - - - - - - -
5.4761 37600 0.0008 - - - - - - - - - -
5.4906 37700 0.0008 - - - - - - - - - -
5.5052 37800 0.0008 - - - - - - - - - -
5.5198 37900 0.0008 - - - - - - - - - -
5.5343 38000 0.0008 0.0007 0.0007 0.0007 -0.0993 0.9829 -0.1102 0.9085 0.7598 -0.1144 0.8871
5.5489 38100 0.0008 - - - - - - - - - -
5.5634 38200 0.0008 - - - - - - - - - -
5.5780 38300 0.0008 - - - - - - - - - -
5.5926 38400 0.0008 - - - - - - - - - -
5.6071 38500 0.0008 - - - - - - - - - -
5.6217 38600 0.0008 - - - - - - - - - -
5.6363 38700 0.0008 - - - - - - - - - -
5.6508 38800 0.0008 - - - - - - - - - -
5.6654 38900 0.0008 - - - - - - - - - -
5.6800 39000 0.0008 - - - - - - - - - -
5.6945 39100 0.0008 - - - - - - - - - -
5.7091 39200 0.0008 - - - - - - - - - -
5.7236 39300 0.0008 - - - - - - - - - -
5.7382 39400 0.0008 - - - - - - - - - -
5.7528 39500 0.0008 - - - - - - - - - -
5.7673 39600 0.0008 - - - - - - - - - -
5.7819 39700 0.0008 - - - - - - - - - -
5.7965 39800 0.0008 - - - - - - - - - -
5.8110 39900 0.0008 - - - - - - - - - -
5.8256 40000 0.0008 0.0007 0.0007 0.0007 -0.0991 0.9829 -0.1100 0.9080 0.7603 -0.1143 0.8870
5.8402 40100 0.0008 - - - - - - - - - -
5.8547 40200 0.0008 - - - - - - - - - -
5.8693 40300 0.0008 - - - - - - - - - -
5.8839 40400 0.0008 - - - - - - - - - -
5.8984 40500 0.0008 - - - - - - - - - -
5.9130 40600 0.0008 - - - - - - - - - -
5.9275 40700 0.0008 - - - - - - - - - -
5.9421 40800 0.0008 - - - - - - - - - -
5.9567 40900 0.0008 - - - - - - - - - -
5.9712 41000 0.0008 - - - - - - - - - -
5.9858 41100 0.0008 - - - - - - - - - -

Framework Versions

  • Python: 3.9.20
  • Sentence Transformers: 3.3.0
  • Transformers: 4.46.2
  • PyTorch: 2.5.1+cu121
  • Accelerate: 1.1.1
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MSELoss

@inproceedings{reimers-2020-multilingual-sentence-bert,
    title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2004.09813",
}