Technical Report and Model Pipeline
To access our technical report and model pipeline scripts, visit our GitHub repository.
SentenceTransformer based on Snowflake/snowflake-arctic-embed-m
This is a sentence-transformers model finetuned from Snowflake/snowflake-arctic-embed-m. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: Snowflake/snowflake-arctic-embed-m
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
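The Pooling module uses the CLS token and is followed by Normalize(), so the output embeddings are unit-length and dot product coincides with cosine similarity. A quick sanity check (a sketch only; it assumes the library is installed as shown in the Usage section below):

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# The trailing Normalize() module should make every embedding unit-length.
model = SentenceTransformer("khoj-ai/timely-arctic-medium")
emb = model.encode(["a quick normalization check"])
print(np.linalg.norm(emb, axis=1))  # expected: approximately [1.0]
```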
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("khoj-ai/timely-arctic-medium")
# Run inference
sentences = [
'Represent this sentence for searching relevant passages: Witki, Warmian-Masurian Voivodeship 2040 Oct 12',
'Witki () is a village in the administrative district of Gmina Bartoszyce, within Bartoszyce County, Warmian-Masurian Voivodeship, in northern Poland, close to the border with the Kaliningrad Oblast of Russia. It lies approximately east of Bartoszyce and north-east of the regional capital Olsztyn. References Witki 12/10/2040\n',
'12-21-2046 This is a list of electoral results for the Electoral district of Irwin in Western Australian state elections. Members for Irwin Election results Elections in the 1940s Preferences were not distributed. Preferences were not distributed. Elections in the 1930s Preferences were not distributed. Elections in the 1920s Elections in the 1910s Elections in the 1900s Elections in the 1890s References Western Australian state electoral results by district\n',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
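The first sentence in the example above, like every anchor in the training data, carries the query prefix "Represent this sentence for searching relevant passages: ", while passages are encoded without a prefix. Below is a minimal retrieval sketch following that pattern; the query and passages are illustrative only:

```python
from sentence_transformers import SentenceTransformer

# Minimal retrieval sketch: prefix the query, leave the passages bare,
# then rank passages by cosine similarity.
QUERY_PREFIX = "Represent this sentence for searching relevant passages: "

model = SentenceTransformer("khoj-ai/timely-arctic-medium")

query = "are bugs attracted to citronella 11/10/09?"
passages = [
    "Citronella is naturally occurring oil that repels insects. 11/10/09",
    "This is a list of electoral results for the Electoral district of Irwin.",
]

query_embedding = model.encode([QUERY_PREFIX + query])
passage_embeddings = model.encode(passages)
scores = model.similarity(query_embedding, passage_embeddings)  # shape [1, 2]

best = int(scores.argmax(dim=1)[0])
print(passages[best])
```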
Training Details
Training Dataset
Unnamed Dataset
- Size: 55,744 training samples
- Columns: anchors and positive
- Approximate statistics based on the first 1000 samples:

| | anchors | positive |
|---|---|---|
| type | string | string |
| details | min: 14 tokens, mean: 20.33 tokens, max: 33 tokens | min: 17 tokens, mean: 48.58 tokens, max: 75 tokens |

- Samples:

| anchors | positive |
|---|---|
| Represent this sentence for searching relevant passages: are bugs attracted to citronella November 10? | Citronella is naturally occurring oil that repels insects. ... “Citronella oil is repellent to mosquitoes to a degree, but the amount being put out by a candle isn't going to be very effective,” Eric Hoffer, president of Hoffer Pest, told TODAY Home. 11/10 |
| Represent this sentence for searching relevant passages: are bugs attracted to citronella 11/10/09? | Citronella is naturally occurring oil that repels insects. ... “Citronella oil is repellent to mosquitoes to a degree, but the amount being put out by a candle isn't going to be very effective,” Eric Hoffer, president of Hoffer Pest, told TODAY Home. 11/10/09 |
| Represent this sentence for searching relevant passages: are bugs attracted to citronella Jan 15? | Citronella is naturally occurring oil that repels insects. ... “Citronella oil is repellent to mosquitoes to a degree, but the amount being put out by a candle isn't going to be very effective,” Eric Hoffer, president of Hoffer Pest, told TODAY Home. 01/15 |

- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
Evaluation Dataset
Unnamed Dataset
- Size: 1,000 evaluation samples
- Columns: anchors and positive
- Approximate statistics based on the first 1000 samples:

| | anchors | positive |
|---|---|---|
| type | string | string |
| details | min: 12 tokens, mean: 21.57 tokens, max: 43 tokens | min: 8 tokens, mean: 66.44 tokens, max: 512 tokens |

- Samples:

| anchors | positive |
|---|---|
| Represent this sentence for searching relevant passages: Identify a Psychopath 3/28 | Psychopathy is a personality construct consisting of a cluster of characteristics used by mental health professionals to describe someone who is charming, manipulative, emotionally ruthless and potentially criminal. 03/28 |
| Represent this sentence for searching relevant passages: what is dangerous high blood pressure in pregnancy? | A blood pressure that is greater than 130/90 mm Hg or that is 15 degrees higher on the top number from where you started before pregnancy may be cause for concern. High blood pressure during pregnancy is defined as 140 mm Hg or higher systolic, with diastolic 90 mm Hg or higher. |
| Represent this sentence for searching relevant passages: Be a Better Cheerleader June 22 | What do you think when you think of a good cheerleader? Tight with motions? Can hold a stunt? Well, it's not just that. You need to be fit in 3 categories: mental/emotional health, social health, and physical health. 06/22 |

- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
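MultipleNegativesRankingLoss scores each anchor against its own positive and treats every other positive in the same batch as a negative, which is why the no_duplicates batch sampler listed below matters: a duplicated positive would otherwise be counted as a false negative. As a rough sketch, the loss configuration above corresponds to the following construction (the base model is loaded here only to keep the snippet self-contained):

```python
from sentence_transformers import SentenceTransformer, losses, util

# Sketch of the loss configuration listed in this card.
model = SentenceTransformer("Snowflake/snowflake-arctic-embed-m")
loss = losses.MultipleNegativesRankingLoss(
    model,
    scale=20.0,                   # "scale": 20.0
    similarity_fct=util.cos_sim,  # "similarity_fct": "cos_sim"
)
```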
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 64
- learning_rate: 1.5e-05
- weight_decay: 0.01
- num_train_epochs: 1
- warmup_ratio: 0.1
- warmup_steps: 400
- bf16: True
- batch_sampler: no_duplicates
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 64
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 1.5e-05
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 400
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters: 
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- eval_use_gather_object: False
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
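The non-default values above correspond to arguments of SentenceTransformerTrainingArguments from sentence-transformers 3.x. The sketch below shows one plausible way to wire them into a trainer; the in-memory pair datasets and the output path are placeholders (the 55,744-pair training set is not published in this card), and bf16=True assumes supporting hardware:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)
from sentence_transformers.training_args import BatchSamplers

# Placeholder (anchors, positive) pairs standing in for the unnamed dataset.
pairs = {
    "anchors": ["Represent this sentence for searching relevant passages: are bugs attracted to citronella 11/10/09?"],
    "positive": ["Citronella is naturally occurring oil that repels insects. 11/10/09"],
}
train_dataset = Dataset.from_dict(pairs)
eval_dataset = Dataset.from_dict(pairs)

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-m")
loss = losses.MultipleNegativesRankingLoss(model)  # scale=20.0, cos_sim by default

args = SentenceTransformerTrainingArguments(
    output_dir="timely-arctic-medium",          # placeholder output path
    num_train_epochs=1,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=64,
    learning_rate=1.5e-5,
    weight_decay=0.01,
    warmup_ratio=0.1,
    warmup_steps=400,
    bf16=True,                                   # assumes bf16-capable hardware
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,   # keep duplicate positives out of a batch
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```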
Training Logs
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.0023 | 1 | 2.4713 | - |
0.0229 | 10 | 2.4907 | - |
0.0459 | 20 | 2.4574 | - |
0.0688 | 30 | 2.4861 | - |
0.0917 | 40 | 2.4612 | - |
0.1147 | 50 | 2.4353 | - |
0.1376 | 60 | 2.3967 | - |
0.1606 | 70 | 2.3609 | - |
0.1835 | 80 | 2.3079 | - |
0.2064 | 90 | 2.1928 | - |
0.2294 | 100 | 2.1581 | - |
0.2523 | 110 | 2.0822 | - |
0.2752 | 120 | 1.9739 | - |
0.2982 | 130 | 1.8393 | - |
0.3211 | 140 | 1.7397 | - |
0.3440 | 150 | 1.5249 | - |
0.3670 | 160 | 1.4281 | - |
0.3899 | 170 | 1.3197 | - |
0.4128 | 180 | 1.211 | - |
0.4358 | 190 | 1.1086 | - |
0.4587 | 200 | 0.9598 | 0.2301 |
0.4817 | 210 | 1.0904 | - |
0.5046 | 220 | 0.9813 | - |
0.5275 | 230 | 1.1148 | - |
0.5505 | 240 | 1.2813 | - |
0.5734 | 250 | 1.2259 | - |
0.5963 | 260 | 1.221 | - |
0.6193 | 270 | 1.1547 | - |
0.6422 | 280 | 1.1286 | - |
0.6651 | 290 | 0.9932 | - |
0.6881 | 300 | 0.978 | - |
0.7110 | 310 | 0.9505 | - |
0.7339 | 320 | 0.8731 | - |
0.7569 | 330 | 0.824 | - |
0.7798 | 340 | 0.8979 | - |
0.8028 | 350 | 1.756 | - |
0.8257 | 360 | 1.6785 | - |
0.8486 | 370 | 1.5944 | - |
0.8716 | 380 | 1.5417 | - |
0.8945 | 390 | 1.4788 | - |
0.9174 | 400 | 0.9873 | 0.0695 |
0.9404 | 410 | 0.1664 | - |
0.9633 | 420 | 0.1336 | - |
0.9862 | 430 | 0.1193 | - |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.43.4
- PyTorch: 2.4.0+cu121
- Accelerate: 0.33.0
- Datasets: 2.20.0
- Tokenizers: 0.19.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
Model tree for khoj-ai/timely-arctic-medium
- Base model: Snowflake/snowflake-arctic-embed-m