---
base_model: BAAI/bge-base-en-v1.5
datasets: []
language: []
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:160
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Priya Softweb emphasizes the importance of maintaining a clean
and organized workspace. The company's HR policies clearly state that employees
are responsible for keeping their assigned workspaces clean, orderly, and free
from unnecessary items. Spitting tobacco, gum, or other substances in the washrooms
is strictly prohibited. The company believes that a clean and organized work environment
contributes to a more efficient and professional work experience for everyone.
This emphasis on cleanliness reflects the company's commitment to creating a pleasant
and hygienic workspace for its employees.
sentences:
- What is Priya Softweb's policy on the use of mobile phones during work hours?
- What steps does Priya Softweb take to ensure that the workspace is clean and organized?
- What are the repercussions for employees who violate the Non-Disclosure Agreement
at Priya Softweb?
- source_sentence: Priya Softweb provides allocated basement parking facilities for
employees to park their two-wheelers and four-wheelers. However, parking on the
ground floor, around the lawn or main premises, is strictly prohibited as this
space is reserved for Directors. Employees should use the parking under wings
5 and 6, while other parking spaces are allocated to different wings. Parking
two-wheelers in the car parking zone is not permitted, even if space is available.
Two-wheelers should be parked in the designated basement space on the main stand,
not on the side stand. Employees are encouraged to park in common spaces on a
first-come, first-served basis. The company clarifies that it is not responsible
for providing parking and that employees park their vehicles at their own risk.
This comprehensive parking policy ensures organized parking arrangements and clarifies
the company's liability regarding vehicle safety.
sentences:
- What is the application process for planned leaves at Priya Softweb?
- What are the parking arrangements at Priya Softweb?
- What is the process for reporting a security breach at Priya Softweb?
- source_sentence: The Diwali bonus at Priya Softweb is a discretionary benefit linked
to the company's business performance. Distributed during the festive season of
Diwali, it serves as a gesture of appreciation for employees' contributions throughout
the year. However, it's important to note that employees currently under the notice
period are not eligible for this bonus. This distinction highlights that the bonus
is intended to reward ongoing commitment and contribution to the company's success.
sentences:
- What steps does Priya Softweb take to promote responsible use of company resources?
- How does Priya Softweb demonstrate its commitment to Diversity, Equity, and Inclusion
(DEI)?
- What is the significance of the company's Diwali bonus at Priya Softweb?
- source_sentence: Priya Softweb's HR Manual paints a picture of a company that values
its employees while upholding a strong sense of professionalism and ethical conduct.
The company emphasizes a structured and transparent approach to its HR processes,
ensuring clarity and fairness in areas like recruitment, performance appraisals,
compensation, leave management, work-from-home arrangements, and incident reporting.
The manual highlights the importance of compliance with company policies, promotes
diversity and inclusion, and encourages a culture of continuous learning and development.
Overall, the message conveyed is one of creating a supportive, respectful, and
growth-oriented work environment for all employees.
sentences:
- What is the overall message conveyed by Priya Softweb's HR Manual?
- What is the process for reporting employee misconduct at Priya Softweb?
- What is Priya Softweb's policy on salary disbursement and payslips?
- source_sentence: No, work-from-home arrangements do not affect an employee's employment
terms, compensation, and benefits at Priya Softweb. This clarifies that work-from-home
is a flexible work arrangement and does not impact the employee's overall employment
status or benefits.
sentences:
- Do work-from-home arrangements affect compensation and benefits at Priya Softweb?
- What is the objective of the Work From Home Policy at Priya Softweb?
- What is the procedure for a new employee joining Priya Softweb?
model-index:
- name: SentenceTransformer based on BAAI/bge-base-en-v1.5
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.6111111111111112
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7777777777777778
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7777777777777778
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8333333333333334
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6111111111111112
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.25925925925925924
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.15555555555555559
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08333333333333334
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6111111111111112
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.7777777777777778
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7777777777777778
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8333333333333334
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7192441461309548
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.6828703703703703
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6895641882483987
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.5555555555555556
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7777777777777778
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7777777777777778
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8333333333333334
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.5555555555555556
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.25925925925925924
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.15555555555555559
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08333333333333334
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.5555555555555556
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.7777777777777778
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7777777777777778
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8333333333333334
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6972735740811556
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.6537037037037037
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6594551282051282
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.4444444444444444
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6666666666666666
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7777777777777778
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8888888888888888
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4444444444444444
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2222222222222222
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.15555555555555559
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.0888888888888889
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.4444444444444444
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6666666666666666
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7777777777777778
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8888888888888888
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6562432565194594
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5836419753086418
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5862843837990037
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.4444444444444444
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6666666666666666
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7222222222222222
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7777777777777778
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4444444444444444
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2222222222222222
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1444444444444445
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07777777777777779
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.4444444444444444
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6666666666666666
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7222222222222222
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7777777777777778
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6173875222934583
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5653439153439153
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5728811234914597
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.3888888888888889
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6111111111111112
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6666666666666666
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7777777777777778
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3888888888888889
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2037037037037037
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.13333333333333336
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07777777777777779
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.3888888888888889
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6111111111111112
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6666666666666666
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7777777777777778
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5654500657830313
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.49922839506172845
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5078970140244651
name: Cosine Map@100
---
# SentenceTransformer based on BAAI/bge-base-en-v1.5
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
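For readers who prefer plain `transformers`, the stack above amounts to a BERT encoder followed by CLS-token pooling and L2 normalization. A minimal sketch of that equivalent computation (the example sentence is illustrative only):
```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("kr-manish/bge-base-financial-matryoshka")
model = AutoModel.from_pretrained("kr-manish/bge-base-financial-matryoshka")

inputs = tokenizer(
    ["What are the parking arrangements at Priya Softweb?"],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = model(**inputs).last_hidden_state  # (batch, seq_len, 768)

cls_embeddings = token_embeddings[:, 0]                # CLS-token pooling (module 1)
embeddings = F.normalize(cls_embeddings, p=2, dim=1)   # Normalize() (module 2)
print(embeddings.shape)  # torch.Size([1, 768])
```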
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("kr-manish/bge-base-financial-matryoshka")
# Run inference
sentences = [
"No, work-from-home arrangements do not affect an employee's employment terms, compensation, and benefits at Priya Softweb. This clarifies that work-from-home is a flexible work arrangement and does not impact the employee's overall employment status or benefits.",
'Do work-from-home arrangements affect compensation and benefits at Priya Softweb?',
'What is the objective of the Work From Home Policy at Priya Softweb?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
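Because the model was trained with MatryoshkaLoss, its embeddings can also be truncated to the smaller evaluated dimensions (512, 256, 128, or 64) with only a moderate drop in retrieval quality. A minimal sketch, assuming the `truncate_dim` argument available in recent Sentence Transformers releases:
```python
from sentence_transformers import SentenceTransformer

# Load the model with embeddings truncated to a smaller Matryoshka dimension.
# `truncate_dim` is assumed to be available (Sentence Transformers >= 2.7).
model = SentenceTransformer("kr-manish/bge-base-financial-matryoshka", truncate_dim=256)

embeddings = model.encode([
    "Do work-from-home arrangements affect compensation and benefits at Priya Softweb?",
    "What are the parking arrangements at Priya Softweb?",
])
print(embeddings.shape)
# (2, 256)
```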
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_768`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.6111 |
| cosine_accuracy@3 | 0.7778 |
| cosine_accuracy@5 | 0.7778 |
| cosine_accuracy@10 | 0.8333 |
| cosine_precision@1 | 0.6111 |
| cosine_precision@3 | 0.2593 |
| cosine_precision@5 | 0.1556 |
| cosine_precision@10 | 0.0833 |
| cosine_recall@1 | 0.6111 |
| cosine_recall@3 | 0.7778 |
| cosine_recall@5 | 0.7778 |
| cosine_recall@10 | 0.8333 |
| cosine_ndcg@10 | 0.7192 |
| cosine_mrr@10 | 0.6829 |
| **cosine_map@100** | **0.6896** |
#### Information Retrieval
* Dataset: `dim_512`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.5556 |
| cosine_accuracy@3 | 0.7778 |
| cosine_accuracy@5 | 0.7778 |
| cosine_accuracy@10 | 0.8333 |
| cosine_precision@1 | 0.5556 |
| cosine_precision@3 | 0.2593 |
| cosine_precision@5 | 0.1556 |
| cosine_precision@10 | 0.0833 |
| cosine_recall@1 | 0.5556 |
| cosine_recall@3 | 0.7778 |
| cosine_recall@5 | 0.7778 |
| cosine_recall@10 | 0.8333 |
| cosine_ndcg@10 | 0.6973 |
| cosine_mrr@10 | 0.6537 |
| **cosine_map@100** | **0.6595** |
#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.4444 |
| cosine_accuracy@3 | 0.6667 |
| cosine_accuracy@5 | 0.7778 |
| cosine_accuracy@10 | 0.8889 |
| cosine_precision@1 | 0.4444 |
| cosine_precision@3 | 0.2222 |
| cosine_precision@5 | 0.1556 |
| cosine_precision@10 | 0.0889 |
| cosine_recall@1 | 0.4444 |
| cosine_recall@3 | 0.6667 |
| cosine_recall@5 | 0.7778 |
| cosine_recall@10 | 0.8889 |
| cosine_ndcg@10 | 0.6562 |
| cosine_mrr@10 | 0.5836 |
| **cosine_map@100** | **0.5863** |
#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.4444 |
| cosine_accuracy@3 | 0.6667 |
| cosine_accuracy@5 | 0.7222 |
| cosine_accuracy@10 | 0.7778 |
| cosine_precision@1 | 0.4444 |
| cosine_precision@3 | 0.2222 |
| cosine_precision@5 | 0.1444 |
| cosine_precision@10 | 0.0778 |
| cosine_recall@1 | 0.4444 |
| cosine_recall@3 | 0.6667 |
| cosine_recall@5 | 0.7222 |
| cosine_recall@10 | 0.7778 |
| cosine_ndcg@10 | 0.6174 |
| cosine_mrr@10 | 0.5653 |
| **cosine_map@100** | **0.5729** |
#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.3889 |
| cosine_accuracy@3 | 0.6111 |
| cosine_accuracy@5 | 0.6667 |
| cosine_accuracy@10 | 0.7778 |
| cosine_precision@1 | 0.3889 |
| cosine_precision@3 | 0.2037 |
| cosine_precision@5 | 0.1333 |
| cosine_precision@10 | 0.0778 |
| cosine_recall@1 | 0.3889 |
| cosine_recall@3 | 0.6111 |
| cosine_recall@5 | 0.6667 |
| cosine_recall@10 | 0.7778 |
| cosine_ndcg@10 | 0.5655 |
| cosine_mrr@10 | 0.4992 |
| **cosine_map@100** | **0.5079** |
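The per-dimension results above were produced with `InformationRetrievalEvaluator` on a small held-out query/document set that is not distributed with this repository. A minimal sketch of running such an evaluation on your own data (the toy corpus, query, and the `truncate_dim` argument are illustrative assumptions):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("kr-manish/bge-base-financial-matryoshka")

# Hypothetical toy evaluation data; replace with your own queries and corpus.
queries = {"q1": "What are the parking arrangements at Priya Softweb?"}
corpus = {"d1": "Priya Softweb provides allocated basement parking facilities ..."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_256",
    truncate_dim=256,  # assumed parameter for evaluating truncated embeddings
)
print(evaluator(model))  # dict of accuracy/precision/recall/NDCG/MRR/MAP@k values
```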
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 160 training samples
* Columns: <code>positive</code> and <code>anchor</code>
* Approximate statistics based on the first 1000 samples:
| | positive | anchor |
|:--------|:------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 18 tokens</li><li>mean: 93.95 tokens</li><li>max: 381 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 20.32 tokens</li><li>max: 34 tokens</li></ul> |
* Samples:
| positive | anchor |
|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------|
| <code>Priya Softweb's HR Manual provides valuable insights into the company's culture and values. Key takeaways include: * **Structure and Transparency:** The company emphasizes a structured and transparent approach to its HR processes. This is evident in its clear policies for recruitment, performance appraisals, compensation, leave management, work-from-home arrangements, and incident reporting. * **Professionalism and Ethics:** Priya Softweb places a high value on professionalism and ethical conduct. Its dress code, guidelines for mobile phone usage, and strict policies against tobacco use within the office all point toward a commitment to maintaining a professional and respectful work environment. * **Employee Well-being:** The company demonstrates a genuine concern for the well-being of its employees. This is reflected in its comprehensive leave policies, flexible work-from-home arrangements, and efforts to promote a healthy and clean workspace. * **Diversity and Inclusion:** Priya Softweb is committed to fostering a diverse and inclusive workplace, where employees from all backgrounds feel valued and respected. Its DEI policy outlines the company's commitment to equal opportunities, diverse hiring practices, and inclusive benefits and policies. * **Continuous Learning and Development:** The company encourages a culture of continuous learning and development, providing opportunities for employees to expand their skillsets and stay current with industry advancements. This is evident in its policies for Ethics & Compliance training and its encouragement of utilizing idle time for self-learning and exploring new technologies. Overall, Priya Softweb's HR Manual reveals a company culture that prioritizes structure, transparency, professionalism, employee well-being, diversity, and a commitment to continuous improvement. The company strives to create a supportive and growth-oriented work environment where employees feel valued and empowered to succeed.</code> | <code>What are the key takeaways from Priya Softweb's HR Manual regarding the company's culture and values?</code> |
| <code>Priya Softweb provides allocated basement parking facilities for employees to park their two-wheelers and four-wheelers. However, parking on the ground floor, around the lawn or main premises, is strictly prohibited as this space is reserved for Directors. Employees should use the parking under wings 5 and 6, while other parking spaces are allocated to different wings. Parking two-wheelers in the car parking zone is not permitted, even if space is available. Two-wheelers should be parked in the designated basement space on the main stand, not on the side stand. Employees are encouraged to park in common spaces on a first-come, first-served basis. The company clarifies that it is not responsible for providing parking and that employees park their vehicles at their own risk. This comprehensive parking policy ensures organized parking arrangements and clarifies the company's liability regarding vehicle safety.</code> | <code>What are the parking arrangements at Priya Softweb?</code> |
| <code>Investments and declarations must be submitted on or before the 25th of each month through OMS at Priya Softweb.</code> | <code>What is the deadline for submitting investments and declarations at Priya Softweb?</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
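In code, this configuration corresponds to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss`. A minimal sketch (the training dataset itself is not shown here):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Apply the ranking loss at every Matryoshka dimension with equal weights,
# matching the JSON parameters above.
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
    n_dims_per_step=-1,  # use all dimensions at every training step
)
```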
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
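Expressed as training arguments, the non-default values above correspond roughly to the following sketch (`output_dir` and `save_strategy` are assumptions; `load_best_model_at_end=True` requires saving on the same schedule as evaluation):
```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-hr-matryoshka",  # hypothetical output path
    num_train_epochs=4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    optim="adamw_torch_fused",
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed, to pair with load_best_model_at_end
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```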
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 |
|:-------:|:-----:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|:----------------------:|
| **1.0** | **1** | **0.5729** | **0.5863** | **0.6595** | **0.5079** | **0.6896** |
| 2.0 | 2 | 0.5729 | 0.5863 | 0.6595 | 0.5079 | 0.6896 |
| 3.0 | 3 | 0.5729 | 0.5863 | 0.6595 | 0.5079 | 0.6896 |
| 3.2 | 4 | 0.5729 | 0.5863 | 0.6595 | 0.5079 | 0.6896 |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.1.2+cu121
- Accelerate: 0.31.0
- Datasets: 2.19.1
- Tokenizers: 0.19.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->