---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:7960
- loss:CoSENTLoss
base_model: sentence-transformers/all-mpnet-base-v2
widget:
- source_sentence: 'Okay, I got it. So just to give you the second price if ever for
the Samsung Galaxy is ##. It comes with a ## this one. Five gigabyte of data or
## gigabyte it will only it will only give you a £39.05. That is for that is for
the #### G but I do suggest that you go with the equipment before because that
is only around £31.'
sentences:
- I can provide to you . Are you happy to go ahead with this?
- Thank you for calling over to my name is how can I help you.
- Thank you and could you please confirm to me what is your full name.
- source_sentence: His number well, so you're looking to travel abroad anytime soon.
sentences:
- I'm now going to read out some terms and conditions to complete the order.
- Can you provide me with character number one of your security answer please?
- So looking at your usage of your mobile data. I just wanna share with you that
your usage for the past six months. It says here it's up to gigabytes of mobile
data. Okay and in order for us to.
- source_sentence: Hello. Hi, thank you so much for patiently waiting. So, I'd look
into our accessory so for the airbags the one that we have an ongoing promotion
right now for the accessories is the airport second generation. So you can.
sentences:
- The same discounts you can have been added as an additional line and do into your
account. It needs be entitled to % discount off of the costs.
- Are you planning to get a new sim only plan or a new phone?
- I'm now going to send you a one time code. The first message is a warning to not
give the code to scammers pretending to work for O2. The second message is the
code to continue with your request.
- source_sentence: Okay, so you can know just spend. Yeah, but anytime via web chat
or customer Services. Okay.
sentences:
- So looking at your usage of your mobile data. I just wanna share with you that
your usage for the past six months. It says here it's up to gigabytes of mobile
data. Okay and in order for us to.
- Checking your account I can see you are on the and you have been paying £ per
month. Is that correct?
- So looking at your usage of your mobile data. I just wanna share with you that
your usage for the past six months. It says here it's up to gigabytes of mobile
data. Okay and in order for us to.
- source_sentence: 'Oh, okay, so just the iPhone ## only.'
sentences:
- So I'm actually now checking here just for me to get this deal that you had seen.
- I'm now going to send you a one time code. The first message is a warning to not
give the code to scammers pretending to work for O2. The second message is the
code to continue with your request.
- Yes, that's correct for know. Our price is £ and then it won't go down to £ after
you apply the discount.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
model-index:
- name: SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts dev
type: sts_dev
metrics:
- type: pearson_cosine
value: 0.5906538719225906
name: Pearson Cosine
- type: spearman_cosine
value: 0.2789361723892506
name: Spearman Cosine
- type: pearson_manhattan
value: 0.630943535003128
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.27814879203445947
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.6348761842006896
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.2789361726048565
name: Spearman Euclidean
- type: pearson_dot
value: 0.5906538598201696
name: Pearson Dot
- type: spearman_dot
value: 0.2789361717424329
name: Spearman Dot
- type: pearson_max
value: 0.6348761842006896
name: Pearson Max
- type: spearman_max
value: 0.2789361726048565
name: Spearman Max
---
# SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2)
- **Maximum Sequence Length:** 384 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 384, 'do_lower_case': False}) with Transformer model: MPNetModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
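For reference, the three modules above correspond to roughly the following steps when written out with plain `transformers`. This is a minimal sketch, assuming the repository from the Usage section exposes standard `transformers`-compatible weights; the `SentenceTransformer` snippet below remains the recommended way to use the model.
```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

repo_id = "enochlev/xlm-similarity-large"  # repository ID from the Usage section below
tokenizer = AutoTokenizer.from_pretrained(repo_id)
encoder = AutoModel.from_pretrained(repo_id)

# (0) Transformer: tokenize and encode, truncating to 384 tokens
batch = tokenizer(["Oh, okay, so just the iPhone ## only."],
                  padding=True, truncation=True, max_length=384, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state

# (1) Pooling: mean over non-padding tokens
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# (2) Normalize: unit-length vectors, so dot product equals cosine similarity
sentence_embedding = F.normalize(sentence_embedding, p=2, dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```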
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("enochlev/xlm-similarity-large")
# Run inference
sentences = [
'Oh, okay, so just the iPhone ## only.',
"Yes, that's correct for know. Our price is £ and then it won't go down to £ after you apply the discount.",
"I'm now going to send you a one time code. The first message is a warning to not give the code to scammers pretending to work for O2. The second message is the code to continue with your request.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
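As a follow-up illustration (a hypothetical use case, not part of the original card), the embeddings can be used to match a live transcript utterance against a set of reference agent prompts; the reference strings here are taken from the dataset samples shown further below.
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("enochlev/xlm-similarity-large")

# Reference prompts drawn from the dataset samples below (hypothetical use case)
references = [
    "Thank you for calling over to my name is how can I help you.",
    "Are you planning to get a new sim only plan or a new phone?",
    "Can you provide me with character number one of your security answer please?",
]
utterance = "Hello, welcome to O2. My name is __ How can I help you today?"

# Score the utterance against every reference prompt and pick the best match
scores = model.similarity(model.encode([utterance]), model.encode(references))[0]
best = int(scores.argmax())
print(references[best], float(scores[best]))
```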
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts_dev`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:-------------------|:-----------|
| pearson_cosine | 0.5907 |
| spearman_cosine | 0.2789 |
| pearson_manhattan | 0.6309 |
| spearman_manhattan | 0.2781 |
| pearson_euclidean | 0.6349 |
| spearman_euclidean | 0.2789 |
| pearson_dot | 0.5907 |
| spearman_dot | 0.2789 |
| pearson_max | 0.6349 |
| **spearman_max** | **0.2789** |
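
To reproduce this evaluation, the same evaluator class can be run directly. A minimal sketch, using two illustrative pairs taken from the dataset samples below in place of the full held-out split:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("enochlev/xlm-similarity-large")

# Illustrative pairs; replace with the full held-out split for meaningful numbers
texts1 = ["Hello, welcome to O2. My name is __ How can I help you today?"] * 2
texts2 = [
    "Thank you for calling over to my name is how can I help you.",
    "Are you planning to get a new sim only plan or a new phone?",
]
labels = [1.0, 0.2]

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=texts1, sentences2=texts2, scores=labels, name="sts_dev"
)
print(evaluator(model))  # Pearson/Spearman metrics keyed by the evaluator name
```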
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 7,960 training samples
* Columns: `text1`, `text2`, and `label`
* Approximate statistics based on the first 1000 samples:
  |      | text1  | text2  | label |
  |:-----|:-------|:-------|:------|
  | type | string | string | float |
* Samples:
  | text1 | text2 | label |
  |:------|:------|:------|
  | `Hello, welcome to O2. My name is __ How can I help you today?` | `Thank you for calling over to my name is how can I help you.` | `1.0` |
  | `Hello, welcome to O2. My name is __ How can I help you today?` | `I was about to ask us to confirm the email address that we have on the account or on your file. So what I can you tell me your email address.` | `0.2` |
  | `Hello, welcome to O2. My name is __ How can I help you today?` | `Are you planning to get a new sim only plan or a new phone?` | `0.2` |
* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "pairwise_cos_sim"
}
```
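For context, this corresponds to constructing the loss directly from the library; a minimal sketch mirroring the parameters above:
```python
from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# scale=20.0 and pairwise cosine similarity, matching the JSON block above
loss = CoSENTLoss(model, scale=20.0, similarity_fct=util.pairwise_cos_sim)
```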
### Evaluation Dataset
#### Unnamed Dataset
* Size: 1,980 evaluation samples
* Columns: `text1`, `text2`, and `label`
* Approximate statistics based on the first 1000 samples:
  |      | text1  | text2  | label |
  |:-----|:-------|:-------|:------|
  | type | string | string | float |
* Samples:
  | text1 | text2 | label |
  |:------|:------|:------|
  | `So for example, since this is for the 2nd line bro more. So if you have any family that you want to add on your account. Yeah, we do have a same offer plan. This offer promo today.` | `The same discounts you can have been added as an additional line and do into your account. It needs be entitled to % discount off of the costs.` | `1.0` |
  | `So for example, since this is for the 2nd line bro more. So if you have any family that you want to add on your account. Yeah, we do have a same offer plan. This offer promo today.` | `I was about to ask us to confirm the email address that we have on the account or on your file. So what I can you tell me your email address.` | `0.2` |
  | `So for example, since this is for the 2nd line bro more. So if you have any family that you want to add on your account. Yeah, we do have a same offer plan. This offer promo today.` | `Are you planning to get a new sim only plan or a new phone?` | `0.2` |
* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "pairwise_cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 50
- `per_device_eval_batch_size`: 50
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `batch_sampler`: no_duplicates
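
These settings map onto the `SentenceTransformerTrainer` API roughly as follows. A minimal sketch, assuming a `datasets.Dataset` with `text1`, `text2`, and `label` columns; the output directory and the tiny in-line dataset are illustrative only:
```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
loss = CoSENTLoss(model, scale=20.0)

# Tiny illustrative dataset; the real training data has text1 / text2 / label columns
train_dataset = Dataset.from_dict({
    "text1": ["Hello, welcome to O2. My name is __ How can I help you today?"],
    "text2": ["Thank you for calling over to my name is how can I help you."],
    "label": [1.0],
})
eval_dataset = train_dataset

args = SentenceTransformerTrainingArguments(
    output_dir="mpnet-sts-finetune",  # illustrative output path
    eval_strategy="epoch",
    per_device_train_batch_size=50,
    per_device_eval_batch_size=50,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```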
#### All Hyperparameters