SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
This is a sentence-transformers model finetuned from sentence-transformers/all-mpnet-base-v2. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: sentence-transformers/all-mpnet-base-v2
- Maximum Sequence Length: 384 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 384, 'do_lower_case': False}) with Transformer model: MPNetModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
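For reference, the three modules above amount to: run the MPNet encoder, mean-pool the token embeddings using the attention mask, then L2-normalize. Below is a minimal sketch of that computation with plain transformers, assuming the repository exposes the usual tokenizer and transformer files (as sentence-transformers repositories typically do); the SentenceTransformer API shown in the Usage section remains the simpler path.

import torch
from transformers import AutoTokenizer, AutoModel

model_id = "cruzlorite/all-mpnet-base-v2-unfair-tos-rationale"
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = AutoModel.from_pretrained(model_id)

texts = ["we may change the price of the services at any time ."]
batch = tokenizer(texts, padding=True, truncation=True, max_length=384, return_tensors="pt")

with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling over non-padding tokens, then L2 normalization (modules (1) and (2) above)
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 768])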
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("cruzlorite/all-mpnet-base-v2-unfair-tos-rationale")
# Run inference
sentences = [
'we may change the price of the services at any time and if you have a recurring purchase , we will notify you by email at least 15 days before the price change .',
'Since the clause states that the provider has the right for unilateral change of the contract/services/goods/features for any reason at its full discretion, at any time ',
'Since the clause states that the provider has the right for unilateral change of the contract/services/goods/features for any reason at its full discretion, at any time ',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
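Because the embeddings are unit-normalized, the similarity scores above are cosine similarities. One possible way to turn a clause/rationale pair into a binary match decision is to compare the score against the cosine accuracy threshold reported in the Evaluation section below; a minimal sketch (the threshold is specific to that evaluation split, not a universal constant):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("cruzlorite/all-mpnet-base-v2-unfair-tos-rationale")
clause = "we may change the price of the services at any time ."
rationale = "Since the clause states that the provider has the right for unilateral change of the contract/services/goods/features for any reason at its full discretion, at any time "
embeddings = model.encode([clause, rationale])
# 2x2 similarity matrix; the off-diagonal entry is the clause/rationale score
score = float(model.similarity(embeddings, embeddings)[0, 1])
# 0.7394 is the cosine_accuracy_threshold reported under Evaluation; split-specific, not universal
print(score, score >= 0.7394)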
Evaluation
Metrics
Binary Classification
- Dataset: eval
- Evaluated with BinaryClassificationEvaluator
Metric | Value |
---|---|
cosine_accuracy | 0.8889 |
cosine_accuracy_threshold | 0.7394 |
cosine_f1 | 0.8966 |
cosine_f1_threshold | 0.7285 |
cosine_precision | 0.8608 |
cosine_recall | 0.9356 |
cosine_ap | 0.9473 |
dot_accuracy | 0.8889 |
dot_accuracy_threshold | 0.7394 |
dot_f1 | 0.8966 |
dot_f1_threshold | 0.7285 |
dot_precision | 0.8608 |
dot_recall | 0.9356 |
dot_ap | 0.9473 |
manhattan_accuracy | 0.8889 |
manhattan_accuracy_threshold | 15.6134 |
manhattan_f1 | 0.8969 |
manhattan_f1_threshold | 15.9017 |
manhattan_precision | 0.859 |
manhattan_recall | 0.9384 |
manhattan_ap | 0.9479 |
euclidean_accuracy | 0.8889 |
euclidean_accuracy_threshold | 0.722 |
euclidean_f1 | 0.8966 |
euclidean_f1_threshold | 0.7369 |
euclidean_precision | 0.8608 |
euclidean_recall | 0.9356 |
euclidean_ap | 0.9473 |
max_accuracy | 0.8889 |
max_accuracy_threshold | 15.6134 |
max_f1 | 0.8969 |
max_f1_threshold | 15.9017 |
max_precision | 0.8608 |
max_recall | 0.9384 |
max_ap | 0.9479 |
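These values come from BinaryClassificationEvaluator, which embeds each pair, sweeps decision thresholds for cosine, dot, Manhattan, and Euclidean similarity, and reports the best accuracy and F1 plus average precision. A minimal sketch of re-running it, using two labelled pairs copied from the evaluation samples shown later in this card (the reported numbers were computed on the full 693-pair eval split):

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("cruzlorite/all-mpnet-base-v2-unfair-tos-rationale")
clauses = [
    "to the fullest extent permitted by law , badoo expressly excludes :",
    "notwithstanding any other remedies available to truecaller , you agree that truecaller may suspend or terminate your use of the services without notice if you use the services or the content in any prohibited manner , and that such use will be deemed a material breach of these terms .",
]
rationales = [
    "since the clause states that the provider is not liable even if he was, or should have been, aware or have been advised about the possibility of any damage or loss",
    "since the clause generally states the contract or access may be terminated in an event of a force majeure, act of God or other unforeseen events of a similar nature.",
]
labels = [1, 0]  # 1 = the rationale matches the clause, 0 = it does not
evaluator = BinaryClassificationEvaluator(clauses, rationales, labels, name="eval")
print(evaluator(model))  # dict of accuracy, F1, precision, recall, and AP per similarity function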
Training Details
Training Dataset
Unnamed Dataset
- Size: 6,233 training samples
- Columns: sentence1, sentence2, and label
- Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | label |
|---|---|---|---|
| type | string | string | int |
| details | min: 8 tokens, mean: 63.0 tokens, max: 384 tokens | min: 10 tokens, mean: 41.12 tokens, max: 96 tokens | 0: ~48.70%, 1: ~51.30% |
- Samples:
| sentence1 | sentence2 | label |
|---|---|---|
| we may revise these terms from time to time and the most current version will always be posted on our website . | Since the clause states that the provider has the right for unilateral change of the contract/services/goods/features where the notification of changes is left at a full discretion of the provider such as by simply posting the new terms on their website without a notification to the consumer | 1 |
| neither fitbit , its suppliers , or licensors , nor any other party involved in creating , producing , or delivering the fitbit service will be liable for any incidental , special , exemplary , or consequential damages , including lost profits , loss of data or goodwill , service interruption , computer damage , or system failure or the cost of substitute services arising out of or in connection with these terms or from the use of or inability to use the fitbit service , whether based on warranty , contract , tort -lrb- including negligence -rrb- , product liability , or any other legal theory , and whether or not fitbit has been informed of the possibility of such damage , even if a limited remedy set forth herein is found to have failed of its essential purpose . | since the clause states that the provider is not liable even if he was, or should have been, aware or have been advised about the possibility of any damage or loss | 1 |
| the company reserves the right -lrb- but has no obligation -rrb- , at its sole discretion and without prior notice to : | Since the clause states that the provider has the right to remove content and material if he believes that there is a case violation of terms such as acount tranfer, policies, standard, code of conduct | 1 |
- Loss: OnlineContrastiveLoss
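OnlineContrastiveLoss consumes exactly this (sentence1, sentence2, label) format: within each batch it selects the hard positive pairs (label 1 with low similarity) and hard negative pairs (label 0 with high similarity) and applies a contrastive loss to them. A minimal sketch of wiring it up, using one of the training samples above and the default settings:

from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import OnlineContrastiveLoss

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
train_dataset = Dataset.from_dict({
    "sentence1": ["we may revise these terms from time to time and the most current version will always be posted on our website ."],
    "sentence2": ["Since the clause states that the provider has the right for unilateral change of the contract/services/goods/features where the notification of changes is left at a full discretion of the provider such as by simply posting the new terms on their website without a notification to the consumer"],
    "label": [1],
})
loss = OnlineContrastiveLoss(model)  # defaults: cosine distance, margin 0.5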
Evaluation Dataset
Unnamed Dataset
- Size: 693 evaluation samples
- Columns: sentence1, sentence2, and label
- Approximate statistics based on the first 693 samples:
| | sentence1 | sentence2 | label |
|---|---|---|---|
| type | string | string | int |
| details | min: 8 tokens, mean: 63.59 tokens, max: 384 tokens | min: 10 tokens, mean: 42.75 tokens, max: 96 tokens | 0: ~48.48%, 1: ~51.52% |
- Samples:
| sentence1 | sentence2 | label |
|---|---|---|
| you expressly understand and agree that evernote , its subsidiaries , affiliates , service providers , and licensors , and our and their respective officers , employees , agents and successors shall not be liable to you for any direct , indirect , incidental , special , consequential or exemplary damages , including but not limited to , damages for loss of profits , goodwill , use , data , cover or other intangible losses -lrb- even if evernote has been advised of the possibility of such damages -rrb- resulting from : -lrb- i -rrb- the use or the inability to use the service or to use promotional codes or evernote points ; -lrb- ii -rrb- the cost of procurement of substitute services resulting from any data , information or service purchased or obtained or messages received or transactions entered into through or from the service ; -lrb- iii -rrb- unauthorized access to or the loss , corruption or alteration of your transmissions , content or data ; -lrb- iv -rrb- statements or conduct of any third party on or using the service , or providing any services related to the operation of the service ; -lrb- v -rrb- evernote 's actions or omissions in reliance upon your basic subscriber information and any changes thereto or notices received therefrom ; -lrb- vi -rrb- your failure to protect the confidentiality of any passwords or access rights to your account ; -lrb- vii -rrb- the acts or omissions of any third party using or integrating with the service ; -lrb- viii -rrb- any advertising content or your purchase or use of any advertised or other third-party product or service ; -lrb- ix -rrb- the termination of your account in accordance with the terms of these terms of service ; or -lrb- x -rrb- any other matter relating to the service . | since the clause states that the provider is not liable for any information stored or processed within the Services, inaccuracies or error of information, content and material posted, software, products and services on the website, including copyright violation, defamation, slander, libel, falsehoods, obscenity, pornography, profanity, or objectionable material | 1 |
| to the fullest extent permitted by law , badoo expressly excludes : | since the clause states that the provider is not liable even if he was, or should have been, aware or have been advised about the possibility of any damage or loss | 1 |
| notwithstanding any other remedies available to truecaller , you agree that truecaller may suspend or terminate your use of the services without notice if you use the services or the content in any prohibited manner , and that such use will be deemed a material breach of these terms . | since the clause generally states the contract or access may be terminated in an event of a force majeure, act of God or other unforeseen events of a similar nature. | 0 |
- Loss: OnlineContrastiveLoss
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- learning_rate: 2e-05
- num_train_epochs: 2
- warmup_ratio: 0.1
- fp16: True
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 2
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
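Assuming the Sentence Transformers v3 training loop (supported by the framework versions listed below), the non-default hyperparameters above map onto SentenceTransformerTrainingArguments roughly as follows. output_dir is a placeholder and the tiny dataset stands in for the real 6,233 / 693 pair splits; this is a sketch, not the exact training script.

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import OnlineContrastiveLoss

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
# Tiny stand-in with the same columns as the datasets described above
pair = {
    "sentence1": ["we may revise these terms from time to time and the most current version will always be posted on our website ."],
    "sentence2": ["Since the clause states that the provider has the right for unilateral change of the contract/services/goods/features where the notification of changes is left at a full discretion of the provider such as by simply posting the new terms on their website without a notification to the consumer"],
    "label": [1],
}
train_dataset = Dataset.from_dict(pair)
eval_dataset = Dataset.from_dict(pair)

args = SentenceTransformerTrainingArguments(
    output_dir="outputs",            # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=2,
    warmup_ratio=0.1,
    fp16=True,                       # requires a CUDA GPU
)
trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=OnlineContrastiveLoss(model),
)
trainer.train()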
Training Logs
Epoch | Step | Training Loss | Validation Loss | eval_max_ap |
---|---|---|---|---|
0 | 0 | - | - | 0.6125 |
0.2564 | 100 | 0.9286 | 0.4118 | 0.8794 |
0.5128 | 200 | 0.3916 | 0.2868 | 0.9177 |
0.7692 | 300 | 0.3414 | 0.2412 | 0.9448 |
1.0256 | 400 | 0.2755 | 0.2103 | 0.9470 |
1.2821 | 500 | 0.1893 | 0.1892 | 0.9486 |
1.5385 | 600 | 0.1557 | 0.1709 | 0.9548 |
1.7949 | 700 | 0.1566 | 0.1888 | 0.9479 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.1.1
- Transformers: 4.45.2
- PyTorch: 2.5.1+cu121
- Accelerate: 1.1.1
- Datasets: 3.1.0
- Tokenizers: 0.20.3
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}