---
library_name: setfit
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
metrics:
- accuracy
widget:
- text: 'The Alavas worked themselves to the bone in the last period , and English
and San Emeterio ( 65-75 ) had already made it clear that they were not going
to let anyone take away what they had earned during the first thirty minutes . '
- text: 'To break the uncomfortable silence , Haney began to talk . '
- text: 'For the treatment of non-small cell lung cancer , the effects of Alimta were
compared with those of docetaxel ( another anticancer medicine ) in one study
involving 571 patients with locally advanced or metastatic disease who had received
chemotherapy in the past . '
- text: 'As we all know , a few minutes before the end of the game ( that their team
had already won ) , both players deliberately wasted time which made the referee
show the second yellow card to both of them . '
- text: 'In contrast , patients whose cancer was affecting squamous cells had shorter
survival times if they received Alimta . '
pipeline_tag: text-classification
inference: true
base_model: sentence-transformers/paraphrase-mpnet-base-v2
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.14172185430463577
name: Accuracy
---
# SetFit with sentence-transformers/paraphrase-mpnet-base-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [SetFitHead](https://huggingface.co/docs/setfit/reference/main#setfit.SetFitHead) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [SetFitHead](https://huggingface.co/docs/setfit/reference/main#setfit.SetFitHead) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 7 classes
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:---------|
| 1 | <ul><li>'Eventually little French farmers and their little French farmwives came out of their stone houses and put their hands above their tiny eyes and squinted at us . '</li><li>'Mr. Neuberger realized that , although of Italian ancestry , Mr. Mariotta still could qualify as a minority person since he was born in Puerto Rico . '</li><li>"Biggest trouble was scared family who could n't get a phone line through , and spent a really horrible hour not knowing . "</li></ul> |
| 4 | <ul><li>'`` To ring for even one service at this tower , we have to scrape , `` says Mr. Hammond , a retired water-authority worker . `` '</li><li>"`` It 's my line of work `` , he said "</li><li>'One writer , signing his letter as `` Red-blooded , balanced male , `` remarked on the `` frequency of women fainting in peals , `` and suggested that they `` settle back into their traditional role of making tea at meetings . `` '</li></ul> |
| 5 | <ul><li>'Of course On Thursday , Haney mailed the monthly check for separate maintenance to his wife Lolly , and wished the stranger could do something about her '</li><li>"On the Right , the tone was set by Jacques Chirac , who declared in 1976 that `` 900,000 unemployed would not become a problem in a country with 2 million of foreign workers , '' and on the Left by Michel Rocard explaining in 1990 that France `` can not accommodate all the world 's misery . '' "</li><li>"But the council 's program to attract and train ringers is only partly successful , says Mr. Baldwin . "</li></ul> |
| 6 | <ul><li>'3 -RRB- Republican congressional representatives , because of their belief in a minimalist state , are less willing to engage in local benefit-seeking than are Democratic members of Congress . '</li><li>'As we know , voters tend to favor Republicans more in races for president than in those for Congress . '</li><li>'That is the way the system works . '</li></ul> |
| 2 | <ul><li>'-- Students should move up the educational ladder as their academic potential allows . '</li><li>'The next day , Sunday , the hangover reminded Haney where he had been the night before . '</li><li>'-- In most states , increasing expenditures on education , in our current circumstances , will probably make things worse , not better . '</li></ul> |
| 0 | <ul><li>'Then your focus will go to an input text box where you can type your function . '</li><li>"I might have got hit by that truck if it was n't for you . "</li><li>"Second , it explains why voters hold Congress in disdain but generally love their own congressional representatives : Any individual legislator 's constituents appreciate the specific benefits that the legislator wins for them but not the overall cost associated with every other legislator doing likewise for his own constituency . "</li></ul> |
| 3 | <ul><li>"It was the most exercise we 'd had all morning and it was followed by our driving immediately to the nearest watering hole . "</li><li>'Alimta is used together with cisplatin ( another anticancer medicine ) when the cancer is unresectable ( cannot be removed by surgery alone ) and malignant ( has spread , or is likely to spread easily , to other parts of the body ) , in patients who have not received chemotherapy ( medicines for cancer ) before advanced or metastatic non-small cell lung cancer that is not affecting the squamous cells . '</li><li>'If it is , it will be treated as an operator , if it is not , it will be treated as a user function . '</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.1417 |
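
The reported metric is plain classification accuracy on the test split: the fraction of test sentences whose predicted label id matches the reference. As a minimal, self-contained sketch of that computation (the label lists below are hypothetical, for illustration only):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that exactly match the reference labels."""
    assert len(y_true) == len(y_pred)
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical reference and predicted label ids (7 classes, 0-6)
y_true = [0, 1, 2, 3, 4, 5, 6]
y_pred = [0, 3, 2, 3, 1, 5, 0]
print(round(accuracy(y_true, y_pred), 4))  # 0.5714 (4 of 7 correct)
```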
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("HelgeKn/SemEval-multi-class-4")
# Run inference
preds = model("To break the uncomfortable silence , Haney began to talk . ")
```
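
The model returns integer class ids (0-6); this card does not document what each id means, so mapping ids to readable names is up to the caller. A minimal post-processing sketch with placeholder names and a hypothetical probability row (as a per-class probability method such as `predict_proba` would return, one row per input):

```python
# Placeholder names; the card does not document the label semantics
ID2NAME = {i: f"class_{i}" for i in range(7)}

# Hypothetical probability row for a single input over the 7 classes
probs = [0.02, 0.05, 0.10, 0.08, 0.55, 0.12, 0.08]

# Argmax over the row recovers the predicted class id
pred_id = max(range(len(probs)), key=probs.__getitem__)
print(pred_id, ID2NAME[pred_id])  # 4 class_4
```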
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count | 4 | 27.1786 | 74 |

| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 4 |
| 1 | 4 |
| 2 | 4 |
| 3 | 4 |
| 4 | 4 |
| 5 | 4 |
| 6 | 4 |
### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (2, 2)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
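
These values map onto `setfit`'s `TrainingArguments`. A sketch of how a comparable run could be configured, assuming the SetFit 1.0 `Trainer` API and a labeled `datasets.Dataset` with `text`/`label` columns (the two-example dataset below is an illustrative placeholder; running this downloads the base model):

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Tiny placeholder training set; a real run would use a few examples per class
train_dataset = Dataset.from_dict({
    "text": ["Example sentence for class 0 .", "Example sentence for class 1 ."],
    "label": [0, 1],
})

model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-mpnet-base-v2"
)

# Values taken from the hyperparameter list above; pairs are (body, head)
args = TrainingArguments(
    batch_size=(16, 16),
    num_epochs=(2, 2),
    num_iterations=20,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```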
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0143 | 1 | 0.2446 | - |
| 0.7143 | 50 | 0.0612 | - |
| 1.4286 | 100 | 0.0078 | - |
### Framework Versions
- Python: 3.9.13
- SetFit: 1.0.1
- Sentence Transformers: 2.2.2
- Transformers: 4.36.0
- PyTorch: 2.1.1+cpu
- Datasets: 2.15.0
- Tokenizers: 0.15.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```