
SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a SetFit model that can be used for Text Classification. This SetFit model uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model. A SetFitHead instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer (a minimal sketch of both steps is shown below).
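
The following is a minimal sketch of that two-step procedure using the SetFit Trainer API from the version listed under Framework Versions. It is illustrative rather than the script used to produce this checkpoint: the four training sentences are borrowed from the Label Examples section below, and the head arguments passed to from_pretrained are assumptions, not the recorded training configuration.

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Tiny illustrative few-shot set (this checkpoint was trained on 20 examples per label).
train_dataset = Dataset.from_dict({
    "text": [
        "I work for the same large company for 25 years , now is the time to change and find new job opportunities .",
        "He was waiting 2 hours for her .",
        "After that , the sports day began formally .",
        "In those years I lived the worst moments in my life .",
    ],
    "label": [0, 0, 3, 3],
})

# Step 1 and step 2 are both driven by the Trainer: it first fine-tunes the
# Sentence Transformer body with contrastive pairs, then fits the SetFitHead
# on embeddings from the fine-tuned body.
model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-mpnet-base-v2",
    use_differentiable_head=True,     # request a SetFitHead instead of a scikit-learn head
    head_params={"out_features": 8},  # eight classes, labels 0-7
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=16, num_epochs=2),
    train_dataset=train_dataset,
)
trainer.train()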

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2
  • Classification head: a SetFitHead instance
  • Number of Classes: 8

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label Examples
Label 1
  • 'I consider that is more convenient to drive a car because you carry on more things in your own car than travelling by car .'
  • 'In the last few years forensic biology has developed many aspects like better sensibility , robustness of results and less time required for analyze a sample , but what struck me most is how fast this change happens .'
  • "The car is n't the best way for for the transport , because it produce much pollution , however the public transport is better to do a journey ."
Label 6
  • 'On the one hand travel by car are really much more convenient as give the chance to you to be independent .'
  • 'When most people think about an important historical place in Italy , they think of Duomo , in Milano .'
  • 'I like personality with childlike , so I like children .'
Label 5
  • 'Yours sincerely ,'
  • 'This practice is considered those activities that anyone can do without any kind of special preparation .'
  • 'Secondly , the public vehicle route are more far than usual route .'
Label 7
  • 'This conclusion become more prominent if we look into the data of the car companies and exponential growth in their sales figure and with low budget private cars in picture , scenario ddrastically changed in past 10 years'
  • 'Recently I saw the thriller of mokingjay part 2 .'
  • "An example of that is the marriage of homosexual where some state admit this marriage , others do n't ."
Label 3
  • 'After that , the sports day began formally .'
  • 'In those years I lived the worst moments in my life .'
  • 'On the one hand , in my country there are a lot of place to travel .'
Label 2
  • "Sharing houses or rooms have many advantages such as , cheap , safe , close to the university , and learn how to share everything with others . saving money and time will be more Obvious in university dormitories because monthly payments will be less than four times than hiring an apartment , and because it will be closer to the university , saving money and time is more efficient by reducing transportation 's costs"
  • 'So , finally I suggest that it would be a great idea to combine the different types of activities , both popular and the newest .'
  • 'Wszysycy residents of my village , they try to , so that our village was clear that pollute the environment as little as possible .'
Label 4
  • 'During summer I love to go to the beach and having sunbathing with my friends other than getting fun with them playing volleyball or run inside the water of the sea !'
  • 'Jose is the best song . he is singing and talking in the party .'
  • "She fell sleep again , didn't she ?"
Label 0
  • 'I work for the same large company for 25 years , now is the time to change and find new job opportunities .'
  • 'A problem which was caused by us , human beings , with their target of making money without thinking of the effects .'
  • 'He was waiting 2 hours for her .'

Evaluation

Metrics

  • Accuracy (all labels): 0.175
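
The figure is accuracy over all eight labels on a held-out split. A sketch of how such a number could be checked for this checkpoint is shown below; test_texts and test_labels are hypothetical placeholders, not the actual evaluation split.

from setfit import SetFitModel

model = SetFitModel.from_pretrained("HelgeKn/BEA2019-multi-class-20")

# Placeholder evaluation data: substitute the real held-out sentences and gold labels.
test_texts = ["After that , the sports day began formally .", "He was waiting 2 hours for her ."]
test_labels = [3, 0]

preds = model.predict(test_texts)
accuracy = sum(int(p) == y for p, y in zip(preds, test_labels)) / len(test_labels)
print(f"accuracy: {accuracy:.3f}")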

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("HelgeKn/BEA2019-multi-class-20")
# Run inference
preds = model("Dear sir Dimara .")
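
predict also accepts a list of sentences, and predict_proba exposes the per-class scores produced by the SetFitHead. A small follow-up sketch, reusing the model object loaded above (the sentences are illustrative only):

# Batch inference with per-class probabilities
sentences = [
    "Dear sir Dimara .",
    "She fell sleep again , didn't she ?",
]
preds = model.predict(sentences)        # one label id (0-7) per sentence
probs = model.predict_proba(sentences)  # one row of 8 class probabilities per sentence
for text, label, row in zip(sentences, preds, probs):
    print(int(label), float(row.max()), text)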

Training Details

Training Set Metrics

  • Word count per sentence: min 3, median 22.0, max 82
  • Training samples per label: 20 for each of the eight labels (0–7), 160 sentences in total
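
A balanced split like this can be drawn with SetFit's sample_dataset helper. The sketch below assumes a labeled source dataset stored in a local CSV with text and label columns; the card does not name the underlying dataset, so the file name is a placeholder.

from datasets import load_dataset
from setfit import sample_dataset

# Placeholder source data: a CSV with "text" and "label" columns (labels 0-7).
full_train = load_dataset("csv", data_files="bea2019_sentences.csv")["train"]

# Draw 20 examples per label, matching the training sample counts above.
few_shot_train = sample_dataset(full_train, label_column="label", num_samples=20)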

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (2, 2)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 20
  • body_learning_rate: (2e-05, 2e-05)
  • head_learning_rate: 2e-05
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
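
These bullets correspond to fields of SetFit's TrainingArguments (version 1.0.x, as listed under Framework Versions). The block below is a reconstruction for reference rather than the original training script; distance_metric is left at its cosine-distance default.

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),               # (embedding phase, classifier phase)
    num_epochs=(2, 2),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=20,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)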

Training Results

Epoch Step Training Loss Validation Loss
0.0025 1 0.3724 -
0.125 50 0.2732 -
0.25 100 0.3001 -
0.375 150 0.2525 -
0.5 200 0.1934 -
0.625 250 0.1164 -
0.75 300 0.0874 -
0.875 350 0.0624 -
1.0 400 0.052 -
1.125 450 0.0569 -
1.25 500 0.0248 -
1.375 550 0.0071 -
1.5 600 0.0124 -
1.625 650 0.0087 -
1.75 700 0.0086 -
1.875 750 0.066 -
2.0 800 0.0194 -

Framework Versions

  • Python: 3.9.13
  • SetFit: 1.0.1
  • Sentence Transformers: 2.2.2
  • Transformers: 4.36.0
  • PyTorch: 2.1.1+cpu
  • Datasets: 2.15.0
  • Tokenizers: 0.15.0

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}