
DistilBERT for Zero-Shot Classification

This repository contains a DistilBERT model trained for zero-shot classification on CNN articles. Evaluated on CNN articles, the model achieved an accuracy of 0.956 and an F1 score of 0.955.

Model Details

  • Architecture: DistilBERT
  • Training Data: CNN articles
  • Accuracy: 0.956
  • F1 Score: 0.955

Usage

To use this model for zero-shot classification, you can follow the steps below:

  1. Load the trained model:

    
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    
    # Load the tokenizer and model from the Hugging Face Hub
    tokenizer = AutoTokenizer.from_pretrained("AyoubChLin/DistilBERT_ZeroShot")
    model = AutoModelForSequenceClassification.from_pretrained("AyoubChLin/DistilBERT_ZeroShot")
    
  2. Classify text using zero-shot classification:

    
    from transformers import pipeline
    
    # Create a zero-shot classification pipeline from the loaded model and tokenizer
    classifier = pipeline("zero-shot-classification", model=model, tokenizer=tokenizer)
    
    # Classify a sentence against a set of candidate labels
    sentence = "The latest scientific breakthroughs in medicine"
    candidate_labels = ["politics", "sports", "technology", "business"]
    
    result = classifier(sentence, candidate_labels)
    print(result)
    

    The output is a dictionary containing the input sequence, the candidate labels ranked from most to least likely, and the corresponding classification scores.
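
You can also build the pipeline directly from the repository id instead of loading the model and tokenizer separately. The snippet below is a minimal sketch of that shortcut; the multi_label flag is a standard option of the zero-shot classification pipeline (in recent transformers versions) for cases where more than one label may apply, not something specific to this model.

    from transformers import pipeline

    # Build the pipeline directly from the Hub repository id
    classifier = pipeline(
        "zero-shot-classification",
        model="AyoubChLin/DistilBERT_ZeroShot",
    )

    # multi_label=True scores each label independently,
    # so several labels can receive high scores at once
    result = classifier(
        "The latest scientific breakthroughs in medicine",
        candidate_labels=["politics", "sports", "technology", "business"],
        multi_label=True,
    )
    print(result["labels"], result["scores"])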

About the Author

This work was created by Ayoub Cherguelaine.

If you have any questions or suggestions regarding this repository or the trained model, feel free to reach out to Ayoub Cherguelaine.
