# DistilBERT fine-tuned for news classification
This model is based on the `distilbert-base-uncased` pretrained weights, with a classification head fine-tuned to label news articles with one of 3 categories (bad, medium, good).
## Training data
The model was fine-tuned on news-small, a 300-article news dataset manually annotated by Alex.
## Inputs
Like its base model, this model accepts inputs of up to 512 tokens; longer texts must be truncated before classification.
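The model can be loaded with the standard `transformers` text-classification pipeline, truncating inputs to the 512-token limit noted above. This is a minimal sketch: the repo id `distilbert-news-small` is a placeholder (the card does not state the published model id), and the integer-to-label mapping assumes the checkpoint's labels are ordered as listed above.

```python
# Label set described in this card: 3 classes.
ID2LABEL = {0: "bad", 1: "medium", 2: "good"}

# Placeholder repo id -- substitute the actual model id once published.
MODEL_ID = "distilbert-news-small"


def build_classifier(model_id: str = MODEL_ID):
    """Return a text-classification pipeline for the fine-tuned checkpoint,
    truncating inputs to the model's 512-token maximum."""
    # Imported here so the label mapping above is usable without transformers installed.
    from transformers import pipeline

    return pipeline(
        "text-classification",
        model=model_id,
        truncation=True,
        max_length=512,
    )
```

Calling `build_classifier()("Some article text...")` returns a list of dicts with `label` and `score` keys, following the usual pipeline output format.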