Paper: DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (arXiv:1910.01108)
This model is fine-tuned for the toxicity classification task. It was trained on the toxic comment dataset published by Jigsaw (Jigsaw 2020), which we split into two parts before fine-tuning a DistilBERT base model (uncased) on it. DistilBERT is a distilled version of the BERT base model, introduced in the paper referenced above. The snippet below shows how to run the classifier with the transformers pipeline API:
from transformers import pipeline

# Load the fine-tuned toxicity classifier from the Hugging Face Hub.
classifier = pipeline("text-classification", model="tensor-trek/distilbert-toxicity-classifier")

text = "This was a masterpiece. Not completely faithful to the books, but enthralling from beginning to end. Might be my favorite of the three."
classifier(text)
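The pipeline returns a list with one dictionary per input, each containing a predicted label and a confidence score.

For reference, the sketch below shows roughly how a fine-tuning run like the one described above could be set up with the Trainer API. The toy examples, the assumed label mapping (0 = neutral, 1 = toxic), and the hyperparameters are illustrative assumptions, not the exact configuration used to train this model.

from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

# Toy examples standing in for the Jigsaw training split described above.
train_data = Dataset.from_dict({
    "text": ["you are wonderful", "you are an idiot"],
    "label": [0, 1],  # assumed mapping: 0 = neutral, 1 = toxic
})

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

def tokenize(batch):
    # Truncate comments that exceed DistilBERT's maximum input length.
    return tokenizer(batch["text"], truncation=True)

train_data = train_data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="toxicity-classifier", num_train_epochs=1),
    train_dataset=train_data,
    data_collator=DataCollatorWithPadding(tokenizer),  # pad dynamically per batch
)
trainer.train()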