---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
# DistilBERT (uncased) for Fake News Classification

This is a text-classification model built by fine-tuning the DistilBERT base model on the fake-and-real-news-dataset for five epochs.
**NOTE:** This model is only a proof of concept (POC) built for a fellowship application.
## Intended uses & limitations

This model is intended to classify a news article as either "Fake" or "Real".
## How to use
Check this notebook on Kaggle.
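Since the card only links to a notebook, here is a minimal inference sketch using the 🤗 Transformers API. The model ID is a placeholder (the card does not state one), and the 0 → "Fake", 1 → "Real" label order is an assumption, not taken from the card.

```python
# Minimal inference sketch for the fine-tuned DistilBERT classifier.
# The model ID below is a placeholder, and the label order is assumed.
ID2LABEL = {0: "Fake", 1: "Real"}  # assumed mapping of class index to label


def classify(text: str, model_id: str = "your-username/distilbert-fakenews") -> str:
    # Lazy imports keep this sketch importable even without transformers installed.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)

    # Truncate to DistilBERT's 512-token limit and run a forward pass.
    inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits

    return ID2LABEL[int(logits.argmax(dim=-1))]
```

Swap in the real model ID (or a local checkpoint path) before use; `classify("Some article text...")` then returns `"Fake"` or `"Real"`.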