---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---
|
|
|
# DistilBERT (uncased) for Fake News Classification
|
|
|
This is a text-classification model built by fine-tuning the
[DistilBERT base model](https://huggingface.co/distilbert-base-uncased).
It was trained on the
[fake-and-real-news-dataset](https://www.kaggle.com/clmentbisaillon/fake-and-real-news-dataset)
for five epochs.
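The fine-tuning run described above can be sketched roughly as follows using the `transformers` `Trainer` API. Only the base checkpoint and the five epochs come from this card; the learning rate, batch size, label order, and dataset column name are assumptions for illustration:

```python
# Hypothetical sketch of the fine-tuning setup; only "distilbert-base-uncased"
# and the five epochs are stated in the card, everything else is assumed.

HPARAMS = {
    "base_model": "distilbert-base-uncased",  # from the card
    "num_train_epochs": 5,                    # from the card
    "learning_rate": 2e-5,                    # assumed
    "per_device_train_batch_size": 16,        # assumed
}

def fine_tune(train_dataset, eval_dataset):
    """Fine-tune DistilBERT for binary fake/real news classification."""
    # Imports are local so the sketch can be read without transformers installed.
    from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                              Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained(HPARAMS["base_model"])
    model = AutoModelForSequenceClassification.from_pretrained(
        HPARAMS["base_model"],
        num_labels=2,
        id2label={0: "Fake", 1: "Real"},  # assumed label order
        label2id={"Fake": 0, "Real": 1},
    )

    def tokenize(batch):
        # "text" is the assumed name of the article column.
        return tokenizer(batch["text"], truncation=True)

    train_dataset = train_dataset.map(tokenize, batched=True)
    eval_dataset = eval_dataset.map(tokenize, batched=True)

    args = TrainingArguments(
        output_dir="distilbert-fake-news",
        num_train_epochs=HPARAMS["num_train_epochs"],
        learning_rate=HPARAMS["learning_rate"],
        per_device_train_batch_size=HPARAMS["per_device_train_batch_size"],
    )
    trainer = Trainer(model=model, args=args,
                      train_dataset=train_dataset, eval_dataset=eval_dataset)
    trainer.train()
    return trainer
```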
|
|
|
> **NOTE:** This model is just a proof of concept (POC) for a fellowship I was applying for.
|
|
|
## Intended uses & limitations
|
|
|
Note that this model is primarily intended to classify a news article as either
"Fake" or "Real".
|
|
|
### How to use
|
|
|
Check this [notebook](https://www.kaggle.com/code/mohamedanwarvic/fakenewsclassifier-fatima-fellowship) on Kaggle.
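As a minimal offline sketch of the post-processing step, the model's two output logits can be turned into a "Fake"/"Real" verdict like this. The label order is an assumption and should be verified against the checkpoint's `id2label` config:

```python
import math

# Assumed label order; verify against the checkpoint's id2label mapping.
ID2LABEL = {0: "Fake", 1: "Real"}

def classify(logits):
    """Map the model's two raw logits to a label and a confidence score."""
    # Numerically stable softmax over the two logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return ID2LABEL[best], probs[best]

# Example: a logit pair leaning toward "Real".
label, score = classify([-1.2, 2.3])
```

With `transformers` installed and the fine-tuned checkpoint available, the equivalent end-to-end call would be a `pipeline("text-classification", model=...)`, which performs this softmax and label lookup internally.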