---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---

# DistilBERT (uncased) for Fake News Classification

This is a text-classification model built by fine-tuning the
[DistilBERT base model](https://huggingface.co/distilbert-base-uncased)
on the
[fake-and-real-news-dataset](https://www.kaggle.com/clmentbisaillon/fake-and-real-news-dataset)
for five epochs.

> **NOTE:**
> This model is just a proof of concept built for a fellowship I was applying for.

## Intended uses & limitations

This model is primarily intended to classify a news article as either
"Fake" or "Real".

### How to use

Check this [notebook](https://www.kaggle.com/code/mohamedanwarvic/fakenewsclassifier-fatima-fellowship) on Kaggle.
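
As a minimal sketch, inference with the `transformers` pipeline could look like the following. Note that this card does not state the model's Hub ID, so `MODEL_ID` below is a placeholder, and the `LABEL_0`/`LABEL_1` → Fake/Real mapping is an assumption to verify against the model's config:

```python
# Hypothetical usage sketch; MODEL_ID is a placeholder, not the real Hub ID.
MODEL_ID = "distilbert-base-uncased"  # replace with the fine-tuned model's ID

# Assumed label mapping; check the model's id2label config before relying on it.
LABELS = {"LABEL_0": "Fake", "LABEL_1": "Real"}

def readable(prediction: dict) -> tuple:
    """Turn a raw pipeline prediction into a (label, score) pair."""
    return LABELS.get(prediction["label"], prediction["label"]), prediction["score"]

if __name__ == "__main__":
    # Imported lazily so the helper above works without transformers installed.
    from transformers import pipeline

    classifier = pipeline("text-classification", model=MODEL_ID)
    article = "Breaking: scientists confirm water is wet."
    label, score = readable(classifier(article)[0])
    print(f"{label} ({score:.2%})")
```

The linked notebook remains the authoritative reference for how the model was trained and evaluated.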