metrics:
model-index:
- name: ernie-2.0-base-en-Tweet_About_Disaster_Or_Not
  results: []
language:
- en
---

# ernie-2.0-base-en-Tweet_About_Disaster_Or_Not

This model is a fine-tuned version of [nghuyong/ernie-2.0-base-en](https://huggingface.co/nghuyong/ernie-2.0-base-en) on the disaster tweets dataset described in the Training and evaluation data section below.
It achieves the following results on the evaluation set:

## Model description

This is a binary classification model that determines whether input tweets are about a disaster or not.

For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Binary%20Classification/Transformer%20Comparison/Is%20This%20Tweet%20Referring%20to%20a%20Disaster%20or%20Not%3F%20-%20ERNIE.ipynb

### Associated Projects

This project is part of a comparison of multiple transformers. The others can be found at the following links:

- https://huggingface.co/DunnBC22/roberta-base-Tweet_About_Disaster_Or_Not
- https://huggingface.co/DunnBC22/deberta-v3-small-Tweet_About_Disaster_Or_Not
- https://huggingface.co/DunnBC22/albert-base-v2-Tweet_About_Disaster_Or_Not
- https://huggingface.co/DunnBC22/electra-base-emotion-Tweet_About_Disaster_Or_Not
- https://huggingface.co/DunnBC22/distilbert-base-uncased-Tweet_About_Disaster_Or_Not
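
Under the hood this is standard two-logit text classification: the model emits one logit per class, and the predicted label is the one with the higher softmax probability. A minimal, self-contained sketch of that post-processing step (the label names and example logits below are illustrative placeholders, not taken from the model's actual config):

```python
import math

def postprocess(logits, id2label={0: "NOT_DISASTER", 1: "DISASTER"}):
    # Numerically stable softmax over the two logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Pick the higher-probability class and report its label and score.
    pred = max(range(len(probs)), key=probs.__getitem__)
    return id2label[pred], probs[pred]

label, score = postprocess([-1.3, 2.7])
print(label, round(score, 3))  # DISASTER 0.982
```

In practice the `id2label` mapping is read from the fine-tuned model's configuration rather than hard-coded.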

## Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology.

The main limitation is the quality of the data source.

## Training and evaluation data

Dataset Source: https://www.kaggle.com/datasets/vstepanenko/disaster-tweets
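
Before fine-tuning on a dataset like this, it helps to sanity-check the binary label balance. A minimal pandas sketch (the `text`/`target` column names are an assumption about the Kaggle file's schema, and the rows below are invented stand-ins, not real dataset entries):

```python
import pandas as pd

# Tiny inline stand-in for the downloaded CSV; in practice you would
# use pd.read_csv(...) on the Kaggle file instead.
df = pd.DataFrame({
    "text": [
        "Wildfire spreading near the highway",
        "Excited for the concert tonight",
        "Severe flooding downtown",
    ],
    "target": [1, 0, 1],  # 1 = about a disaster, 0 = not
})

# Count examples per class to spot label imbalance early.
counts = df["target"].value_counts().to_dict()
print(counts)
```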

## Training procedure

The following hyperparameters were used during training:

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1
- Datasets 2.9.0
- Tokenizers 0.12.1