- **Input Format**: Text-based clothing reviews
- **Output Format**: Sentiment category labels

## Fine-tuning procedure
This model was fine-tuned on a relatively small dataset of 23,487 rows, broken down into train/eval/test splits. Nevertheless, the fine-tuned model performs slightly better than the base DistilBERT model on the test set.
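For reference, here is a minimal sketch of such a three-way split using the Hugging Face `datasets` library. The dataset ID and split proportions below are assumptions for illustration; this README does not state them:

```python
from datasets import load_dataset

# Hypothetical dataset ID; this README does not name the source dataset.
dataset = load_dataset("clothing-reviews", split="train")

# Assumed 80/10/10 train/eval/test proportions.
split = dataset.train_test_split(test_size=0.2, seed=42)
holdout = split["test"].train_test_split(test_size=0.5, seed=42)
train_ds, eval_ds, test_ds = split["train"], holdout["train"], holdout["test"]
```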

## Training result
It achieved the following results on the evaluation set:
- **Validation Loss**: 1.1677

### Comparison: base DistilBERT model vs. fine-tuned DistilBERT

| Model | Accuracy | Precision | Recall | F1 Score |
| --------------------- | -------- | --------- | ------ | -------- |
| DistilBERT base model | 0.79 | 0.77 | 0.79 | 0.77 |
| DistilBERT fine-tuned | 0.85 | 0.86 | 0.85 | 0.85 |
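To reproduce this kind of comparison, metrics like these can be computed with scikit-learn. This is only a sketch: the actual evaluation script is not part of this README, and the `weighted` averaging method is an assumption:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Placeholder labels; in practice, produce y_pred by running the model
# over the test split (see the Usage section below).
y_true = [0, 1, 2, 1, 0]
y_pred = [0, 1, 2, 2, 0]

accuracy = accuracy_score(y_true, y_pred)
# The averaging method is an assumption; the README does not state it.
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted"
)
print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
      f"recall={recall:.2f} f1={f1:.2f}")
```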

## Installation
To use this model, you'll need to install the Hugging Face Transformers library (the usage example below also needs PyTorch):

`pip install transformers torch`
## Usage
You can easily load the pre-trained model for sentiment analysis using Hugging Face's `DistilBertForSequenceClassification` and `DistilBertTokenizerFast`. Replace `<this-repo-id>` below with this repository's model ID on the Hugging Face Hub.

```python
from transformers import DistilBertForSequenceClassification, DistilBertTokenizerFast
import torch

# Load the fine-tuned model and tokenizer; <this-repo-id> is a placeholder.
model = DistilBertForSequenceClassification.from_pretrained("<this-repo-id>")
tokenizer = DistilBertTokenizerFast.from_pretrained("<this-repo-id>")
model.eval()  # inference mode

review = "This dress is amazing, I love it!"

# Tokenize the review and run a forward pass without tracking gradients.
inputs = tokenizer.encode(review, return_tensors="pt")
with torch.no_grad():
    outputs = model(inputs)
predicted_class = int(torch.argmax(outputs.logits))
```
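The model returns raw logits over the sentiment classes, and `predicted_class` is the index of the highest-scoring one. If the checkpoint's config carries an `id2label` mapping (an assumption; not every checkpoint defines meaningful label names), the index can be turned into a readable label:

```python
# Assumes model.config.id2label maps class indices to label names;
# checkpoints without it fall back to generic names like "LABEL_0".
label = model.config.id2label[predicted_class]
print(f"Predicted sentiment: {label}")
```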