# ParsBERT-nli-FarsTail-FarSick
This model is a fine-tuned version of HooshvareLab/bert-fa-zwnj-base, trained on the FarsTail and FarSick datasets for Persian natural language inference (NLI). It achieves the following results on the evaluation set:
- Loss: 0.8730
- Accuracy: 0.8055
- Precision (macro): 0.7900
- Precision (micro): 0.8055
- Recall (macro): 0.7926
- Recall (micro): 0.7926
- F1 (macro): 0.7909
- F1 (micro): 0.8055
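
These macro and micro averages can be reproduced from model predictions with scikit-learn. A minimal sketch, where `y_true` and `y_pred` are placeholder label-id arrays and not the actual evaluation data:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Placeholder label ids (0/1/2 for the three NLI classes);
# substitute the real evaluation labels and model predictions.
y_true = [0, 1, 2, 1, 0]
y_pred = [0, 1, 1, 1, 0]

accuracy = accuracy_score(y_true, y_pred)
p_macro, r_macro, f1_macro, _ = precision_recall_fscore_support(y_true, y_pred, average="macro")
p_micro, r_micro, f1_micro, _ = precision_recall_fscore_support(y_true, y_pred, average="micro")
print(accuracy, p_macro, p_micro, r_macro, r_micro, f1_macro, f1_micro)
```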
## How to use
```python
import transformers

model_name_or_path = "parsi-ai-nlpclass/ParsBERT-nli-FarsTail-FarSick"

# Load the tokenizer and the fine-tuned 3-class NLI classification head
tokenizer_pb = transformers.AutoTokenizer.from_pretrained(model_name_or_path)
model_pb = transformers.AutoModelForSequenceClassification.from_pretrained(
    model_name_or_path, num_labels=3
)

premise = "سلام خوبی؟"   # "Hi, how are you?"
hypothesis = "آره خوبم"  # "Yeah, I'm fine."

# Tokenize the premise–hypothesis pair and run a forward pass
print(model_pb(**tokenizer_pb(premise, hypothesis, return_tensors="pt")))
```
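
The forward pass returns raw logits for the three classes. Continuing from the block above, a minimal sketch of decoding them into probabilities; the label names here are an assumption, so check `model_pb.config.id2label` for the actual mapping:

```python
import torch

# Hypothetical label order -- verify against model_pb.config.id2label
labels = ["entailment", "contradiction", "neutral"]

inputs = tokenizer_pb(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model_pb(**inputs).logits

# Convert logits to class probabilities for the single input pair
probs = torch.softmax(logits, dim=-1)[0]
print({label: round(p.item(), 4) for label, p in zip(labels, probs)})
```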
## Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
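
A minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the output path and the per-epoch evaluation strategy are assumptions, not stated on this card (the optimizer defaults already match Adam with betas=(0.9, 0.999) and epsilon=1e-08):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="parsbert-nli-farstail-farsick",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumption: one evaluation per epoch, as in the results table
)
```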
## Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision (macro) | Precision (micro) | Recall (macro) | Recall (micro) | F1 (macro) | F1 (micro) |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.6248 | 1.0 | 1137 | 0.5391 | 0.7768 | 0.7677 | 0.7768 | 0.7728 | 0.7728 | 0.7647 | 0.7768 |
| 0.4449 | 2.0 | 2274 | 0.5017 | 0.8055 | 0.7909 | 0.8055 | 0.7963 | 0.7963 | 0.7932 | 0.8055 |
| 0.304 | 3.0 | 3411 | 0.5851 | 0.8125 | 0.8006 | 0.8125 | 0.7979 | 0.7979 | 0.7985 | 0.8125 |
| 0.1844 | 4.0 | 4548 | 0.7549 | 0.8140 | 0.8010 | 0.8140 | 0.7982 | 0.7982 | 0.7993 | 0.8140 |
| 0.1224 | 5.0 | 5685 | 0.8730 | 0.8055 | 0.7900 | 0.8055 | 0.7926 | 0.7926 | 0.7909 | 0.8055 |
## Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2