---
base_model: vinai/phobert-base-v2
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: phobert-base-v2-DACN1
  results: []
---
# phobert-base-v2-DACN1
This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5520
- Accuracy: 0.8784
- F1: 0.8781
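Since the card reports accuracy and F1, the checkpoint presumably carries a sequence-classification head. A minimal inference sketch follows; the checkpoint path `phobert-base-v2-DACN1` and the `text-classification` task are assumptions, since the card does not state the dataset or label set.

```python
# Hedged sketch: load the fine-tuned checkpoint for inference.
# The model path and task are assumptions not confirmed by this card.
from transformers import pipeline

def classify(texts, model_path="phobert-base-v2-DACN1"):
    # Note: the upstream PhoBERT docs recommend word-segmenting Vietnamese
    # input (e.g. with VnCoreNLP) before feeding it to the tokenizer.
    clf = pipeline("text-classification", model=model_path)
    return clf(texts)

# Usage (downloads/loads the checkpoint, so not run here):
# classify(["Bộ phim này rất hay."])
```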
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
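The hyperparameters above map directly onto a `transformers.TrainingArguments` configuration. A sketch of that config fragment (the `output_dir` name is an assumption):

```python
# Config fragment mirroring the reported hyperparameters.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="phobert-base-v2-DACN1",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```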
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.2782 | 200  | 0.4102 | 0.8061 | 0.8037 |
| No log        | 0.5563 | 400  | 0.3549 | 0.8456 | 0.8461 |
| No log        | 0.8345 | 600  | 0.3583 | 0.8466 | 0.8454 |
| 0.4175        | 1.1127 | 800  | 0.3401 | 0.8537 | 0.8523 |
| 0.4175        | 1.3908 | 1000 | 0.3179 | 0.8639 | 0.8646 |
| 0.4175        | 1.6690 | 1200 | 0.3148 | 0.8687 | 0.8691 |
| 0.4175        | 1.9471 | 1400 | 0.3240 | 0.8574 | 0.8582 |
| 0.3061        | 2.2253 | 1600 | 0.3148 | 0.8734 | 0.8740 |
| 0.3061        | 2.5035 | 1800 | 0.3224 | 0.8742 | 0.8743 |
| 0.3061        | 2.7816 | 2000 | 0.3288 | 0.8678 | 0.8671 |
| 0.2524        | 3.0598 | 2200 | 0.3512 | 0.8767 | 0.8769 |
| 0.2524        | 3.3380 | 2400 | 0.3421 | 0.8798 | 0.8796 |
| 0.2524        | 3.6161 | 2600 | 0.3089 | 0.8795 | 0.8799 |
| 0.2524        | 3.8943 | 2800 | 0.3569 | 0.8718 | 0.8725 |
| 0.2123        | 4.1725 | 3000 | 0.3840 | 0.8747 | 0.8744 |
| 0.2123        | 4.4506 | 3200 | 0.3681 | 0.8729 | 0.8736 |
| 0.2123        | 4.7288 | 3400 | 0.3575 | 0.8725 | 0.8732 |
| 0.1771        | 5.0070 | 3600 | 0.3575 | 0.8793 | 0.8794 |
| 0.1771        | 5.2851 | 3800 | 0.4285 | 0.8758 | 0.8752 |
| 0.1771        | 5.5633 | 4000 | 0.3843 | 0.8778 | 0.8782 |
| 0.1771        | 5.8414 | 4200 | 0.3951 | 0.8780 | 0.8779 |
| 0.1479        | 6.1196 | 4400 | 0.4364 | 0.8734 | 0.8726 |
| 0.1479        | 6.3978 | 4600 | 0.4273 | 0.8753 | 0.8752 |
| 0.1479        | 6.6759 | 4800 | 0.4596 | 0.8786 | 0.8783 |
| 0.1479        | 6.9541 | 5000 | 0.4498 | 0.8784 | 0.8785 |
| 0.1284        | 7.2323 | 5200 | 0.4592 | 0.8793 | 0.8795 |
| 0.1284        | 7.5104 | 5400 | 0.4796 | 0.8755 | 0.8747 |
| 0.1284        | 7.7886 | 5600 | 0.4830 | 0.8729 | 0.8722 |
| 0.1068        | 8.0668 | 5800 | 0.4879 | 0.8789 | 0.8787 |
| 0.1068        | 8.3449 | 6000 | 0.5213 | 0.8767 | 0.8761 |
| 0.1068        | 8.6231 | 6200 | 0.5114 | 0.8769 | 0.8764 |
| 0.1068        | 8.9013 | 6400 | 0.5090 | 0.8778 | 0.8777 |
| 0.0946        | 9.1794 | 6600 | 0.5192 | 0.8800 | 0.8800 |
| 0.0946        | 9.4576 | 6800 | 0.5517 | 0.8753 | 0.8748 |
| 0.0946        | 9.7357 | 7000 | 0.5520 | 0.8784 | 0.8781 |
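The accuracy and F1 columns above are the kind of values a `compute_metrics` callback passed to the `Trainer` would produce. A sketch, assuming weighted-average F1 via scikit-learn (the card does not state which averaging was used):

```python
# Hedged sketch of a Trainer compute_metrics callback; weighted F1 is an
# assumption, not confirmed by this card.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)  # pick the highest-scoring class
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),
    }
```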
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1