# PhoBert_Lexical_Dataset51KBoDuoiWithNewLexical

This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8197
- Accuracy: 0.8365
- F1: 0.8356
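
A minimal inference sketch is shown below. It assumes the checkpoint is a sequence-classification model (the card reports accuracy and F1, but the task and label set are not stated), and that the repository id matches the title above; the example sentence is a hypothetical, pre-segmented placeholder:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "phunganhsang/PhoBert_Lexical_Dataset51KBoDuoiWithNewLexical"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# PhoBERT expects word-segmented Vietnamese input (e.g. via VnCoreNLP or
# underthesea), with multi-syllable words joined by underscores.
text = "Tôi là sinh_viên"  # hypothetical pre-segmented example
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```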
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
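
These settings map onto a `Trainer` configuration roughly as sketched below. Only the values listed above come from the card; `output_dir` is a placeholder, and the evaluation cadence is inferred from the 200-step intervals in the results table:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="phobert-lexical-51k",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    eval_strategy="steps",  # assumption: the log evaluates every 200 steps
    eval_steps=200,
)
```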
### Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
---|---|---|---|---|---|
No log | 0.2506 | 200 | 0.7452 | 0.6740 | 0.6750 |
No log | 0.5013 | 400 | 0.6223 | 0.7292 | 0.7124 |
No log | 0.7519 | 600 | 0.5929 | 0.7394 | 0.7379 |
0.3501 | 1.0025 | 800 | 0.5602 | 0.7622 | 0.7502 |
0.3501 | 1.2531 | 1000 | 0.5534 | 0.7711 | 0.7628 |
0.3501 | 1.5038 | 1200 | 0.6296 | 0.7518 | 0.7517 |
0.3501 | 1.7544 | 1400 | 0.5476 | 0.7646 | 0.7562 |
0.2598 | 2.0050 | 1600 | 0.5547 | 0.7742 | 0.7672 |
0.2598 | 2.2556 | 1800 | 0.6056 | 0.7662 | 0.7628 |
0.2598 | 2.5063 | 2000 | 0.5986 | 0.7575 | 0.7566 |
0.2598 | 2.7569 | 2200 | 0.5618 | 0.7851 | 0.7795 |
0.2143 | 3.0075 | 2400 | 0.5639 | 0.7806 | 0.7783 |
0.2143 | 3.2581 | 2600 | 0.5837 | 0.7726 | 0.7643 |
0.2143 | 3.5088 | 2800 | 0.5915 | 0.7735 | 0.7724 |
0.2143 | 3.7594 | 3000 | 0.6132 | 0.7772 | 0.7735 |
0.184 | 4.0100 | 3200 | 0.5625 | 0.7946 | 0.7895 |
0.184 | 4.2607 | 3400 | 0.5947 | 0.7862 | 0.7841 |
0.184 | 4.5113 | 3600 | 0.5733 | 0.8033 | 0.7998 |
0.184 | 4.7619 | 3800 | 0.6023 | 0.7928 | 0.7882 |
0.1534 | 5.0125 | 4000 | 0.5951 | 0.7955 | 0.7901 |
0.1534 | 5.2632 | 4200 | 0.6342 | 0.7975 | 0.7953 |
0.1534 | 5.5138 | 4400 | 0.6433 | 0.8002 | 0.7982 |
0.1534 | 5.7644 | 4600 | 0.6160 | 0.8018 | 0.7998 |
0.1316 | 6.0150 | 4800 | 0.6199 | 0.8129 | 0.8102 |
0.1316 | 6.2657 | 5000 | 0.6368 | 0.8061 | 0.8043 |
0.1316 | 6.5163 | 5200 | 0.6319 | 0.8143 | 0.8099 |
0.1316 | 6.7669 | 5400 | 0.6837 | 0.7915 | 0.7900 |
0.1123 | 7.0175 | 5600 | 0.7237 | 0.8041 | 0.8036 |
0.1123 | 7.2682 | 5800 | 0.6456 | 0.8095 | 0.8079 |
0.1123 | 7.5188 | 6000 | 0.6659 | 0.8181 | 0.8152 |
0.1123 | 7.7694 | 6200 | 0.7378 | 0.8028 | 0.8021 |
0.0958 | 8.0201 | 6400 | 0.6836 | 0.8102 | 0.8095 |
0.0958 | 8.2707 | 6600 | 0.7123 | 0.8121 | 0.8122 |
0.0958 | 8.5213 | 6800 | 0.7342 | 0.8182 | 0.8163 |
0.0958 | 8.7719 | 7000 | 0.7296 | 0.8192 | 0.8178 |
0.0806 | 9.0226 | 7200 | 0.7005 | 0.8233 | 0.8208 |
0.0806 | 9.2732 | 7400 | 0.7088 | 0.8253 | 0.8237 |
0.0806 | 9.5238 | 7600 | 0.7216 | 0.8192 | 0.8185 |
0.0806 | 9.7744 | 7800 | 0.7438 | 0.8215 | 0.8205 |
0.0712 | 10.0251 | 8000 | 0.7037 | 0.8328 | 0.8315 |
0.0712 | 10.2757 | 8200 | 0.7506 | 0.8293 | 0.8282 |
0.0712 | 10.5263 | 8400 | 0.7582 | 0.8222 | 0.8215 |
0.0712 | 10.7769 | 8600 | 0.7381 | 0.8266 | 0.8258 |
0.0622 | 11.0276 | 8800 | 0.7813 | 0.8265 | 0.8251 |
0.0622 | 11.2782 | 9000 | 0.7565 | 0.8339 | 0.8330 |
0.0622 | 11.5288 | 9200 | 0.7879 | 0.8310 | 0.8307 |
0.0622 | 11.7794 | 9400 | 0.7770 | 0.8309 | 0.8305 |
0.0534 | 12.0301 | 9600 | 0.7488 | 0.8360 | 0.8353 |
0.0534 | 12.2807 | 9800 | 0.7980 | 0.8352 | 0.8340 |
0.0534 | 12.5313 | 10000 | 0.7541 | 0.8393 | 0.8381 |
0.0534 | 12.7820 | 10200 | 0.7996 | 0.8330 | 0.8324 |
0.0482 | 13.0326 | 10400 | 0.7863 | 0.8350 | 0.8343 |
0.0482 | 13.2832 | 10600 | 0.8185 | 0.8355 | 0.8349 |
0.0482 | 13.5338 | 10800 | 0.8225 | 0.8353 | 0.8346 |
0.0482 | 13.7845 | 11000 | 0.8023 | 0.8363 | 0.8355 |
0.0426 | 14.0351 | 11200 | 0.8098 | 0.8360 | 0.8352 |
0.0426 | 14.2857 | 11400 | 0.8205 | 0.8326 | 0.8319 |
0.0426 | 14.5363 | 11600 | 0.8161 | 0.8353 | 0.8344 |
0.0426 | 14.7870 | 11800 | 0.8197 | 0.8365 | 0.8356 |
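
The Accuracy and F1 columns can be reproduced with a `compute_metrics` hook like the following sketch, built on the `evaluate` library; the F1 averaging mode is an assumption, since the card does not state it:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=predictions, references=labels)["accuracy"],
        # "weighted" averaging is an assumption; the card does not specify it.
        "f1": f1.compute(predictions=predictions, references=labels, average="weighted")["f1"],
    }
```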
### Framework versions

- Transformers 4.43.1
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1