# trainer9

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 2.9622
- Precision: 0.7086
- Recall: 0.6667
- F1: 0.6491
- Accuracy: 0.6667
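The precision, recall, and F1 figures above appear to be weighted averages over classes (note that recall equals accuracy throughout the results table, which always holds for weighted-average recall in multi-class evaluation). A minimal pure-Python sketch of these metrics, using a toy label set since the actual evaluation data is not published here:

```python
from collections import Counter

def weighted_metrics(y_true, y_pred):
    """Weighted-average precision/recall/F1 plus accuracy, in pure Python."""
    labels = sorted(set(y_true))
    support = Counter(y_true)
    n = len(y_true)
    precision = recall = f1 = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        pred_c = sum(1 for p in y_pred if p == c)
        prec_c = tp / pred_c if pred_c else 0.0
        rec_c = tp / support[c]
        f1_c = 2 * prec_c * rec_c / (prec_c + rec_c) if prec_c + rec_c else 0.0
        w = support[c] / n          # class weight = class frequency
        precision += w * prec_c
        recall += w * rec_c
        f1 += w * f1_c
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / n
    return precision, recall, f1, accuracy

# Toy example (not the model's real predictions):
p, r, f, a = weighted_metrics([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0])
# Weighted recall always equals accuracy, as seen in the table below.
assert abs(r - a) < 1e-9
```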
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
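With a `linear` scheduler and no warmup listed, the learning rate ramps down from its 5e-05 peak to zero over the run (1590 steps, per the results table below). A pure-Python sketch of that schedule, assuming zero warmup steps since the card lists none:

```python
def linear_lr(step, total_steps=1590, peak_lr=5e-05, warmup_steps=0):
    """Linear LR schedule: ramp up over warmup_steps, then decay to 0.

    Mirrors the shape of transformers' get_linear_schedule_with_warmup;
    warmup_steps=0 is an assumption, as the card lists no warmup.
    """
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return peak_lr * remaining / max(1, total_steps - warmup_steps)

assert linear_lr(0) == 5e-05                   # full LR at the start (no warmup)
assert abs(linear_lr(795) - 2.5e-05) < 1e-12   # halfway through training
assert linear_lr(1590) == 0.0                  # decayed to zero at the end
```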
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.0066 | 0.57 | 30 | 2.1004 | 0.6594 | 0.6071 | 0.5923 | 0.6071 |
| 0.0014 | 1.13 | 60 | 2.6059 | 0.5319 | 0.6071 | 0.5515 | 0.6071 |
| 0.0006 | 1.7 | 90 | 2.1137 | 0.6915 | 0.6548 | 0.6539 | 0.6548 |
| 0.0003 | 2.26 | 120 | 2.1477 | 0.7482 | 0.7262 | 0.7148 | 0.7262 |
| 0.0002 | 2.83 | 150 | 2.2564 | 0.7218 | 0.6905 | 0.6836 | 0.6905 |
| 0.0002 | 3.4 | 180 | 2.3225 | 0.7218 | 0.6905 | 0.6836 | 0.6905 |
| 0.0001 | 3.96 | 210 | 2.3726 | 0.7218 | 0.6905 | 0.6836 | 0.6905 |
| 0.0001 | 4.53 | 240 | 2.4210 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0001 | 5.09 | 270 | 2.4925 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0001 | 5.66 | 300 | 2.5173 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0001 | 6.23 | 330 | 2.5448 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0001 | 6.79 | 360 | 2.5671 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0001 | 7.36 | 390 | 2.5925 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0001 | 7.92 | 420 | 2.6186 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0001 | 8.49 | 450 | 2.6545 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0 | 9.06 | 480 | 2.6665 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0 | 9.62 | 510 | 2.6921 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 10.19 | 540 | 2.7431 | 0.7133 | 0.6667 | 0.6497 | 0.6667 |
| 0.0 | 10.75 | 570 | 2.7458 | 0.7133 | 0.6667 | 0.6497 | 0.6667 |
| 0.0 | 11.32 | 600 | 2.7529 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 11.89 | 630 | 2.7619 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 12.45 | 660 | 2.7590 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 13.02 | 690 | 2.7805 | 0.7133 | 0.6667 | 0.6497 | 0.6667 |
| 0.0 | 13.58 | 720 | 2.7885 | 0.7133 | 0.6667 | 0.6497 | 0.6667 |
| 0.0 | 14.15 | 750 | 2.8009 | 0.7133 | 0.6667 | 0.6497 | 0.6667 |
| 0.0 | 14.72 | 780 | 2.8137 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 15.28 | 810 | 2.8259 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 15.85 | 840 | 2.8359 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 16.42 | 870 | 2.8432 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 16.98 | 900 | 2.8571 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 17.55 | 930 | 2.8652 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 18.11 | 960 | 2.8659 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 18.68 | 990 | 2.8728 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 19.25 | 1020 | 2.8805 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 19.81 | 1050 | 2.8874 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 20.38 | 1080 | 2.8919 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 20.94 | 1110 | 2.8988 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 21.51 | 1140 | 2.9038 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 22.08 | 1170 | 2.9073 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0 | 22.64 | 1200 | 2.9123 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0 | 23.21 | 1230 | 2.9166 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0 | 23.77 | 1260 | 2.9249 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0 | 24.34 | 1290 | 2.9307 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 24.91 | 1320 | 2.9341 | 0.7147 | 0.6786 | 0.6671 | 0.6786 |
| 0.0 | 25.47 | 1350 | 2.9471 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 26.04 | 1380 | 2.9527 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 26.6 | 1410 | 2.9548 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 27.17 | 1440 | 2.9568 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 27.74 | 1470 | 2.9585 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 28.3 | 1500 | 2.9596 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 28.87 | 1530 | 2.9614 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 29.43 | 1560 | 2.9620 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
| 0.0 | 30.0 | 1590 | 2.9622 | 0.7086 | 0.6667 | 0.6491 | 0.6667 |
### Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
## Model tree for SimoneJLaudani/trainer9

Base model: [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased)