# checkpoints_2
This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.8543
- Map@3: 0.7167
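Map@3 here is mean average precision with a cutoff of 3: each example contributes 1/rank if the correct answer appears among the top three ranked predictions, and 0 otherwise, averaged over the evaluation set. The card does not ship the metric code, so the following is a minimal sketch assuming a single correct label per example:

```python
def map_at_3(ranked_predictions, labels):
    """Mean average precision at 3, assuming one correct label per example.

    ranked_predictions: per example, candidate ids ordered best-first.
    labels: the single correct id for each example.
    """
    total = 0.0
    for preds, label in zip(ranked_predictions, labels):
        for rank, pred in enumerate(preds[:3], start=1):
            if pred == label:
                total += 1.0 / rank  # precision contribution at the hit rank
                break
    return total / len(labels)


# Example: a hit at rank 1 and a hit at rank 3 average to (1 + 1/3) / 2 = 2/3.
assert abs(map_at_3([[0, 2, 1], [1, 0, 2]], [0, 2]) - 2 / 3) < 1e-9
```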
## Model description
More information needed
## Intended uses & limitations
More information needed
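The card does not document the intended task. Given the Map@3 metric, a multiple-choice head is plausible; the sketch below is a hypothetical usage example under that assumption (swap in the correct `Auto*` class if the actual head differs):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

# AutoModelForMultipleChoice is an assumption inferred from the Map@3 metric.
repo_id = "BachNgoH/checkpoints_2"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMultipleChoice.from_pretrained(repo_id)

question = "Which planet is the largest?"      # illustrative inputs
choices = ["Earth", "Jupiter", "Mars"]
enc = tokenizer([question] * len(choices), choices,
                return_tensors="pt", padding=True)
# Multiple-choice models expect tensors of shape (batch, num_choices, seq_len).
enc = {k: v.unsqueeze(0) for k, v in enc.items()}
with torch.no_grad():
    logits = model(**enc).logits               # shape: (1, num_choices)
top3 = logits.squeeze(0).argsort(descending=True)[:3]
print([choices[i] for i in top3])
```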
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
- mixed_precision_training: Native AMP
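Note that total_train_batch_size = train_batch_size × gradient_accumulation_steps = 1 × 8 = 8. As a hedged reconstruction, these values map onto `transformers.TrainingArguments` roughly as follows (`output_dir` and anything not listed above are assumptions; the Adam betas and epsilon shown on the card are the library defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="checkpoints_2",        # assumed; not stated on the card
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=8,     # effective train batch size: 1 * 8 = 8
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=5,
    fp16=True,                         # "Native AMP" mixed-precision training
)
```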
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map@3  |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.395         | 0.19  | 25   | 1.3859          | 0.5889 |
| 1.3803        | 0.37  | 50   | 1.3840          | 0.6958 |
| 1.3842        | 0.56  | 75   | 1.3314          | 0.7194 |
| 1.2795        | 0.74  | 100  | 1.0021          | 0.7222 |
| 0.9662        | 0.93  | 125  | 0.9006          | 0.6597 |
| 0.9574        | 1.11  | 150  | 0.8355          | 0.6903 |
| 0.8909        | 1.3   | 175  | 0.8506          | 0.6750 |
| 0.8077        | 1.48  | 200  | 0.8180          | 0.7125 |
| 0.955         | 1.67  | 225  | 0.8069          | 0.7097 |
| 0.8664        | 1.85  | 250  | 0.8186          | 0.7028 |
| 0.9396        | 2.04  | 275  | 0.8091          | 0.6986 |
| 0.8141        | 2.22  | 300  | 0.8212          | 0.7083 |
| 0.7898        | 2.41  | 325  | 0.8531          | 0.7167 |
| 0.9143        | 2.59  | 350  | 0.8482          | 0.7125 |
| 0.8861        | 2.78  | 375  | 0.8229          | 0.7083 |
| 0.8569        | 2.96  | 400  | 0.8372          | 0.7181 |
| 0.8381        | 3.15  | 425  | 0.8516          | 0.7153 |
| 0.7671        | 3.33  | 450  | 0.8458          | 0.7167 |
| 0.8704        | 3.52  | 475  | 0.8651          | 0.7222 |
| 0.8733        | 3.7   | 500  | 0.8356          | 0.7153 |
| 0.7309        | 3.89  | 525  | 0.8476          | 0.7181 |
| 0.7793        | 4.07  | 550  | 0.8566          | 0.7167 |
| 0.7849        | 4.26  | 575  | 0.8644          | 0.7167 |
| 0.7776        | 4.44  | 600  | 0.8584          | 0.7167 |
| 0.7573        | 4.63  | 625  | 0.8546          | 0.7167 |
| 0.8115        | 4.81  | 650  | 0.8543          | 0.7167 |
| 0.869         | 5.0   | 675  | 0.8543          | 0.7167 |
### Framework versions
- Transformers 4.35.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.14.1