---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: CommitPredictor
  results: []
---
# CommitPredictor
This model is a fine-tuned version of [microsoft/codebert-base-mlm](https://huggingface.co/microsoft/codebert-base-mlm) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.5096
- Accuracy: 0.8933
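Since the base checkpoint is a masked-language-modeling model, the fine-tuned weights should load with the standard `fill-mask` pipeline. A minimal sketch, assuming a masked-token-prediction use case; the repository id below is a placeholder, not taken from this card:

```python
from transformers import pipeline

# Placeholder Hub id -- replace with the actual repository path of this model.
model_id = "<namespace>/CommitPredictor"

# The RoBERTa-based tokenizer of codebert-base-mlm uses "<mask>" as its mask token.
fill_mask = pipeline("fill-mask", model=model_id)

# Example: predict the masked token in a commit-message-like input.
predictions = fill_mask("Fix <mask> pointer dereference in the parser")
for p in predictions:
    print(p["token_str"], p["score"])
```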
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 21
- eval_batch_size: 21
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 63
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
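As a rough reproduction sketch, these settings map onto the `Trainer` API (which the `generated_from_trainer` tag indicates was used). `output_dir` and `evaluation_strategy` below are assumptions not stated in the card; the Adam betas and epsilon match the library defaults:

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters as TrainingArguments (Transformers 4.25.1).
training_args = TrainingArguments(
    output_dir="CommitPredictor",       # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=21,
    per_device_eval_batch_size=21,
    gradient_accumulation_steps=3,      # 21 * 3 = 63 effective train batch size
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                          # "Native AMP" mixed precision
    evaluation_strategy="epoch",        # assumed from the per-epoch eval rows below
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```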
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.1808        | 1.0   | 599   | 0.7826          | 0.8420   |
| 0.8381        | 2.0   | 1198  | 0.7008          | 0.8581   |
| 0.7733        | 3.0   | 1797  | 0.6717          | 0.8639   |
| 0.7416        | 4.0   | 2396  | 0.6460          | 0.8682   |
| 0.7143        | 5.0   | 2995  | 0.6331          | 0.8708   |
| 0.683         | 6.0   | 3594  | 0.6243          | 0.8723   |
| 0.6609        | 7.0   | 4193  | 0.6151          | 0.8744   |
| 0.6547        | 8.0   | 4792  | 0.5987          | 0.8765   |
| 0.6467        | 9.0   | 5391  | 0.5969          | 0.8776   |
| 0.6366        | 10.0  | 5990  | 0.5890          | 0.8786   |
| 0.6176        | 11.0  | 6589  | 0.5785          | 0.8801   |
| 0.6106        | 12.0  | 7188  | 0.5813          | 0.8803   |
| 0.6026        | 13.0  | 7787  | 0.5644          | 0.8834   |
| 0.6005        | 14.0  | 8386  | 0.5600          | 0.8841   |
| 0.5965        | 15.0  | 8985  | 0.5653          | 0.8832   |
| 0.5851        | 16.0  | 9584  | 0.5544          | 0.8850   |
| 0.5781        | 17.0  | 10183 | 0.5543          | 0.8849   |
| 0.5732        | 18.0  | 10782 | 0.5464          | 0.8862   |
| 0.5713        | 19.0  | 11381 | 0.5448          | 0.8860   |
| 0.5678        | 20.0  | 11980 | 0.5452          | 0.8869   |
| 0.5615        | 21.0  | 12579 | 0.5395          | 0.8883   |
| 0.5543        | 22.0  | 13178 | 0.5383          | 0.8881   |
| 0.555         | 23.0  | 13777 | 0.5456          | 0.8870   |
| 0.5517        | 24.0  | 14376 | 0.5314          | 0.8890   |
| 0.5478        | 25.0  | 14975 | 0.5355          | 0.8878   |
| 0.5423        | 26.0  | 15574 | 0.5316          | 0.8892   |
| 0.5402        | 27.0  | 16173 | 0.5261          | 0.8903   |
| 0.5385        | 28.0  | 16772 | 0.5343          | 0.8884   |
| 0.5358        | 29.0  | 17371 | 0.5288          | 0.8894   |
| 0.5319        | 30.0  | 17970 | 0.5200          | 0.8912   |
| 0.5292        | 31.0  | 18569 | 0.5142          | 0.8923   |
| 0.529         | 32.0  | 19168 | 0.5174          | 0.8915   |
| 0.5233        | 33.0  | 19767 | 0.5253          | 0.8905   |
| 0.5236        | 34.0  | 20366 | 0.5135          | 0.8917   |
| 0.5269        | 35.0  | 20965 | 0.5127          | 0.8931   |
| 0.5145        | 36.0  | 21564 | 0.5182          | 0.8909   |
| 0.5192        | 37.0  | 22163 | 0.5185          | 0.8912   |
| 0.5154        | 38.0  | 22762 | 0.5160          | 0.8927   |
| 0.5131        | 39.0  | 23361 | 0.5135          | 0.8926   |
| 0.513         | 40.0  | 23960 | 0.5125          | 0.8924   |
| 0.5106        | 41.0  | 24559 | 0.5137          | 0.8919   |
| 0.5079        | 42.0  | 25158 | 0.5052          | 0.8935   |
| 0.508         | 43.0  | 25757 | 0.5172          | 0.8926   |
| 0.5104        | 44.0  | 26356 | 0.5062          | 0.8933   |
| 0.5066        | 45.0  | 26955 | 0.5076          | 0.8933   |
| 0.5085        | 46.0  | 27554 | 0.5123          | 0.8922   |
| 0.5064        | 47.0  | 28153 | 0.5102          | 0.8937   |
| 0.5058        | 48.0  | 28752 | 0.5127          | 0.8929   |
| 0.5028        | 49.0  | 29351 | 0.5164          | 0.8930   |
| 0.5036        | 50.0  | 29950 | 0.5096          | 0.8933   |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu117
- Datasets 2.7.1
- Tokenizers 0.13.2