update model card README.md
README.md CHANGED

@@ -3,7 +3,6 @@ license: bsd-3-clause
 tags:
 - generated_from_trainer
 metrics:
-- rouge
 - bleu
 model-index:
 - name: CommitPredictorT5

@@ -17,13 +16,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Salesforce/codet5-base-multi-sum](https://huggingface.co/Salesforce/codet5-base-multi-sum) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 2.
+- Loss: 2.4669
+- Bleu: 0.0002
+- Precisions: [0.003189792663476874, 0.00016826518593303046, 0.000321853878339234, 0.0036900369003690036]
+- Brevity Penalty: 0.2394
+- Length Ratio: 0.4116
+- Translation Length: 10658
+- Reference Length: 25896
 
 ## Model description
 

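The BLEU statistics added above are mutually consistent: under the standard BLEU definitions, the reported Translation Length and Reference Length reproduce the reported Length Ratio and Brevity Penalty. A minimal check in Python, using only the values from the card:

```python
import math

# Values reported in the updated evaluation section above.
translation_length = 10658  # generated (hypothesis) tokens
reference_length = 25896    # reference tokens

length_ratio = translation_length / reference_length
# Standard BLEU brevity penalty, exp(1 - ref/hyp), applied when the
# hypothesis is shorter than the reference.
brevity_penalty = math.exp(1 - reference_length / translation_length)

print(round(length_ratio, 4))     # 0.4116
print(round(brevity_penalty, 4))  # 0.2394
```
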
@@ -43,36 +42,33 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 
-- eval_batch_size: 
+- train_batch_size: 42
+- eval_batch_size: 42
 - seed: 42
+- gradient_accumulation_steps: 3
+- total_train_batch_size: 126
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - num_epochs: 100
 
 ### Training results
 
-| Training Loss | Epoch | Step
-| 1.1952 | 15.0 | 12555 | 2.6149 | 0.0001 | 0.0 | 0.0001 | 0.0001 | 1.0 | 0.0003 |
-| 1.122 | 16.0 | 13392 | 2.6565 | 0.0001 | 0.0 | 0.0001 | 0.0001 | 1.0 | 0.0003 |
-| 1.0543 | 17.0 | 14229 | 2.6823 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0003 |
-| 1.0017 | 18.0 | 15066 | 2.7106 | 0.0001 | 0.0 | 0.0001 | 0.0001 | 1.0 | 0.0003 |
-| 0.9437 | 19.0 | 15903 | 2.7383 | 0.0001 | 0.0 | 0.0001 | 0.0001 | 1.0 | 0.0003 |
+| Training Loss | Epoch | Step | Validation Loss | Bleu | Precisions | Brevity Penalty | Length Ratio | Translation Length | Reference Length |
+|:-------------:|:-----:|:----:|:---------------:|:------:|:--------------------------------------------------------------------------------------------:|:---------------:|:------------:|:------------------:|:----------------:|
+| No log | 1.0 | 299 | 2.8109 | 0.0002 | [0.003640040444893832, 0.00019327406262079628, 0.0003745318352059925, 0.006024096385542169] | 0.1982 | 0.3819 | 9889 | 25896 |
+| 3.1102 | 2.0 | 598 | 2.6662 | 0.0002 | [0.004371150407311742, 0.00018691588785046728, 0.00036114120621162876, 0.005319148936170213] | 0.2074 | 0.3887 | 10065 | 25896 |
+| 3.1102 | 3.0 | 897 | 2.5869 | 0.0002 | [0.0033418517790446234, 0.00018321729571271528, 0.0003546099290780142, 0.005494505494505495] | 0.2132 | 0.3928 | 10173 | 25896 |
+| 2.6696 | 4.0 | 1196 | 2.5371 | 0.0002 | [0.0033398821218074658, 0.00018301610541727673, 0.0003522367030644593, 0.004672897196261682] | 0.2135 | 0.3931 | 10179 | 25896 |
+| 2.6696 | 5.0 | 1495 | 2.5077 | 0.0002 | [0.003243655790879603, 0.0001734304543877905, 0.0003356831151393085, 0.005208333333333333] | 0.2298 | 0.4047 | 10481 | 25896 |
+| 2.4738 | 6.0 | 1794 | 2.4810 | 0.0002 | [0.0029016345874842827, 0.00017784101013693757, 0.00034234851078397807, 0.0045662100456621] | 0.2220 | 0.3992 | 10338 | 25896 |
+| 2.3139 | 7.0 | 2093 | 2.4625 | 0.0002 | [0.002756130013305455, 0.0001722356183258698, 0.00033101621979476995, 0.00423728813559322] | 0.2319 | 0.4063 | 10521 | 25896 |
+| 2.3139 | 8.0 | 2392 | 2.4556 | 0.0002 | [0.0027348170501697473, 0.00016983695652173913, 0.0003266906239790918, 0.004273504273504274] | 0.2364 | 0.4094 | 10603 | 25896 |
+| 2.1842 | 9.0 | 2691 | 2.4470 | 0.0002 | [0.003198193961057285, 0.000169061707523246, 0.00032658393207054214, 0.004784688995215311] | 0.2378 | 0.4105 | 10630 | 25896 |
+| 2.1842 | 10.0 | 2990 | 2.4439 | 0.0002 | [0.0033203680865193054, 0.00017167381974248928, 0.000328515111695138, 0.0038022813688212928] | 0.2330 | 0.4070 | 10540 | 25896 |
+| 2.0831 | 11.0 | 3289 | 2.4435 | 0.0002 | [0.0032796101949025486, 0.000167897918065816, 0.000321853878339234, 0.003875968992248062] | 0.2401 | 0.4121 | 10671 | 25896 |
+| 1.9685 | 12.0 | 3588 | 2.4483 | 0.0002 | [0.0037652056381540836, 0.0001772421127259837, 0.0003397893306150187, 0.004098360655737705] | 0.2231 | 0.3999 | 10357 | 25896 |
+| 1.9685 | 13.0 | 3887 | 2.4557 | 0.0002 | [0.0033178500331785005, 0.00017143836790673754, 0.000327653997378768, 0.0036900369003690036] | 0.2334 | 0.4073 | 10548 | 25896 |
+| 1.8816 | 14.0 | 4186 | 2.4669 | 0.0002 | [0.003189792663476874, 0.00016826518593303046, 0.000321853878339234, 0.0036900369003690036] | 0.2394 | 0.4116 | 10658 | 25896 |
 
 
 ### Framework versions
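For anyone trying to reproduce this run, the hyperparameters in the hunk above map roughly onto a `transformers` `Seq2SeqTrainingArguments` configuration like the sketch below. This is an assumption-laden sketch rather than the authors' actual script: the output directory, evaluation strategy, and `predict_with_generate` flag are guesses; only the numeric values come from the card, and Adam with betas=(0.9,0.999) and epsilon=1e-08 matches the library default.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of a training configuration matching the listed hyperparameters.
# output_dir, evaluation_strategy, and predict_with_generate are assumptions.
training_args = Seq2SeqTrainingArguments(
    output_dir="CommitPredictorT5",   # assumed; name taken from the model-index entry
    learning_rate=2e-5,
    per_device_train_batch_size=42,
    per_device_eval_batch_size=42,
    gradient_accumulation_steps=3,    # 42 * 3 = 126 total train batch size
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",      # assumed from the per-epoch results table
    predict_with_generate=True,       # assumed; needed to compute BLEU during eval
)
```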
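Since the card identifies the base checkpoint as Salesforce/codet5-base-multi-sum (a CodeT5 code-summarization model), inference with this fine-tune should follow the usual CodeT5 pattern. The snippet below is a sketch under that assumption: the "CommitPredictorT5" identifier is a placeholder for wherever the checkpoint is actually hosted, and the example input is purely illustrative.

```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration

# Placeholder id: replace with the actual repository or local path of this fine-tune.
checkpoint = "CommitPredictorT5"

tokenizer = RobertaTokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

# Illustrative input: a small piece of code to summarize as a commit-style message.
code = "def add(a, b):\n    return a + b"
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
generated = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```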