metadata
license: apache-2.0
base_model: google/mt5-small
tags:
  - generated_from_keras_callback
model-index:
  - name: pakawadeep/mt5-small-finetuned-ctfl-augmented_05
    results: []

pakawadeep/mt5-small-finetuned-ctfl-augmented_05

This model is a fine-tuned version of google/mt5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Train Loss: 1.4151
  • Validation Loss: 1.2718
  • Train Rouge1: 8.6987
  • Train Rouge2: 2.1782
  • Train Rougel: 8.6987
  • Train Rougelsum: 8.6987
  • Train Gen Len: 11.9455
  • Epoch: 9
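
As a hedged usage sketch (not part of the original card), the checkpoint could be loaded for inference with the TensorFlow seq2seq classes in transformers; TF/Keras weights are assumed to be available on the Hub, since the card was generated from a Keras callback:

```python
# Hedged sketch: load the fine-tuned checkpoint for inference.
# Assumes TF/Keras weights are published on the Hub (the card was
# generated from a Keras callback).
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

repo_id = "pakawadeep/mt5-small-finetuned-ctfl-augmented_05"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(repo_id)

inputs = tokenizer("example input text", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```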

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
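
As a minimal sketch (an assumption, not code from the training run, which is not published), an equivalent optimizer can be reconstructed from the values above with the AdamWeightDecay class that transformers ships for TensorFlow:

```python
# Hedged sketch: rebuild the optimizer described above using the
# TensorFlow utilities in transformers. Values mirror the listed
# hyperparameters; the actual training script is not available.
import tensorflow as tf
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# training_precision was float32, i.e. no mixed-precision policy:
tf.keras.mixed_precision.set_global_policy("float32")
```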

Training results

| Train Loss | Validation Loss | Train Rouge1 | Train Rouge2 | Train Rougel | Train Rougelsum | Train Gen Len | Epoch |
|:----------:|:---------------:|:------------:|:------------:|:------------:|:---------------:|:-------------:|:-----:|
| 9.2952     | 2.6353          | 1.5618       | 0.0          | 1.5117       | 1.5288          | 16.5347       | 0     |
| 4.7507     | 1.8159          | 5.5776       | 0.2888       | 5.5611       | 5.5281          | 12.2475       | 1     |
| 3.4617     | 1.8004          | 4.7218       | 0.2888       | 4.7218       | 4.6723          | 11.2723       | 2     |
| 2.8272     | 1.7197          | 6.1410       | 0.8251       | 6.1410       | 6.0113          | 11.1634       | 3     |
| 2.4003     | 1.6328          | 7.7086       | 2.1782       | 7.7086       | 7.7086          | 11.7277       | 4     |
| 2.0952     | 1.5374          | 8.2037       | 2.1782       | 8.2037       | 8.2037          | 11.8713       | 5     |
| 1.8634     | 1.4405          | 8.2037       | 2.1782       | 8.2037       | 8.2037          | 11.9406       | 6     |
| 1.6782     | 1.3615          | 8.2037       | 2.1782       | 8.2037       | 8.2037          | 11.9307       | 7     |
| 1.5333     | 1.3046          | 8.6987       | 2.1782       | 8.6987       | 8.6987          | 11.9356       | 8     |
| 1.4151     | 1.2718          | 8.6987       | 2.1782       | 8.6987       | 8.6987          | 11.9455       | 9     |

Framework versions

  • Transformers 4.41.2
  • TensorFlow 2.15.0
  • Datasets 2.20.0
  • Tokenizers 0.19.1