---
library_name: transformers
base_model: hientptran/t5-small-finetuned-tifu
tags:
  - summarization
  - generated_from_trainer
metrics:
  - rouge
model-index:
  - name: t5-small-finetuned-tifu
    results: []
---

t5-small-finetuned-tifu

This model is a fine-tuned version of hientptran/t5-small-finetuned-tifu (i.e., continued fine-tuning from an earlier checkpoint of the same model) on an unknown dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

  • Loss: 2.9160
  • Rouge1: 21.4234
  • Rouge2: 5.1697
  • RougeL: 17.8628
  • RougeLsum: 18.3012
  • Gen Len: 19.4414
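
A minimal usage sketch with the transformers summarization pipeline, assuming the checkpoint is available on the Hugging Face Hub under hientptran/t5-small-finetuned-tifu (the input text is a placeholder):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint (assumes it is available on the Hub).
summarizer = pipeline("summarization", model="hientptran/t5-small-finetuned-tifu")

text = (
    "Today I spent three hours debugging a build failure before realizing "
    "I had been editing a copy of the config file instead of the one the "
    "build actually reads."
)

# T5 checkpoints usually carry a "summarize: " task prefix in their config,
# which the pipeline applies automatically; prepend it manually if this
# fine-tuned config dropped it. max_length=20 roughly matches the eval
# Gen Len (~19.4) reported above.
print(summarizer(text, max_length=20, min_length=5)[0]["summary_text"])
```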

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 3
  • mixed_precision_training: Native AMP
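
A minimal sketch of how these values map onto Seq2SeqTrainingArguments in recent Transformers releases; output_dir and the per-epoch evaluation cadence are illustrative assumptions, not taken from the card (though per-epoch eval matches the results table below):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-tifu",  # assumption: any local path works
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), eps=1e-8 by default
    lr_scheduler_type="linear",
    num_train_epochs=3,
    fp16=True,                    # "Native AMP" mixed-precision training
    eval_strategy="epoch",        # called evaluation_strategy in older releases
    predict_with_generate=True,   # so ROUGE is computed on generated summaries
)
```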

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 3.0712        | 1.0   | 2107 | 2.9372          | 20.9834 | 4.9652 | 17.5404 | 17.942    | 19.448  |
| 3.0523        | 2.0   | 4214 | 2.9216          | 21.3716 | 5.1541 | 17.7939 | 18.2228   | 19.4411 |
| 3.0546        | 3.0   | 6321 | 2.9160          | 21.4234 | 5.1697 | 17.8628 | 18.3012   | 19.4414 |
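
For reference, a sketch of how ROUGE scores like those above are typically computed for generated_from_trainer cards, using the evaluate library (the prediction and reference strings are placeholders; in training they come from model.generate() output and the dataset's reference summaries):

```python
import evaluate

rouge = evaluate.load("rouge")

# Placeholder strings for illustration only.
predictions = ["the cat sat on the mat"]
references = ["a cat was sitting on the mat"]

scores = rouge.compute(predictions=predictions, references=references)
# Keys: rouge1, rouge2, rougeL, rougeLsum. Trainer cards report these
# scaled by 100, e.g. 21.4234 rather than 0.214234.
print({k: round(v * 100, 4) for k, v in scores.items()})
```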

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0