---
license: other
base_model: boun-tabi-LMG/TURNA
tags:
  - generated_from_trainer
metrics:
  - rouge
  - bleu
model-index:
  - name: TURNA_TSATweets_cond_gen_no_instruction
    results: []
---

TURNA_TSATweets_cond_gen_no_instruction

This model is a fine-tuned version of boun-tabi-LMG/TURNA on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0704
  • Rouge1: 0.705
  • Rouge2: 0.095
  • Rougel: 0.705
  • Rougelsum: 0.705
  • Bleu: 0.0
  • Precisions: [0.705, 0.0, 0.0, 0.0]
  • Brevity Penalty: 1.0
  • Length Ratio: 1.0
  • Translation Length: 200
  • Reference Length: 200
  • Meteor: 0.3525
  • Score: 29.5
  • Num Edits: 59
  • Ref Length: 200.0
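
This card does not document the expected input or output format. As a hedged illustration only, the sketch below shows how a TURNA-based sequence-to-sequence checkpoint can be loaded with the Transformers API; the repository id, the example tweet, and the generation settings are assumptions rather than documented usage.

```python
# Minimal inference sketch (assumed usage; not documented in this card).
# TURNA is a T5-style encoder-decoder model, so the seq2seq classes apply.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Holmeister/TURNA_TSATweets_cond_gen_no_instruction"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical Turkish tweet; the card does not specify a prompt format
# ("no_instruction" suggests the raw text is used as input).
inputs = tokenizer("Bu film harikaydı, kesinlikle tavsiye ederim!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```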

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
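
As a rough illustration only, the hyperparameters above map onto the Hugging Face `Seq2SeqTrainingArguments` API as shown below; the output directory and anything not listed above are placeholders, not taken from this card.

```python
# Hedged sketch: the listed hyperparameters expressed as Trainer arguments.
# Values not listed in the card (e.g. output_dir) are placeholders.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="TURNA_TSATweets_cond_gen_no_instruction",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    # "Adam with betas=(0.9, 0.999) and epsilon=1e-08" matches the Trainer's
    # default AdamW settings.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```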

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bleu | Precisions | Brevity Penalty | Length Ratio | Translation Length | Reference Length | Meteor | Score | Num Edits | Ref Length |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----:|:----------:|:---------------:|:------------:|:------------------:|:----------------:|:------:|:-----:|:---------:|:----------:|
| No log | 0.5 | 82 | 0.0794 | 0.6569 | 0.0586 | 0.6552 | 0.6569 | 0.0 | [0.656896551724138, 0.0, 0.0, 0.0] | 1.0 | 1.0 | 580 | 580 | 0.3284 | 34.3103 | 199 | 580.0 |
| 1.7625 | 1.0 | 164 | 0.0707 | 0.6897 | 0.0638 | 0.6897 | 0.6879 | 0.0 | [0.6896551724137931, 0.0, 0.0, 0.0] | 1.0 | 1.0 | 580 | 580 | 0.3448 | 31.0345 | 180 | 580.0 |
| 1.7625 | 1.5 | 246 | 0.0778 | 0.7017 | 0.0284 | 0.7017 | 0.7 | 0.0 | [0.7017241379310345, 0.0, 0.0, 0.0] | 1.0 | 1.0 | 580 | 580 | 0.3509 | 29.8276 | 173 | 580.0 |
| 0.0921 | 2.0 | 328 | 0.0724 | 0.7259 | 0.0362 | 0.7241 | 0.7224 | 0.0 | [0.7241379310344828, 0.0, 0.0, 0.0] | 1.0 | 1.0 | 580 | 580 | 0.3621 | 27.5862 | 160 | 580.0 |
| 0.0921 | 2.5 | 410 | 0.0948 | 0.6845 | 0.1224 | 0.6828 | 0.6828 | 0.0 | [0.6827586206896552, 0.0, 0.0, 0.0] | 1.0 | 1.0 | 580 | 580 | 0.3414 | 31.7241 | 184 | 580.0 |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
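
To reproduce the evaluation environment, installed versions can be checked against the list above; the snippet below is a small convenience check, not part of the original training code.

```python
# Print installed library versions to compare against the versions listed above.
import datasets
import tokenizers
import torch
import transformers

for name, module in [("Transformers", transformers), ("PyTorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```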