njogerera_translation_model_V1

This model is a fine-tuned version of t5-small; the fine-tuning dataset is not documented in this card. It achieves the following results on the evaluation set:

  • Loss: 3.5203
  • Bleu: 1.0305
  • Gen Len: 11.919
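
The Bleu value is on sacrebleu's 0-100 scale, so a score of about 1.0 indicates output that is still far from usable translation quality. Below is a minimal sketch of how these metrics are typically computed for seq2seq fine-tunes, assuming the usual sacrebleu setup via the evaluate library; the exact metric configuration used for this run is not documented.

```python
# Hedged sketch: corpus BLEU via evaluate's sacrebleu wrapper.
# The actual metric setup behind the numbers above is not documented.
import evaluate

bleu = evaluate.load("sacrebleu")

predictions = ["the cat sat on the mat"]          # model outputs (illustrative)
references = [["the cat is sitting on the mat"]]  # one list of references per prediction

result = bleu.compute(predictions=predictions, references=references)
print(result["score"])  # corpus-level BLEU on a 0-100 scale
```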

Model description

More information needed

Intended uses & limitations

More information needed
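
Since the intended task format is undocumented, here is a minimal loading sketch using the transformers Auto classes; the task prefix and language pair in the example input are assumptions, not documented behavior of this checkpoint.

```python
# Hedged sketch: loading the checkpoint and generating a translation.
# The expected input format (T5 task prefix, language direction) is not
# documented for this model, so the prompt below is only illustrative.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "vertigo23/njogerera_translation_model_V1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "translate English to Luganda: Good morning."  # assumed prefix/pair
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```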

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 2
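
A minimal configuration sketch reproducing the list above with the Seq2SeqTrainingArguments API from transformers 4.33; the dataset, tokenizer, and preprocessing steps are omitted because they are not documented. Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer, so it needs no explicit argument.

```python
# Hedged sketch: training arguments matching the hyperparameters above.
# output_dir is a placeholder; the real training script is not published.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="njogerera_translation_model_V1",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="epoch",  # matches the per-epoch rows below
    predict_with_generate=True,   # required to compute Bleu / Gen Len
)
```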

Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu   | Gen Len |
|---------------|-------|------|-----------------|--------|---------|
| No log        | 1.0   | 250  | 3.5681          | 0.7832 | 11.529  |
| 3.8908        | 2.0   | 500  | 3.5203          | 1.0305 | 11.919  |

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3