mt5-small-finetuned-mt5-small-poem4e

This model is a fine-tuned version of google/mt5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: nan

Model description

More information needed

Intended uses & limitations

More information needed
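
Pending more details from the author, here is a minimal loading-and-generation sketch using the Transformers library. The repository id comes from this card, but the prompt is a placeholder: the expected input format is undocumented (the model name suggests poem generation).

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "shkna1368/mt5-small-finetuned-mt5-small-poem4e"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder prompt: the training task and expected input format are undocumented.
inputs = tokenizer("Write a poem about the sea", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Given the nan evaluation loss reported above, outputs from this checkpoint may be degenerate; verify them before relying on the model.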

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a training-arguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
  • mixed_precision_training: Native AMP
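
These settings map onto the standard Trainer setup; below is a sketch assuming Seq2SeqTrainingArguments was used. The output_dir and any option not listed above are assumptions, not the author's published script.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-mt5-small-poem4e",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumed; consistent with the per-epoch validation losses below
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
)
```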

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 121  | nan             |
| No log        | 2.0   | 242  | nan             |
| No log        | 3.0   | 363  | nan             |
| No log        | 4.0   | 484  | nan             |
| 0.0           | 5.0   | 605  | nan             |
| 0.0           | 6.0   | 726  | nan             |
| 0.0           | 7.0   | 847  | nan             |
| 0.0           | 8.0   | 968  | nan             |
| 0.0           | 9.0   | 1089 | nan             |
| 0.0           | 10.0  | 1210 | nan             |
| 0.0           | 11.0  | 1331 | nan             |
| 0.0           | 12.0  | 1452 | nan             |
| 0.0           | 13.0  | 1573 | nan             |
| 0.0           | 14.0  | 1694 | nan             |
| 0.0           | 15.0  | 1815 | nan             |
| 0.0           | 16.0  | 1936 | nan             |
| 0.0           | 17.0  | 2057 | nan             |
| 0.0           | 18.0  | 2178 | nan             |
| 0.0           | 19.0  | 2299 | nan             |
| 0.0           | 20.0  | 2420 | nan             |
| 0.0           | 21.0  | 2541 | nan             |
| 0.0           | 22.0  | 2662 | nan             |
| 0.0           | 23.0  | 2783 | nan             |
| 0.0           | 24.0  | 2904 | nan             |
| 0.0           | 25.0  | 3025 | nan             |
| 0.0           | 26.0  | 3146 | nan             |
| 0.0           | 27.0  | 3267 | nan             |
| 0.0           | 28.0  | 3388 | nan             |
| 0.0           | 29.0  | 3509 | nan             |
| 0.0           | 30.0  | 3630 | nan             |
| 0.0           | 31.0  | 3751 | nan             |
| 0.0           | 32.0  | 3872 | nan             |
| 0.0           | 33.0  | 3993 | nan             |
| 0.0           | 34.0  | 4114 | nan             |
| 0.0           | 35.0  | 4235 | nan             |
| 0.0           | 36.0  | 4356 | nan             |
| 0.0           | 37.0  | 4477 | nan             |
| 0.0           | 38.0  | 4598 | nan             |
| 0.0           | 39.0  | 4719 | nan             |
| 0.0           | 40.0  | 4840 | nan             |
| 0.0           | 41.0  | 4961 | nan             |
| 0.0           | 42.0  | 5082 | nan             |
| 0.0           | 43.0  | 5203 | nan             |
| 0.0           | 44.0  | 5324 | nan             |
| 0.0           | 45.0  | 5445 | nan             |
| 0.0           | 46.0  | 5566 | nan             |
| 0.0           | 47.0  | 5687 | nan             |
| 0.0           | 48.0  | 5808 | nan             |
| 0.0           | 49.0  | 5929 | nan             |
| 0.0           | 50.0  | 6050 | nan             |
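
A note on these numbers (an inference from the table, not something the author documents): the validation loss is nan from the first epoch and the training loss collapses to 0.0, which usually signals numerical overflow rather than convergence. T5/mT5 checkpoints are a well-known case of fp16 instability, so if retraining, a common workaround is to disable fp16 mixed precision, or use bf16 on hardware that supports it:

```python
from transformers import Seq2SeqTrainingArguments

# A suggested retraining tweak, not the author's documented fix:
# mT5 often overflows under fp16 AMP, producing nan losses.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-mt5-small-poem4e",  # assumed name
    fp16=False,  # disable fp16 mixed precision
    bf16=True,   # requires bfloat16-capable hardware (e.g. Ampere+ GPUs)
)
```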

Framework versions

  • Transformers 4.41.0
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Safetensors

  • Model size: 300M params
  • Tensor type: F32

Model tree for shkna1368/mt5-small-finetuned-mt5-small-poem4e

  • Base model: google/mt5-small