# mt5-small-ainu-romanize
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0804
- Chrf: 97.6659
## Model description
More information needed
## Intended uses & limitations
More information needed
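In the absence of documented usage, here is a minimal inference sketch. It assumes the checkpoint is published as `aynumosir/mt5-small-ainu-romanize` (the repo id shown on the model page) and that the model takes raw Ainu kana text with no task prefix; the exact input format is not documented in this card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "aynumosir/mt5-small-ainu-romanize"  # assumed repo id, from the model page
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Assumption: plain Ainu kana input, no task prefix.
text = "イランカラㇷ゚テ"  # "irankarapte" (a common Ainu greeting)
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```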
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 10
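These settings map onto `Seq2SeqTrainingArguments` roughly as follows. This is a reconstruction from the list above, not the author's training script; `output_dir` and the evaluation/generation flags are assumptions, and the Adam betas/epsilon listed above are the library defaults.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-ainu-romanize",  # assumed
    learning_rate=5e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # total train batch size: 2 * 16 = 32
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=10,
    seed=42,
    evaluation_strategy="epoch",     # assumption: the results table logs once per epoch
    predict_with_generate=True,      # assumption: needed to compute chrF during evaluation
)
```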
### Training results
| Training Loss | Epoch  | Step  | Validation Loss | Chrf    |
|:-------------:|:------:|:-----:|:---------------:|:-------:|
| 0.3825        | 0.9999 | 4701  | 0.2094          | 93.1478 |
| 0.213         | 2.0    | 9403  | 0.1313          | 95.6819 |
| 0.1512        | 2.9999 | 14104 | 0.1083          | 96.3753 |
| 0.1164        | 4.0    | 18806 | 0.1005          | 96.6527 |
| 0.0979        | 4.9999 | 23507 | 0.0901          | 97.1337 |
| 0.0844        | 6.0    | 28209 | 0.0859          | 97.2226 |
| 0.0779        | 6.9999 | 32910 | 0.0807          | 97.4449 |
| 0.0574        | 8.0    | 37612 | 0.0817          | 97.5779 |
| 0.0544        | 8.9999 | 42313 | 0.0802          | 97.6210 |
| 0.0484        | 9.9989 | 47010 | 0.0804          | 97.6659 |
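The Chrf column is a chrF score on a 0-100 scale (higher is better). For reference, a sketch of how such a score can be computed with the `evaluate` library; the strings below are placeholders, not data from this model's evaluation set:

```python
import evaluate

chrf = evaluate.load("chrf")  # wraps sacrebleu's chrF implementation

predictions = ["irankarapte"]   # placeholder model output
references = [["irankarapte"]]  # placeholder gold reference
result = chrf.compute(predictions=predictions, references=references)
print(result["score"])  # chrF on the same 0-100 scale as the table above
```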
### Framework versions
- Transformers 4.40.1
- Pytorch 2.1.2
- Datasets 2.19.0
- Tokenizers 0.19.1