
pijarcandra22/t5Bali2Indo

This model is a fine-tuned version of google-t5/t5-small on an unknown dataset. It achieves the following results after the final training epoch:

  • Train Loss: 0.4589
  • Validation Loss: 1.5981
  • Epoch: 97
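
The repository name suggests translation from Balinese to Indonesian. Below is a minimal inference sketch, assuming the uploaded weights are in TensorFlow format (the model was trained with TensorFlow, see the framework versions) and that plain source text without a task prefix is the expected input; both assumptions should be checked against the actual training setup.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Load the tokenizer and TensorFlow weights from this repository.
tokenizer = AutoTokenizer.from_pretrained("pijarcandra22/t5Bali2Indo")
model = TFAutoModelForSeq2SeqLM.from_pretrained("pijarcandra22/t5Bali2Indo")

# Hypothetical Balinese input; the exact preprocessing (e.g. a task
# prefix) used during fine-tuning is not documented in this card.
text = "..."  # replace with a Balinese sentence
inputs = tokenizer(text, return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```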

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code sketch of this configuration follows the list):

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32

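A sketch of how the optimizer configuration above could be rebuilt in code, assuming the AdamWeightDecay class from transformers and a standard Keras compile step; the actual training script is not published with this card.

```python
from transformers import AdamWeightDecay, TFAutoModelForSeq2SeqLM

# Values taken directly from the optimizer config listed above.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

model = TFAutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-small")
# transformers TF models compute their loss internally when labels are
# provided, so compile() can be called without an explicit loss.
model.compile(optimizer=optimizer)
```
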
Training results

Train Loss Validation Loss Epoch
0.5701 1.5227 0
0.5700 1.5142 1
0.5690 1.5212 2
0.5623 1.5221 3
0.5686 1.5265 4
0.5592 1.5261 5
0.5619 1.5208 6
0.5615 1.5224 7
0.5679 1.5230 8
0.5630 1.5250 9
0.5621 1.5238 10
0.5617 1.5270 11
0.5520 1.5271 12
0.5530 1.5347 13
0.5578 1.5278 14
0.5497 1.5280 15
0.5513 1.5333 16
0.5506 1.5371 17
0.5504 1.5337 18
0.5499 1.5374 19
0.5436 1.5405 20
0.5420 1.5382 21
0.5462 1.5377 22
0.5402 1.5367 23
0.5422 1.5345 24
0.5408 1.5385 25
0.5434 1.5378 26
0.5343 1.5381 27
0.5368 1.5404 28
0.5410 1.5407 29
0.5368 1.5417 30
0.5344 1.5431 31
0.5343 1.5428 32
0.5343 1.5454 33
0.5300 1.5499 34
0.5325 1.5505 35
0.5269 1.5427 36
0.5217 1.5493 37
0.5197 1.5560 38
0.5247 1.5520 39
0.5200 1.5557 40
0.5270 1.5551 41
0.5241 1.5518 42
0.5163 1.5492 43
0.5227 1.5520 44
0.5221 1.5552 45
0.5123 1.5523 46
0.5173 1.5572 47
0.5194 1.5571 48
0.5159 1.5566 49
0.5137 1.5591 50
0.5127 1.5533 51
0.5094 1.5516 52
0.5095 1.5574 53
0.5023 1.5609 54
0.5040 1.5604 55
0.5019 1.5650 56
0.5093 1.5577 57
0.5050 1.5592 58
0.5069 1.5623 59
0.4998 1.5635 60
0.4936 1.5674 61
0.4997 1.5651 62
0.4970 1.5648 63
0.4927 1.5651 64
0.4933 1.5719 65
0.4951 1.5699 66
0.4963 1.5690 67
0.4906 1.5728 68
0.4927 1.5740 69
0.4884 1.5763 70
0.4917 1.5766 71
0.4854 1.5740 72
0.4793 1.5741 73
0.4824 1.5790 74
0.4830 1.5760 75
0.4842 1.5784 76
0.4786 1.5794 77
0.4815 1.5733 78
0.4791 1.5800 79
0.4784 1.5796 80
0.4743 1.5835 81
0.4766 1.5832 82
0.4767 1.5814 83
0.4800 1.5832 84
0.4787 1.5847 85
0.4681 1.5849 86
0.4727 1.5875 87
0.4716 1.5838 88
0.4686 1.5849 89
0.4708 1.5851 90
0.4697 1.5911 91
0.4705 1.5910 92
0.4695 1.5934 93
0.4670 1.5914 94
0.4643 1.5969 95
0.4636 1.5945 96
0.4589 1.5981 97

Framework versions

  • Transformers 4.35.2
  • TensorFlow 2.14.0
  • Datasets 2.15.0
  • Tokenizers 0.15.0
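
A quick way to confirm that a local environment matches these versions (a convenience check, not part of the original training setup):

```python
import transformers, tensorflow, datasets, tokenizers

# Print installed versions; they should match the list above.
print("Transformers:", transformers.__version__)  # 4.35.2
print("TensorFlow:  ", tensorflow.__version__)    # 2.14.0
print("Datasets:    ", datasets.__version__)      # 2.15.0
print("Tokenizers:  ", tokenizers.__version__)    # 0.15.0
```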