---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/eng-limbu-t5-manual-002
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# bedus-creation/eng-limbu-t5-manual-002

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset (presumably an English–Limbu parallel corpus, given the model name).
It achieves the following results on the evaluation set:
- Train Loss: 3.0687
- Validation Loss: 3.7774
- Epoch: 99

## Model description

More information needed

## Intended uses & limitations

More information needed
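
No usage details have been published yet. As a placeholder, here is a minimal inference sketch, assuming the checkpoint loads through the standard `transformers` seq2seq API (the TF classes, since training used Keras) and that the task is English→Limbu translation as the model name suggests:

```python
MODEL_ID = "bedus-creation/eng-limbu-t5-manual-002"

def translate(text: str, max_new_tokens: int = 64) -> str:
    """Generate a (hypothetical) English-to-Limbu translation with the checkpoint."""
    # Imports are kept local so the sketch can be read without TensorFlow installed.
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="tf")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example (downloads the model on first use):
# print(translate("Good morning."))
```

Note that t5-small was not pretrained on Limbu script, so output quality for a low-resource target language should be evaluated carefully before any downstream use.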

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
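
The same configuration can be rebuilt in code; a sketch using the `AdamWeightDecay` optimizer that ships with `transformers`' TensorFlow utilities (assumed here to match the `name` field above):

```python
def build_optimizer():
    """Recreate the training optimizer from the hyperparameters listed above."""
    # AdamWeightDecay is transformers' TF-side optimizer; TensorFlow is required at call time.
    from transformers import AdamWeightDecay

    return AdamWeightDecay(
        learning_rate=2e-5,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-7,
        amsgrad=False,
        weight_decay_rate=0.01,
    )
```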

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 6.7285     | 5.8526          | 0     |
| 5.8608     | 5.3145          | 1     |
| 5.3625     | 5.0804          | 2     |
| 5.1012     | 4.9629          | 3     |
| 4.9323     | 4.8258          | 4     |
| 4.7733     | 4.7266          | 5     |
| 4.6924     | 4.6181          | 6     |
| 4.5603     | 4.5446          | 7     |
| 4.4889     | 4.4844          | 8     |
| 4.4311     | 4.4172          | 9     |
| 4.3759     | 4.3850          | 10    |
| 4.3222     | 4.3224          | 11    |
| 4.2802     | 4.2932          | 12    |
| 4.2507     | 4.2517          | 13    |
| 4.1858     | 4.2192          | 14    |
| 4.1643     | 4.2057          | 15    |
| 4.1406     | 4.2012          | 16    |
| 4.0881     | 4.1809          | 17    |
| 4.0782     | 4.1407          | 18    |
| 4.0536     | 4.1458          | 19    |
| 4.0260     | 4.1167          | 20    |
| 4.0093     | 4.1147          | 21    |
| 3.9739     | 4.0881          | 22    |
| 3.9548     | 4.0896          | 23    |
| 3.9533     | 4.0832          | 24    |
| 3.9363     | 4.0328          | 25    |
| 3.9258     | 4.0340          | 26    |
| 3.8973     | 4.0176          | 27    |
| 3.8789     | 4.0131          | 28    |
| 3.8784     | 4.0032          | 29    |
| 3.8391     | 3.9896          | 30    |
| 3.8506     | 3.9902          | 31    |
| 3.8081     | 3.9742          | 32    |
| 3.8068     | 3.9699          | 33    |
| 3.7911     | 3.9409          | 34    |
| 3.7909     | 3.9411          | 35    |
| 3.7658     | 3.9416          | 36    |
| 3.7317     | 3.9270          | 37    |
| 3.7404     | 3.9225          | 38    |
| 3.7321     | 3.9159          | 39    |
| 3.7112     | 3.9071          | 40    |
| 3.7039     | 3.9003          | 41    |
| 3.6980     | 3.8723          | 42    |
| 3.6639     | 3.8921          | 43    |
| 3.6612     | 3.8674          | 44    |
| 3.6497     | 3.8624          | 45    |
| 3.6284     | 3.8694          | 46    |
| 3.6403     | 3.8701          | 47    |
| 3.5968     | 3.8516          | 48    |
| 3.5749     | 3.8435          | 49    |
| 3.5751     | 3.8545          | 50    |
| 3.5736     | 3.8304          | 51    |
| 3.5722     | 3.8247          | 52    |
| 3.5431     | 3.8396          | 53    |
| 3.5280     | 3.8265          | 54    |
| 3.5288     | 3.8225          | 55    |
| 3.5014     | 3.8248          | 56    |
| 3.5046     | 3.7864          | 57    |
| 3.5144     | 3.8151          | 58    |
| 3.4876     | 3.8117          | 59    |
| 3.4744     | 3.8099          | 60    |
| 3.4667     | 3.8110          | 61    |
| 3.4503     | 3.8165          | 62    |
| 3.4516     | 3.7818          | 63    |
| 3.4484     | 3.8165          | 64    |
| 3.4146     | 3.8282          | 65    |
| 3.3911     | 3.8151          | 66    |
| 3.4345     | 3.7842          | 67    |
| 3.4155     | 3.7777          | 68    |
| 3.3755     | 3.8011          | 69    |
| 3.3595     | 3.7737          | 70    |
| 3.3727     | 3.7744          | 71    |
| 3.3670     | 3.7683          | 72    |
| 3.3493     | 3.7721          | 73    |
| 3.3337     | 3.7927          | 74    |
| 3.3260     | 3.7670          | 75    |
| 3.3160     | 3.7802          | 76    |
| 3.3120     | 3.7885          | 77    |
| 3.3101     | 3.7675          | 78    |
| 3.2842     | 3.7837          | 79    |
| 3.2765     | 3.7607          | 80    |
| 3.2684     | 3.7805          | 81    |
| 3.2576     | 3.7578          | 82    |
| 3.2637     | 3.7661          | 83    |
| 3.2414     | 3.7964          | 84    |
| 3.2241     | 3.7806          | 85    |
| 3.2294     | 3.7762          | 86    |
| 3.2067     | 3.7526          | 87    |
| 3.1882     | 3.7809          | 88    |
| 3.2020     | 3.7670          | 89    |
| 3.1646     | 3.7671          | 90    |
| 3.1873     | 3.7586          | 91    |
| 3.1619     | 3.7843          | 92    |
| 3.1608     | 3.7573          | 93    |
| 3.1648     | 3.7654          | 94    |
| 3.1107     | 3.7811          | 95    |
| 3.1221     | 3.7974          | 96    |
| 3.0947     | 3.7810          | 97    |
| 3.1046     | 3.7647          | 98    |
| 3.0687     | 3.7774          | 99    |
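
Note that validation loss bottoms out well before the final epoch. A quick check over the table above (pure Python, values copied from the log) finds the minimum at epoch 87:

```python
# Validation losses from the table above, indexed by epoch (0-99).
val_loss = [
    5.8526, 5.3145, 5.0804, 4.9629, 4.8258, 4.7266, 4.6181, 4.5446, 4.4844, 4.4172,
    4.3850, 4.3224, 4.2932, 4.2517, 4.2192, 4.2057, 4.2012, 4.1809, 4.1407, 4.1458,
    4.1167, 4.1147, 4.0881, 4.0896, 4.0832, 4.0328, 4.0340, 4.0176, 4.0131, 4.0032,
    3.9896, 3.9902, 3.9742, 3.9699, 3.9409, 3.9411, 3.9416, 3.9270, 3.9225, 3.9159,
    3.9071, 3.9003, 3.8723, 3.8921, 3.8674, 3.8624, 3.8694, 3.8701, 3.8516, 3.8435,
    3.8545, 3.8304, 3.8247, 3.8396, 3.8265, 3.8225, 3.8248, 3.7864, 3.8151, 3.8117,
    3.8099, 3.8110, 3.8165, 3.7818, 3.8165, 3.8282, 3.8151, 3.7842, 3.7777, 3.8011,
    3.7737, 3.7744, 3.7683, 3.7721, 3.7927, 3.7670, 3.7802, 3.7885, 3.7675, 3.7837,
    3.7607, 3.7805, 3.7578, 3.7661, 3.7964, 3.7806, 3.7762, 3.7526, 3.7809, 3.7670,
    3.7671, 3.7586, 3.7843, 3.7573, 3.7654, 3.7811, 3.7974, 3.7810, 3.7647, 3.7774,
]

# Epoch with the lowest validation loss.
best_epoch = min(range(len(val_loss)), key=val_loss.__getitem__)
print(best_epoch, val_loss[best_epoch])  # → 87 3.7526
```

Validation loss plateaus around epoch 57 and then fluctuates while training loss keeps falling, which suggests the model begins to overfit; selecting the checkpoint from around epoch 87 (or using early stopping) may be preferable to taking the final epoch.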


### Framework versions

- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3