
bedus-creation/t5-small-dataset-i-lim-to-eng

This model is a fine-tuned version of google-t5/t5-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Train Loss: 0.1159
  • Validation Loss: 0.1001
  • Epoch: 58
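
Since the card does not yet include a usage example, the following is a minimal sketch of how a checkpoint like this can be loaded for inference with the TensorFlow classes in Transformers (matching the training framework listed below). The example input string, the generation settings, and the absence of any task prefix are illustrative assumptions, not part of the original card.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/t5-small-dataset-i-lim-to-eng"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# The input format expected by this fine-tune (e.g. any task prefix) is not
# documented on the card, so plain source text is assumed here.
inputs = tokenizer("your source-language text here", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```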

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
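
For reference, the optimizer configuration above maps onto the AdamWeightDecay class shipped with Transformers roughly as follows. This is a reconstruction from the listed values, not the original training script.

```python
from transformers import AdamWeightDecay

# Reconstructed from the hyperparameters listed above; the original training
# script is not available, so treat this as an approximation.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# A Keras fine-tuning setup would then use it via model.compile(optimizer=optimizer).
```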

Training results

| Train Loss | Validation Loss | Epoch |
|------------|-----------------|-------|
| 1.0149     | 0.2846          | 0     |
| 0.3652     | 0.2595          | 1     |
| 0.3009     | 0.2475          | 2     |
| 0.2726     | 0.2319          | 3     |
| 0.2486     | 0.2168          | 4     |
| 0.2459     | 0.2118          | 5     |
| 0.2301     | 0.2228          | 6     |
| 0.2342     | 0.2148          | 7     |
| 0.2199     | 0.1870          | 8     |
| 0.2221     | 0.1810          | 9     |
| 0.2199     | 0.1815          | 10    |
| 0.2066     | 0.1737          | 11    |
| 0.1996     | 0.1665          | 12    |
| 0.1935     | 0.1735          | 13    |
| 0.1992     | 0.1622          | 14    |
| 0.1841     | 0.1583          | 15    |
| 0.1909     | 0.1560          | 16    |
| 0.1784     | 0.1554          | 17    |
| 0.1816     | 0.1493          | 18    |
| 0.1765     | 0.1477          | 19    |
| 0.1805     | 0.1543          | 20    |
| 0.1763     | 0.1474          | 21    |
| 0.1730     | 0.1422          | 22    |
| 0.1651     | 0.1445          | 23    |
| 0.1707     | 0.1403          | 24    |
| 0.1631     | 0.1401          | 25    |
| 0.1653     | 0.1420          | 26    |
| 0.1640     | 0.1383          | 27    |
| 0.1571     | 0.1295          | 28    |
| 0.1632     | 0.1332          | 29    |
| 0.1586     | 0.1303          | 30    |
| 0.1534     | 0.1251          | 31    |
| 0.1572     | 0.1213          | 32    |
| 0.1561     | 0.1207          | 33    |
| 0.1514     | 0.1186          | 34    |
| 0.1522     | 0.1194          | 35    |
| 0.1455     | 0.1159          | 36    |
| 0.1466     | 0.1160          | 37    |
| 0.1394     | 0.1148          | 38    |
| 0.1421     | 0.1136          | 39    |
| 0.1492     | 0.1144          | 40    |
| 0.1444     | 0.1113          | 41    |
| 0.1393     | 0.1125          | 42    |
| 0.1304     | 0.1119          | 43    |
| 0.1394     | 0.1068          | 44    |
| 0.1312     | 0.1064          | 45    |
| 0.1367     | 0.1043          | 46    |
| 0.1278     | 0.1028          | 47    |
| 0.1308     | 0.1033          | 48    |
| 0.1266     | 0.1008          | 49    |
| 0.1127     | 0.1035          | 50    |
| 0.1263     | 0.0956          | 51    |
| 0.1281     | 0.0933          | 52    |
| 0.1209     | 0.0966          | 53    |
| 0.1139     | 0.0937          | 54    |
| 0.1174     | 0.0942          | 55    |
| 0.1137     | 0.0933          | 56    |
| 0.1183     | 0.0960          | 57    |
| 0.1159     | 0.1001          | 58    |

Framework versions

  • Transformers 4.33.2
  • TensorFlow 2.13.0
  • Datasets 2.14.5
  • Tokenizers 0.13.3
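
To check that a local environment matches the one this card was produced with, a quick version check along these lines can be used; newer releases will generally still load the checkpoint but were not used for this run.

```python
import transformers, tensorflow, datasets, tokenizers

# Versions listed on this card (see above).
print("transformers", transformers.__version__)  # 4.33.2
print("tensorflow  ", tensorflow.__version__)    # 2.13.0
print("datasets    ", datasets.__version__)      # 2.14.5
print("tokenizers  ", tokenizers.__version__)    # 0.13.3
```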