# pijarcandra22/t5Jawa2Indo
This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.9572
- Validation Loss: 1.1659
- Epoch: 299
## Model description
More information needed
## Intended uses & limitations
More information needed
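Pending details from the author, the checkpoint can be loaded with the standard Transformers TF classes. A minimal sketch, assuming plain untranslated Javanese text as input (the card does not document a task prefix, so none is added here); the example sentence is purely illustrative:

```python
MODEL_ID = "pijarcandra22/t5Jawa2Indo"

def translate(sentence: str) -> str:
    """Load the checkpoint and translate one Javanese sentence to Indonesian.

    Requires `transformers` with the TensorFlow extras; the weights are
    downloaded from the Hugging Face Hub on first call.
    """
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(sentence, return_tensors="tf")
    output_ids = model.generate(inputs.input_ids, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Usage would then be e.g. `translate("Sugeng enjing")` for a greeting; generation settings such as beam size are left at the library defaults here.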
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate: 2e-05, decay: 0.0, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, weight_decay_rate: 0.01)
- training_precision: float32
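The logged configuration can be reconstructed with the `AdamWeightDecay` optimizer that ships with Transformers' TensorFlow extras. A sketch under that assumption (learning-rate schedule and gradient clipping are not recorded in the card, so none are applied):

```python
# Hyperparameters restated from the card above.
OPTIMIZER_CONFIG = {
    "learning_rate": 2e-05,
    "beta_1": 0.9,
    "beta_2": 0.999,
    "epsilon": 1e-07,
    "amsgrad": False,
    "weight_decay_rate": 0.01,
}

def build_optimizer():
    """Recreate the training optimizer from the logged config.

    Requires `transformers` with TensorFlow installed; AdamWeightDecay is
    Adam with decoupled weight decay applied to non-bias/LayerNorm weights.
    """
    from transformers import AdamWeightDecay

    return AdamWeightDecay(**OPTIMIZER_CONFIG)
```

The resulting optimizer would be passed to `model.compile(optimizer=...)` in the usual Keras training loop; training precision was float32 per the card.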
### Training results
Train Loss | Validation Loss | Epoch |
---|---|---|
3.8958 | 3.3598 | 0 |
3.4684 | 3.0863 | 1 |
3.2505 | 2.9092 | 2 |
3.0952 | 2.7813 | 3 |
2.9749 | 2.6834 | 4 |
2.8813 | 2.6016 | 5 |
2.8008 | 2.5321 | 6 |
2.7323 | 2.4726 | 7 |
2.6741 | 2.4187 | 8 |
2.6219 | 2.3724 | 9 |
2.5735 | 2.3279 | 10 |
2.5324 | 2.2918 | 11 |
2.4934 | 2.2575 | 12 |
2.4570 | 2.2271 | 13 |
2.4214 | 2.1950 | 14 |
2.3906 | 2.1661 | 15 |
2.3628 | 2.1396 | 16 |
2.3341 | 2.1168 | 17 |
2.3097 | 2.0924 | 18 |
2.2824 | 2.0717 | 19 |
2.2592 | 2.0504 | 20 |
2.2377 | 2.0338 | 21 |
2.2139 | 2.0142 | 22 |
2.1953 | 1.9946 | 23 |
2.1751 | 1.9793 | 24 |
2.1572 | 1.9625 | 25 |
2.1375 | 1.9471 | 26 |
2.1208 | 1.9300 | 27 |
2.1063 | 1.9190 | 28 |
2.0866 | 1.9050 | 29 |
2.0748 | 1.8916 | 30 |
2.0568 | 1.8809 | 31 |
2.0418 | 1.8682 | 32 |
2.0274 | 1.8551 | 33 |
2.0139 | 1.8468 | 34 |
2.0026 | 1.8347 | 35 |
1.9880 | 1.8248 | 36 |
1.9746 | 1.8128 | 37 |
1.9608 | 1.8056 | 38 |
1.9524 | 1.7968 | 39 |
1.9414 | 1.7840 | 40 |
1.9269 | 1.7764 | 41 |
1.9160 | 1.7662 | 42 |
1.9041 | 1.7602 | 43 |
1.8962 | 1.7503 | 44 |
1.8826 | 1.7414 | 45 |
1.8737 | 1.7359 | 46 |
1.8635 | 1.7273 | 47 |
1.8544 | 1.7207 | 48 |
1.8476 | 1.7135 | 49 |
1.8355 | 1.7051 | 50 |
1.8272 | 1.6969 | 51 |
1.8178 | 1.6906 | 52 |
1.8079 | 1.6862 | 53 |
1.7998 | 1.6786 | 54 |
1.7939 | 1.6712 | 55 |
1.7826 | 1.6628 | 56 |
1.7752 | 1.6567 | 57 |
1.7675 | 1.6518 | 58 |
1.7606 | 1.6464 | 59 |
1.7510 | 1.6408 | 60 |
1.7456 | 1.6329 | 61 |
1.7390 | 1.6284 | 62 |
1.7289 | 1.6233 | 63 |
1.7183 | 1.6176 | 64 |
1.7127 | 1.6125 | 65 |
1.7087 | 1.6098 | 66 |
1.6990 | 1.5985 | 67 |
1.6945 | 1.5934 | 68 |
1.6872 | 1.5876 | 69 |
1.6795 | 1.5816 | 70 |
1.6758 | 1.5778 | 71 |
1.6659 | 1.5742 | 72 |
1.6603 | 1.5702 | 73 |
1.6516 | 1.5618 | 74 |
1.6463 | 1.5592 | 75 |
1.6400 | 1.5541 | 76 |
1.6354 | 1.5484 | 77 |
1.6305 | 1.5424 | 78 |
1.6217 | 1.5378 | 79 |
1.6169 | 1.5338 | 80 |
1.6102 | 1.5301 | 81 |
1.6070 | 1.5229 | 82 |
1.5979 | 1.5195 | 83 |
1.5926 | 1.5163 | 84 |
1.5875 | 1.5106 | 85 |
1.5814 | 1.5075 | 86 |
1.5748 | 1.5021 | 87 |
1.5672 | 1.4984 | 88 |
1.5657 | 1.4945 | 89 |
1.5597 | 1.4913 | 90 |
1.5530 | 1.4863 | 91 |
1.5506 | 1.4821 | 92 |
1.5437 | 1.4785 | 93 |
1.5405 | 1.4730 | 94 |
1.5325 | 1.4678 | 95 |
1.5285 | 1.4666 | 96 |
1.5233 | 1.4634 | 97 |
1.5189 | 1.4580 | 98 |
1.5122 | 1.4558 | 99 |
1.5078 | 1.4517 | 100 |
1.5059 | 1.4471 | 101 |
1.4956 | 1.4446 | 102 |
1.4944 | 1.4396 | 103 |
1.4881 | 1.4371 | 104 |
1.4851 | 1.4334 | 105 |
1.4763 | 1.4295 | 106 |
1.4725 | 1.4273 | 107 |
1.4686 | 1.4243 | 108 |
1.4663 | 1.4196 | 109 |
1.4588 | 1.4180 | 110 |
1.4558 | 1.4152 | 111 |
1.4525 | 1.4127 | 112 |
1.4465 | 1.4085 | 113 |
1.4431 | 1.4052 | 114 |
1.4386 | 1.4025 | 115 |
1.4343 | 1.4000 | 116 |
1.4306 | 1.3969 | 117 |
1.4259 | 1.3925 | 118 |
1.4192 | 1.3919 | 119 |
1.4165 | 1.3886 | 120 |
1.4109 | 1.3857 | 121 |
1.4093 | 1.3844 | 122 |
1.4058 | 1.3797 | 123 |
1.4003 | 1.3779 | 124 |
1.3992 | 1.3733 | 125 |
1.3898 | 1.3721 | 126 |
1.3877 | 1.3692 | 127 |
1.3845 | 1.3681 | 128 |
1.3821 | 1.3665 | 129 |
1.3767 | 1.3652 | 130 |
1.3720 | 1.3600 | 131 |
1.3707 | 1.3572 | 132 |
1.3674 | 1.3546 | 133 |
1.3628 | 1.3550 | 134 |
1.3582 | 1.3510 | 135 |
1.3548 | 1.3484 | 136 |
1.3518 | 1.3481 | 137 |
1.3490 | 1.3467 | 138 |
1.3463 | 1.3423 | 139 |
1.3411 | 1.3401 | 140 |
1.3367 | 1.3387 | 141 |
1.3332 | 1.3371 | 142 |
1.3313 | 1.3341 | 143 |
1.3285 | 1.3304 | 144 |
1.3235 | 1.3302 | 145 |
1.3203 | 1.3292 | 146 |
1.3186 | 1.3259 | 147 |
1.3132 | 1.3230 | 148 |
1.3106 | 1.3233 | 149 |
1.3083 | 1.3169 | 150 |
1.3011 | 1.3179 | 151 |
1.2986 | 1.3151 | 152 |
1.2975 | 1.3150 | 153 |
1.2905 | 1.3124 | 154 |
1.2887 | 1.3096 | 155 |
1.2862 | 1.3105 | 156 |
1.2831 | 1.3064 | 157 |
1.2796 | 1.3051 | 158 |
1.2777 | 1.3024 | 159 |
1.2758 | 1.2993 | 160 |
1.2694 | 1.2997 | 161 |
1.2681 | 1.2974 | 162 |
1.2626 | 1.2935 | 163 |
1.2617 | 1.2946 | 164 |
1.2592 | 1.2928 | 165 |
1.2562 | 1.2899 | 166 |
1.2520 | 1.2890 | 167 |
1.2488 | 1.2876 | 168 |
1.2468 | 1.2848 | 169 |
1.2450 | 1.2840 | 170 |
1.2388 | 1.2861 | 171 |
1.2384 | 1.2815 | 172 |
1.2331 | 1.2808 | 173 |
1.2328 | 1.2774 | 174 |
1.2299 | 1.2770 | 175 |
1.2253 | 1.2752 | 176 |
1.2251 | 1.2740 | 177 |
1.2188 | 1.2722 | 178 |
1.2167 | 1.2706 | 179 |
1.2141 | 1.2679 | 180 |
1.2125 | 1.2671 | 181 |
1.2080 | 1.2674 | 182 |
1.2049 | 1.2665 | 183 |
1.2021 | 1.2635 | 184 |
1.2013 | 1.2629 | 185 |
1.1975 | 1.2599 | 186 |
1.1946 | 1.2593 | 187 |
1.1939 | 1.2599 | 188 |
1.1897 | 1.2560 | 189 |
1.1879 | 1.2569 | 190 |
1.1841 | 1.2539 | 191 |
1.1829 | 1.2540 | 192 |
1.1804 | 1.2538 | 193 |
1.1759 | 1.2513 | 194 |
1.1745 | 1.2480 | 195 |
1.1690 | 1.2483 | 196 |
1.1686 | 1.2458 | 197 |
1.1647 | 1.2450 | 198 |
1.1628 | 1.2457 | 199 |
1.1624 | 1.2461 | 200 |
1.1584 | 1.2429 | 201 |
1.1563 | 1.2417 | 202 |
1.1543 | 1.2407 | 203 |
1.1489 | 1.2391 | 204 |
1.1464 | 1.2422 | 205 |
1.1482 | 1.2384 | 206 |
1.1446 | 1.2355 | 207 |
1.1425 | 1.2351 | 208 |
1.1373 | 1.2343 | 209 |
1.1378 | 1.2327 | 210 |
1.1362 | 1.2311 | 211 |
1.1331 | 1.2304 | 212 |
1.1315 | 1.2279 | 213 |
1.1265 | 1.2290 | 214 |
1.1254 | 1.2284 | 215 |
1.1220 | 1.2276 | 216 |
1.1208 | 1.2230 | 217 |
1.1218 | 1.2220 | 218 |
1.1140 | 1.2222 | 219 |
1.1115 | 1.2205 | 220 |
1.1120 | 1.2223 | 221 |
1.1081 | 1.2213 | 222 |
1.1059 | 1.2190 | 223 |
1.1025 | 1.2186 | 224 |
1.1031 | 1.2182 | 225 |
1.0996 | 1.2155 | 226 |
1.0972 | 1.2144 | 227 |
1.0953 | 1.2136 | 228 |
1.0929 | 1.2126 | 229 |
1.0893 | 1.2153 | 230 |
1.0868 | 1.2147 | 231 |
1.0877 | 1.2114 | 232 |
1.0834 | 1.2118 | 233 |
1.0815 | 1.2103 | 234 |
1.0802 | 1.2096 | 235 |
1.0771 | 1.2110 | 236 |
1.0740 | 1.2087 | 237 |
1.0735 | 1.2058 | 238 |
1.0731 | 1.2077 | 239 |
1.0693 | 1.2051 | 240 |
1.0667 | 1.2055 | 241 |
1.0662 | 1.2034 | 242 |
1.0659 | 1.2028 | 243 |
1.0619 | 1.2009 | 244 |
1.0601 | 1.2020 | 245 |
1.0578 | 1.1984 | 246 |
1.0541 | 1.2002 | 247 |
1.0524 | 1.1992 | 248 |
1.0474 | 1.1996 | 249 |
1.0493 | 1.1975 | 250 |
1.0466 | 1.1986 | 251 |
1.0454 | 1.1955 | 252 |
1.0448 | 1.1940 | 253 |
1.0388 | 1.1944 | 254 |
1.0373 | 1.1930 | 255 |
1.0345 | 1.1956 | 256 |
1.0330 | 1.1915 | 257 |
1.0329 | 1.1902 | 258 |
1.0310 | 1.1923 | 259 |
1.0277 | 1.1905 | 260 |
1.0282 | 1.1890 | 261 |
1.0229 | 1.1895 | 262 |
1.0225 | 1.1888 | 263 |
1.0227 | 1.1877 | 264 |
1.0207 | 1.1845 | 265 |
1.0165 | 1.1870 | 266 |
1.0143 | 1.1850 | 267 |
1.0133 | 1.1838 | 268 |
1.0107 | 1.1851 | 269 |
1.0097 | 1.1852 | 270 |
1.0082 | 1.1829 | 271 |
1.0050 | 1.1824 | 272 |
1.0032 | 1.1834 | 273 |
1.0017 | 1.1806 | 274 |
1.0017 | 1.1805 | 275 |
0.9989 | 1.1814 | 276 |
0.9985 | 1.1779 | 277 |
0.9947 | 1.1782 | 278 |
0.9940 | 1.1776 | 279 |
0.9921 | 1.1779 | 280 |
0.9909 | 1.1788 | 281 |
0.9876 | 1.1764 | 282 |
0.9867 | 1.1763 | 283 |
0.9832 | 1.1762 | 284 |
0.9795 | 1.1743 | 285 |
0.9791 | 1.1762 | 286 |
0.9772 | 1.1724 | 287 |
0.9770 | 1.1729 | 288 |
0.9754 | 1.1757 | 289 |
0.9730 | 1.1711 | 290 |
0.9707 | 1.1734 | 291 |
0.9700 | 1.1732 | 292 |
0.9683 | 1.1699 | 293 |
0.9653 | 1.1705 | 294 |
0.9660 | 1.1706 | 295 |
0.9626 | 1.1679 | 296 |
0.9625 | 1.1666 | 297 |
0.9592 | 1.1693 | 298 |
0.9572 | 1.1659 | 299 |
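Losses like these are easier to interpret as perplexities via `exp(loss)`. A quick sanity check on the final epoch, assuming the logged values are mean per-token cross-entropy in nats (the usual Keras/T5 setup; the card does not state this explicitly):

```python
import math

# Final-epoch losses from the table above.
final_train_loss = 0.9572
final_val_loss = 1.1659

# Perplexity is the exponential of per-token cross-entropy.
train_ppl = math.exp(final_train_loss)
val_ppl = math.exp(final_val_loss)

print(f"train perplexity ≈ {train_ppl:.2f}")       # ≈ 2.60
print(f"validation perplexity ≈ {val_ppl:.2f}")    # ≈ 3.21
```

The steady gap between train and validation loss over 300 epochs suggests mild but not runaway overfitting, since validation loss was still decreasing at epoch 299.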
## Framework versions
- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0
Base model: google-t5/t5-small