Training complete
README.md CHANGED

@@ -19,9 +19,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.
-- Bleu:
-- Gen Len: 6.
+- Loss: 1.5734
+- Bleu: 71.4633
+- Gen Len: 6.984
 
 ## Model description
 
@@ -45,18 +45,45 @@ The following hyperparameters were used during training:
 - eval_batch_size: 32
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
-- lr_scheduler_type:
-- lr_scheduler_warmup_steps:
-- num_epochs:
+- lr_scheduler_type: cosine
+- lr_scheduler_warmup_steps: 3000
+- num_epochs: 30
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step
-| 3.
-| 1.
+| Training Loss | Epoch | Step  | Validation Loss | Bleu    | Gen Len |
+|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|
+| 3.4817        | 1.0   | 749   | 2.2949          | 34.5266 | 7.5496  |
+| 2.0957        | 2.0   | 1498  | 1.6167          | 53.0797 | 6.8785  |
+| 1.562         | 3.0   | 2247  | 1.3650          | 60.3083 | 7.0095  |
+| 1.2877        | 4.0   | 2996  | 1.2149          | 63.8437 | 6.9444  |
+| 1.1026        | 5.0   | 3745  | 1.1452          | 66.7883 | 7.13    |
+| 0.9531        | 6.0   | 4494  | 1.1028          | 67.3774 | 6.911   |
+| 0.8402        | 7.0   | 5243  | 1.0995          | 68.0354 | 6.8114  |
+| 0.7513        | 8.0   | 5992  | 1.0878          | 69.4876 | 7.0216  |
+| 0.6746        | 9.0   | 6741  | 1.1109          | 69.7327 | 7.1134  |
+| 0.6073        | 10.0  | 7490  | 1.1167          | 70.1607 | 7.0526  |
+| 0.5531        | 11.0  | 8239  | 1.1468          | 69.8006 | 6.8101  |
+| 0.4981        | 12.0  | 8988  | 1.1856          | 70.5423 | 6.8789  |
+| 0.4544        | 13.0  | 9737  | 1.2019          | 70.5876 | 6.9313  |
+| 0.4095        | 14.0  | 10486 | 1.2347          | 70.7996 | 6.8371  |
+| 0.373         | 15.0  | 11235 | 1.2734          | 71.0903 | 7.0274  |
+| 0.3408        | 16.0  | 11984 | 1.2974          | 71.104  | 7.0025  |
+| 0.3096        | 17.0  | 12733 | 1.3313          | 70.7308 | 6.925   |
+| 0.2856        | 18.0  | 13482 | 1.3820          | 70.9862 | 6.9656  |
+| 0.2601        | 19.0  | 14231 | 1.4016          | 71.1836 | 7.0082  |
+| 0.2404        | 20.0  | 14980 | 1.4483          | 71.0219 | 6.9268  |
+| 0.2241        | 21.0  | 15729 | 1.4714          | 71.2721 | 6.9552  |
+| 0.2065        | 22.0  | 16478 | 1.4814          | 71.3874 | 6.9968  |
+| 0.1942        | 23.0  | 17227 | 1.5090          | 71.4722 | 6.9404  |
+| 0.1831        | 24.0  | 17976 | 1.5265          | 71.4556 | 6.9771  |
+| 0.173         | 25.0  | 18725 | 1.5379          | 71.4026 | 6.9998  |
+| 0.1662        | 26.0  | 19474 | 1.5530          | 71.4843 | 6.9932  |
+| 0.159         | 27.0  | 20223 | 1.5668          | 71.3663 | 6.9784  |
+| 0.1568        | 28.0  | 20972 | 1.5742          | 71.3261 | 6.9734  |
+| 0.1566        | 29.0  | 21721 | 1.5739          | 71.4435 | 6.9843  |
+| 0.156         | 30.0  | 22470 | 1.5734          | 71.4633 | 6.984   |
 
 
 ### Framework versions
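The hyperparameters above specify `lr_scheduler_type: cosine` with `lr_scheduler_warmup_steps: 3000`, and the results table shows 749 optimizer steps per epoch for 30 epochs (22470 steps total). Under those numbers, such a schedule can be sketched as below; the base learning rate is a placeholder assumption, since the `learning_rate` line is not part of this hunk, and the actual Trainer schedule is produced by `transformers.get_cosine_schedule_with_warmup` rather than this function.

```python
import math

def lr_at_step(step, base_lr=2e-5, warmup_steps=3000, total_steps=22470):
    """Linear warmup followed by cosine decay to zero.

    base_lr=2e-5 is an assumed placeholder; warmup_steps and total_steps
    (749 steps/epoch x 30 epochs) come from the model card above.
    """
    if step < warmup_steps:
        # Warmup: ramp linearly from 0 to base_lr over the first 3000 steps.
        return base_lr * step / max(1, warmup_steps)
    # Cosine decay: half cosine from base_lr down to 0 at total_steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The rate peaks at the base value exactly when warmup ends (step 3000) and reaches zero at the final step of epoch 30.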
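One thing the table makes visible: validation loss bottoms out at epoch 8 (1.0878) and rises afterwards even as training loss keeps falling, while BLEU plateaus around 71 — the uploaded weights are simply the final epoch. A quick check of the lowest-validation-loss epoch, using the values copied from the table above:

```python
# Validation loss per epoch (epochs 1-30), copied from the results table.
val_loss = [
    2.2949, 1.6167, 1.3650, 1.2149, 1.1452, 1.1028, 1.0995, 1.0878,
    1.1109, 1.1167, 1.1468, 1.1856, 1.2019, 1.2347, 1.2734, 1.2974,
    1.3313, 1.3820, 1.4016, 1.4483, 1.4714, 1.4814, 1.5090, 1.5265,
    1.5379, 1.5530, 1.5668, 1.5742, 1.5739, 1.5734,
]

# Epochs are 1-indexed; list indices are 0-indexed.
best_epoch = min(range(len(val_loss)), key=val_loss.__getitem__) + 1
best_loss = val_loss[best_epoch - 1]
```

With `load_best_model_at_end` and an eval-loss criterion, the Trainer would have kept the epoch-8 checkpoint instead; whether that trade against the later, slightly higher BLEU is worthwhile depends on the use case.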
runs/Aug30_03-54-49_30190107fae3/events.out.tfevents.1724990090.30190107fae3.3467.1 CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:08e3449c914bbca6181c72363ad7dddf990e7e87c0c3154e04922d9a5831f3e4
+size 23848
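The changed `events.out.tfevents` entry is not the TensorBoard log itself but a Git LFS pointer file: the repository stores only the `version` / `oid` / `size` lines shown above, and the 23848-byte blob lives in LFS storage. A minimal parser for that pointer format:

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file (version, oid, size key-value lines)."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],   # pointer-spec URL
        "oid_algo": algo,               # hash algorithm, e.g. "sha256"
        "oid": digest,                  # hex digest of the stored blob
        "size": int(fields["size"]),    # blob size in bytes
    }

# The pointer contents from the diff above.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:08e3449c914bbca6181c72363ad7dddf990e7e87c0c3154e04922d9a5831f3e4\n"
    "size 23848\n"
)
info = parse_lfs_pointer(pointer)
```

The diff above is typical of LFS-tracked files: only the `oid` digest and `size` change between revisions, never the blob contents inline.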