saileshaman committed on
Commit
15ce174
1 Parent(s): 3a31ac5

End of training

README.md ADDED
@@ -0,0 +1,94 @@
+ ---
+ license: apache-2.0
+ base_model: t5-small
+ tags:
+ - generated_from_trainer
+ metrics:
+ - rouge
+ model-index:
+ - name: t5-small-finetuned-dialogsum-v3
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # t5-small-finetuned-dialogsum-v3
+
+ This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.2045
+ - Rouge1: 38.3615
+ - Rouge2: 16.0241
+ - Rougel: 32.901
+ - Rougelsum: 34.8687
+ - Gen Len: 18.892
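The ROUGE scores above measure n-gram overlap between generated and reference summaries. As an illustration only (the actual metric pipeline uses the `rouge_score` package, which adds stemming and tokenization this sketch omits), ROUGE-1 F1 is unigram-overlap precision and recall combined:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Plain unigram-overlap ROUGE-1 F1; no stemming, whitespace tokenization."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge1_f1("the cat sat on the mat", "the cat is on the mat")` gives 5/6, since five of six candidate unigrams (counting duplicates) match the reference.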
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 30
+ - mixed_precision_training: Native AMP
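The `linear` scheduler decays the learning rate from its base value to zero over the full run. A minimal sketch of that schedule, assuming no warmup steps (the card does not list any) and using the run's totals (779 steps/epoch × 30 epochs = 23370 steps, per the results table):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05,
              warmup_steps: int = 0) -> float:
    """Linear warmup then linear decay to zero, in the style of
    transformers' get_linear_schedule_with_warmup (illustrative sketch)."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

TOTAL_STEPS = 23370  # 779 optimizer steps per epoch x 30 epochs
```

Halfway through training (step 11685) the learning rate would be 1e-05, and it reaches zero at the final step.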
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
+ |:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
+ | 1.7344 | 1.0 | 779 | 1.4251 | 33.4125 | 10.7502 | 28.0588 | 30.0903 | 18.858 |
+ | 1.4975 | 2.0 | 1558 | 1.3623 | 34.4069 | 11.9728 | 29.0576 | 31.156 | 18.874 |
+ | 1.4621 | 3.0 | 2337 | 1.3355 | 34.9786 | 12.314 | 29.4869 | 31.4407 | 18.86 |
+ | 1.4149 | 4.0 | 3116 | 1.3119 | 35.5881 | 12.9123 | 30.1883 | 32.0652 | 18.874 |
+ | 1.4009 | 5.0 | 3895 | 1.2905 | 36.3104 | 13.8382 | 30.893 | 32.7095 | 18.882 |
+ | 1.3709 | 6.0 | 4674 | 1.2736 | 36.3456 | 13.8426 | 30.7526 | 32.6784 | 18.906 |
+ | 1.3589 | 7.0 | 5453 | 1.2671 | 36.6543 | 14.2334 | 30.98 | 32.9241 | 18.892 |
+ | 1.3373 | 8.0 | 6232 | 1.2557 | 37.2264 | 14.7072 | 31.413 | 33.2844 | 18.914 |
+ | 1.3168 | 9.0 | 7011 | 1.2520 | 37.315 | 14.8744 | 31.6711 | 33.4863 | 18.862 |
+ | 1.3044 | 10.0 | 7790 | 1.2454 | 37.8787 | 15.4762 | 32.3244 | 34.107 | 18.886 |
+ | 1.2915 | 11.0 | 8569 | 1.2380 | 38.0242 | 15.5379 | 32.4465 | 34.292 | 18.862 |
+ | 1.2926 | 12.0 | 9348 | 1.2362 | 37.82 | 15.4074 | 32.0479 | 33.9622 | 18.882 |
+ | 1.2818 | 13.0 | 10127 | 1.2318 | 38.2168 | 16.0879 | 32.592 | 34.5757 | 18.892 |
+ | 1.2766 | 14.0 | 10906 | 1.2257 | 38.559 | 16.2997 | 32.9956 | 34.9149 | 18.864 |
+ | 1.2666 | 15.0 | 11685 | 1.2245 | 38.1764 | 15.9612 | 32.525 | 34.6476 | 18.878 |
+ | 1.2602 | 16.0 | 12464 | 1.2191 | 38.3852 | 16.085 | 32.809 | 34.7302 | 18.884 |
+ | 1.2523 | 17.0 | 13243 | 1.2164 | 38.426 | 16.1149 | 32.6806 | 34.7701 | 18.894 |
+ | 1.2466 | 18.0 | 14022 | 1.2142 | 38.6658 | 16.0599 | 32.9194 | 34.905 | 18.89 |
+ | 1.2332 | 19.0 | 14801 | 1.2152 | 38.4253 | 15.9033 | 32.7993 | 34.8635 | 18.896 |
+ | 1.2344 | 20.0 | 15580 | 1.2093 | 38.6261 | 16.0519 | 33.1192 | 34.9215 | 18.918 |
+ | 1.2278 | 21.0 | 16359 | 1.2091 | 38.6618 | 16.2012 | 33.134 | 35.0842 | 18.904 |
+ | 1.2255 | 22.0 | 17138 | 1.2077 | 38.6482 | 16.142 | 33.0472 | 35.037 | 18.906 |
+ | 1.2305 | 23.0 | 17917 | 1.2068 | 38.6584 | 16.1184 | 32.9757 | 34.9885 | 18.89 |
+ | 1.2275 | 24.0 | 18696 | 1.2069 | 38.3795 | 16.0471 | 32.9456 | 34.8267 | 18.874 |
+ | 1.2227 | 25.0 | 19475 | 1.2064 | 38.4788 | 16.1603 | 33.0022 | 34.8844 | 18.87 |
+ | 1.218 | 26.0 | 20254 | 1.2051 | 38.5133 | 16.0813 | 33.0334 | 34.9492 | 18.89 |
+ | 1.2183 | 27.0 | 21033 | 1.2046 | 38.3323 | 15.839 | 32.7421 | 34.7147 | 18.884 |
+ | 1.2195 | 28.0 | 21812 | 1.2040 | 38.3573 | 16.0328 | 32.86 | 34.8107 | 18.892 |
+ | 1.2145 | 29.0 | 22591 | 1.2045 | 38.3932 | 16.1115 | 32.9154 | 34.8664 | 18.894 |
+ | 1.212 | 30.0 | 23370 | 1.2045 | 38.3615 | 16.0241 | 32.901 | 34.8687 | 18.892 |
+
+
+ ### Framework versions
+
+ - Transformers 4.35.0
+ - Pytorch 2.1.0+cu118
+ - Datasets 2.14.6
+ - Tokenizers 0.14.1
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "decoder_start_token_id": 0,
+   "eos_token_id": 1,
+   "pad_token_id": 0,
+   "transformers_version": "4.35.0"
+ }
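The added `generation_config.json` pins the special-token IDs used at generation time; T5 conventionally reuses the pad token (id 0) as the decoder start token. A quick sketch that loads the snippet above and checks that relationship:

```python
import json

# The generation_config.json added in this commit, inlined for illustration.
raw = """{
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.35.0"
}"""

config = json.loads(raw)

# T5 starts decoding from the pad token, so the two IDs coincide.
assert config["decoder_start_token_id"] == config["pad_token_id"] == 0
assert config["eos_token_id"] == 1
```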
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:9232427ef1d21aa15fa19c3cf3cd2ff7801733fbc459c2fc1792ad56824c9ae8
+ oid sha256:2e4a0ee01a87f86e43ad1fcc5f79c17c4813c473225e05d3c35f7f473396a2dd
  size 242041896
runs/Nov11_15-25-37_11ae802949b5/events.out.tfevents.1699716343.11ae802949b5.1016.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:28b0fed32cbfd945db6dc11f00b4820f971c02197ee2fb16f974aaaa9b1121d7
- size 27807
+ oid sha256:12bf7c96d3e817c355d0fcc8d3c1a261df53b1fba201cbf226afed009a042a29
+ size 28702
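The two diffs above change git-LFS pointer files, not the binary blobs themselves: each pointer is a tiny text file holding the spec version, the SHA-256 object ID, and the byte size of the real artifact. A minimal sketch of parsing that format, using the updated `model.safetensors` pointer from this commit:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a git-LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:2e4a0ee01a87f86e43ad1fcc5f79c17c4813c473225e05d3c35f7f473396a2dd
size 242041896"""

info = parse_lfs_pointer(pointer)
```

Note that the `model.safetensors` size is unchanged (242041896 bytes) even though its `oid` differs: the fine-tuned weights have the same shape as before, but different values.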