machinelearningzuu committed
Commit f9c9ac5
1 Parent(s): 63683e0

update model card README.md

Files changed (1)
  1. README.md +120 -0
README.md ADDED
@@ -0,0 +1,120 @@
---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_trainer
model-index:
- name: lesson-summarization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lesson-summarization

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5713

## Model description

More information needed

## Intended uses & limitations

More information needed

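Pending a fuller description, the checkpoint can presumably be used for summarization through the 🤗 Transformers `pipeline` API. The snippet below is a minimal sketch: the repository id `machinelearningzuu/lesson-summarization` is inferred from this card's name, the example text is invented, and the `summarize:` prefix follows the usual T5 convention rather than anything documented here.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for summarization.
# NOTE: the repo id is an assumption based on the model card name; adjust as needed.
summarizer = pipeline("summarization", model="machinelearningzuu/lesson-summarization")

lesson_text = (
    "Photosynthesis is the process by which green plants convert sunlight, water, "
    "and carbon dioxide into glucose and oxygen. It takes place in the chloroplasts "
    "and is the primary source of energy for most life on Earth."
)

# t5-small is usually trained with a task prefix such as "summarize: ";
# whether this fine-tune expects it is not documented on the card.
summary = summarizer("summarize: " + lesson_text, max_length=64, min_length=10)
print(summary[0]["summary_text"])
```
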
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200

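For reference, a minimal `Seq2SeqTrainingArguments` sketch that mirrors these settings (the `output_dir` and the steps-based evaluation schedule are assumptions; the latter is inferred from the 200-step evaluation interval visible in the results table below):

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
# output_dir, evaluation_strategy and eval_steps are assumptions,
# not values taken from the original training script.
training_args = Seq2SeqTrainingArguments(
    output_dir="lesson-summarization",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="steps",  # evaluation appears to run every 200 steps
    eval_steps=200,
)
```
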
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 2.9037 | 3.12 | 200 | 2.2456 |
| 2.5914 | 6.25 | 400 | 2.1498 |
| 2.393 | 9.38 | 600 | 2.1002 |
| 2.2409 | 12.5 | 800 | 2.0754 |
| 2.1515 | 15.62 | 1000 | 2.0683 |
| 2.0633 | 18.75 | 1200 | 2.0541 |
| 1.9418 | 21.88 | 1400 | 2.0603 |
| 1.837 | 25.0 | 1600 | 2.0788 |
| 1.7715 | 28.12 | 1800 | 2.0754 |
| 1.6957 | 31.25 | 2000 | 2.0815 |
| 1.6079 | 34.38 | 2200 | 2.0940 |
| 1.5947 | 37.5 | 2400 | 2.1094 |
| 1.4603 | 40.62 | 2600 | 2.1147 |
| 1.4621 | 43.75 | 2800 | 2.1354 |
| 1.4021 | 46.88 | 3000 | 2.1519 |
| 1.3394 | 50.0 | 3200 | 2.1670 |
| 1.2866 | 53.12 | 3400 | 2.1921 |
| 1.2681 | 56.25 | 3600 | 2.2045 |
| 1.1866 | 59.38 | 3800 | 2.2194 |
| 1.2098 | 62.5 | 4000 | 2.2302 |
| 1.1386 | 65.62 | 4200 | 2.2400 |
| 1.0853 | 68.75 | 4400 | 2.2634 |
| 1.0888 | 71.88 | 4600 | 2.2810 |
| 1.0408 | 75.0 | 4800 | 2.2909 |
| 1.0309 | 78.12 | 5000 | 2.3059 |
| 0.9523 | 81.25 | 5200 | 2.3249 |
| 0.9671 | 84.38 | 5400 | 2.3333 |
| 0.9413 | 87.5 | 5600 | 2.3543 |
| 0.9127 | 90.62 | 5800 | 2.3636 |
| 0.9095 | 93.75 | 6000 | 2.3676 |
| 0.8952 | 96.88 | 6200 | 2.3756 |
| 0.857 | 100.0 | 6400 | 2.3878 |
| 0.8474 | 103.12 | 6600 | 2.4148 |
| 0.8215 | 106.25 | 6800 | 2.4231 |
| 0.8172 | 109.38 | 7000 | 2.4243 |
| 0.7761 | 112.5 | 7200 | 2.4489 |
| 0.7737 | 115.62 | 7400 | 2.4718 |
| 0.7476 | 118.75 | 7600 | 2.4614 |
| 0.7345 | 121.88 | 7800 | 2.4705 |
| 0.7426 | 125.0 | 8000 | 2.4740 |
| 0.7151 | 128.12 | 8200 | 2.4833 |
| 0.7191 | 131.25 | 8400 | 2.4786 |
| 0.6818 | 134.38 | 8600 | 2.4882 |
| 0.6862 | 137.5 | 8800 | 2.4938 |
| 0.6929 | 140.62 | 9000 | 2.4977 |
| 0.6494 | 143.75 | 9200 | 2.5195 |
| 0.6689 | 146.88 | 9400 | 2.5185 |
| 0.6492 | 150.0 | 9600 | 2.5259 |
| 0.6384 | 153.12 | 9800 | 2.5259 |
| 0.6435 | 156.25 | 10000 | 2.5287 |
| 0.6251 | 159.38 | 10200 | 2.5284 |
| 0.6295 | 162.5 | 10400 | 2.5398 |
| 0.6324 | 165.62 | 10600 | 2.5442 |
| 0.6252 | 168.75 | 10800 | 2.5481 |
| 0.6108 | 171.88 | 11000 | 2.5455 |
| 0.6034 | 175.0 | 11200 | 2.5502 |
| 0.5969 | 178.12 | 11400 | 2.5601 |
| 0.5949 | 181.25 | 11600 | 2.5617 |
| 0.6183 | 184.38 | 11800 | 2.5679 |
| 0.5805 | 187.5 | 12000 | 2.5687 |
| 0.6032 | 190.62 | 12200 | 2.5708 |
| 0.5955 | 193.75 | 12400 | 2.5709 |
| 0.5961 | 196.88 | 12600 | 2.5713 |
| 0.5914 | 200.0 | 12800 | 2.5713 |

### Framework versions

- Transformers 4.31.0
- PyTorch 1.13.1
- Datasets 2.12.0
- Tokenizers 0.13.3
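
To check for a matching environment, a small version-check sketch (the import names are the standard PyPI distributions; the assertions are illustrative, not part of the original card):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported on this card; loosen or drop these checks as needed.
assert transformers.__version__.startswith("4.31")
assert torch.__version__.startswith("1.13")
assert datasets.__version__.startswith("2.12")
assert tokenizers.__version__.startswith("0.13")
```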