xsestech committed
Commit e55bc82
Parent: 5e0a610

Model save

Files changed (2):
  1. README.md (+53 -25)
  2. generation_config.json (+1 -1)

README.md CHANGED
@@ -1,9 +1,11 @@
  ---
- base_model: ai-forever/ruT5-base
+ license: apache-2.0
+ base_model: google/mt5-base
  tags:
  - generated_from_trainer
  metrics:
  - rouge
+ - bleu
  model-index:
  - name: skilltext
    results: []
@@ -14,14 +16,15 @@ should probably proofread and complete it, then remove this comment. -->

  # skilltext

- This model is a fine-tuned version of [ai-forever/ruT5-base](https://huggingface.co/ai-forever/ruT5-base) on the None dataset.
+ This model is a fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.0577
- - Rouge1: 30.9205
- - Rouge2: 11.9258
- - Rougel: 26.6497
- - Rougelsum: 26.4407
- - Gen Len: 18.6875
+ - Loss: nan
+ - Rouge1: 0.431
+ - Rouge2: 0.0
+ - Rougel: 0.431
+ - Rougelsum: 0.431
+ - Bleu: 0.0322
+ - Gen Len: 11.75

  ## Model description

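As an aside to the evaluation block above: a minimal sketch of how a checkpoint like this is typically loaded and run with transformers. The Hub id `xsestech/skilltext` is an assumption pieced together from the committer and model name; this commit does not confirm it.

```python
# Minimal inference sketch for the fine-tuned seq2seq checkpoint.
# "xsestech/skilltext" is an assumed repo id; swap in the real Hub id
# or a local path to the saved model directory.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "xsestech/skilltext"  # assumption, not confirmed by the diff
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Input text to summarize."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
# Gen Len in the card hovers around 11-19 tokens, so a small cap is enough.
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```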
@@ -41,30 +44,55 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 2e-05
- - train_batch_size: 2
- - eval_batch_size: 2
+ - train_batch_size: 1
+ - eval_batch_size: 1
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 20
+ - num_epochs: 30
  - mixed_precision_training: Native AMP

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
- |:-------------:|:-------:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
- | No log | 1.6129 | 50 | 1.8405 | 16.798 | 2.7473 | 15.7599 | 15.6863 | 19.0 |
- | No log | 3.2258 | 100 | 1.4606 | 19.3942 | 7.4911 | 18.9911 | 18.8407 | 18.875 |
- | No log | 4.8387 | 150 | 1.3583 | 27.0146 | 13.8805 | 24.3188 | 24.2122 | 18.6875 |
- | No log | 6.4516 | 200 | 1.2490 | 32.855 | 15.9819 | 31.0776 | 30.8624 | 18.75 |
- | No log | 8.0645 | 250 | 1.1590 | 30.3762 | 11.8253 | 27.5559 | 27.2332 | 18.5625 |
- | No log | 9.6774 | 300 | 1.1469 | 37.2275 | 17.107 | 33.4177 | 33.3688 | 18.4375 |
- | No log | 11.2903 | 350 | 1.1364 | 34.3596 | 15.6845 | 30.8838 | 31.0842 | 18.625 |
- | No log | 12.9032 | 400 | 1.0927 | 34.9322 | 15.8027 | 30.2917 | 30.1379 | 18.6875 |
- | No log | 14.5161 | 450 | 1.0672 | 32.2753 | 15.7727 | 28.1883 | 27.8978 | 18.6875 |
- | 1.8948 | 16.1290 | 500 | 1.0721 | 37.6573 | 15.6507 | 32.7817 | 32.742 | 18.5625 |
- | 1.8948 | 17.7419 | 550 | 1.0692 | 34.958 | 15.3422 | 30.4656 | 30.3306 | 18.5 |
- | 1.8948 | 19.3548 | 600 | 1.0577 | 30.9205 | 11.9258 | 26.6497 | 26.4407 | 18.6875 |
+ | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bleu | Gen Len |
+ |:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:------:|:-------:|
+ | No log | 0.8065 | 50 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 1.6129 | 100 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 2.4194 | 150 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 3.2258 | 200 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 4.0323 | 250 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 4.8387 | 300 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 5.6452 | 350 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 6.4516 | 400 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | No log | 7.2581 | 450 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 8.0645 | 500 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 8.8710 | 550 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 9.6774 | 600 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 10.4839 | 650 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 11.2903 | 700 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 12.0968 | 750 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 12.9032 | 800 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 13.7097 | 850 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 14.5161 | 900 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 15.3226 | 950 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 16.1290 | 1000 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 16.9355 | 1050 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 17.7419 | 1100 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 18.5484 | 1150 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 19.3548 | 1200 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 20.1613 | 1250 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 20.9677 | 1300 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 21.7742 | 1350 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 22.5806 | 1400 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 23.3871 | 1450 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 24.1935 | 1500 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 25.0 | 1550 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 25.8065 | 1600 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 26.6129 | 1650 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 27.4194 | 1700 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 28.2258 | 1750 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 29.0323 | 1800 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |
+ | 0.0 | 29.8387 | 1850 | nan | 0.431 | 0.0 | 0.431 | 0.431 | 0.0322 | 11.75 |

  ### Framework versions
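The hyperparameter list in the hunk above maps onto transformers' `Seq2SeqTrainingArguments` roughly as sketched below; this is a hedged reconstruction, not the author's script, and `output_dir`, the 50-step eval cadence, and `predict_with_generate` are assumptions inferred from the results table. The constant `nan` validation loss is worth flagging: mt5 checkpoints are known to overflow under float16 autocast, so Native AMP in fp16 is a plausible culprit, commonly worked around by training in bf16 or full fp32.

```python
# Hedged reconstruction of the training arguments from the card.
# output_dir and the eval cadence are assumptions; the rest mirrors
# the values listed in the hyperparameters section.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="skilltext",          # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    fp16=True,                       # "Native AMP"; likely behind the nan loss on mt5
    evaluation_strategy="steps",     # the table logs metrics every 50 steps
    eval_steps=50,
    predict_with_generate=True,      # required to compute ROUGE/BLEU at eval time
)
```

No optimizer arguments are needed: Adam with betas=(0.9,0.999) and epsilon=1e-08 is the Trainer default.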
generation_config.json CHANGED
@@ -1,6 +1,6 @@
  {
    "decoder_start_token_id": 0,
-   "eos_token_id": 2,
+   "eos_token_id": 1,
    "pad_token_id": 0,
    "transformers_version": "4.40.0"
  }
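The lone change here, eos_token_id 2 → 1, is consistent with the base-model swap in the README: the mt5 sentencepiece vocabulary puts `</s>` at id 1 (with pad at 0), while the old value of 2 presumably matched the ruT5 tokenizer. A quick hedged check, reusing the assumed repo id from above:

```python
# Sanity-check that the generation config now agrees with the mt5 tokenizer.
from transformers import AutoTokenizer, GenerationConfig

tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
print(tokenizer.eos_token, tokenizer.eos_token_id)  # </s> 1

gen_cfg = GenerationConfig.from_pretrained("xsestech/skilltext")  # assumed repo id
assert gen_cfg.eos_token_id == tokenizer.eos_token_id  # the old value 2 would fail here
```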