Broomva committed on
Commit b7d6477
1 Parent(s): 874274b

End of training

Files changed (2)
  1. README.md +17 -2
  2. generation_config.json +0 -1
README.md CHANGED
@@ -3,6 +3,8 @@ license: apache-2.0
base_model: t5-large
tags:
- generated_from_trainer
+ metrics:
+ - bleu
model-index:
- name: t5-large-translation-spa-guc
  results: []
@@ -14,6 +16,10 @@ should probably proofread and complete it, then remove this comment. -->
# t5-large-translation-spa-guc

This model is a fine-tuned version of [t5-large](https://huggingface.co/t5-large) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.0832
+ - Bleu: 0.952
+ - Gen Len: 17.9397

## Model description

@@ -33,14 +39,23 @@ More information needed

The following hyperparameters were used during training:
- learning_rate: 2e-05
- - train_batch_size: 16
- - eval_batch_size: 16
+ - train_batch_size: 8
+ - eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 3

+ ### Training results
+
+ | Training Loss | Epoch | Step  | Validation Loss | Bleu   | Gen Len |
+ |:-------------:|:-----:|:-----:|:---------------:|:------:|:-------:|
+ | 1.3886        | 1.0   | 12889 | 1.2435          | 0.7737 | 17.9241 |
+ | 1.3043        | 2.0   | 25778 | 1.1197          | 0.9071 | 17.9235 |
+ | 1.2024        | 3.0   | 38667 | 1.0832          | 0.952  | 17.9397 |
+
+
### Framework versions

- Transformers 4.35.2
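
For reference, the hyperparameters listed in the updated card correspond to a standard `transformers` Seq2Seq training setup. A minimal sketch, assuming `Seq2SeqTrainingArguments` was used; the actual training script is not part of this commit, and the `output_dir` value is hypothetical:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the configuration implied by the card's hyperparameter list.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-large-translation-spa-guc",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,   # was 16 before this commit
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=3,
    predict_with_generate=True,      # needed to compute Bleu and Gen Len at eval time
)
```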
generation_config.json CHANGED
@@ -1,5 +1,4 @@
{
- "_from_model_config": true,
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,