musabg committed
Commit 6a384ec (1 parent: 94ec8b7)

Update README.md

Files changed (1)
  1. README.md +15 -23
README.md CHANGED
@@ -20,33 +20,15 @@ model-index:
   - name: Rouge1
   type: rouge
   value: 56.4468
+ language:
+ - tr
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
- # mt5-xl-tr-summarization
+ # mT5-Xl Turkish Summarization

  This model is a fine-tuned version of [google/mt5-xl](https://huggingface.co/google/mt5-xl) on the musabg/wikipedia-tr-summarization dataset.
- It achieves the following results on the evaluation set:
- - Loss: 0.5676
- - Rouge1: 56.4468
- - Rouge2: 41.3258
- - Rougel: 48.1909
- - Rougelsum: 48.4284
- - Gen Len: 75.9265
-
- ## Model description
-
- More information needed
-
- ## Intended uses & limitations

- More information needed
-
- ## Training and evaluation data
-
- More information needed
+ This can be used with the HF summarization pipeline.

  ## Training procedure

@@ -63,6 +45,16 @@ The following hyperparameters were used during training:
  - lr_scheduler_type: linear
  - num_epochs: 1.0

+ ### Eval results
+
+ It achieves the following results on the evaluation set:
+ - Loss: 0.5676
+ - Rouge1: 56.4468
+ - Rouge2: 41.3258
+ - Rougel: 48.1909
+ - Rougelsum: 48.4284
+ - Gen Len: 75.9265
+
  ### Training results


@@ -72,4 +64,4 @@ The following hyperparameters were used during training:
  - Transformers 4.31.0.dev0
  - Pytorch 1.13.1
  - Datasets 2.12.0
- - Tokenizers 0.13.3
+ - Tokenizers 0.13.3
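
The updated card states that the model can be used with the Hugging Face summarization pipeline. A minimal usage sketch is shown below; the repository id `musabg/mt5-xl-tr-summarization` is an assumption taken from the card's original title, and the sample text and generation lengths are placeholders.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint through the summarization
# pipeline and summarize a Turkish passage.
# NOTE: the repo id below is assumed from the card's original title.
summarizer = pipeline("summarization", model="musabg/mt5-xl-tr-summarization")

article = (
    "Türkiye, Avrupa ile Asya kıtaları arasında yer alan bir ülkedir. "
    "Başkenti Ankara, en kalabalık şehri ise İstanbul'dur."
)

# Generation lengths are illustrative; the reported Gen Len averaged ~76 tokens.
result = summarizer(article, max_length=128, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

In practice, longer Wikipedia-style articles (matching the musabg/wikipedia-tr-summarization training data) are the intended input; very short inputs may yield near-verbatim summaries.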