thodel committed on
Commit: f45a3d6
Parent: 372376a

Add contextual information

Files changed (1):
  1. README.md +7 -5
README.md CHANGED
@@ -7,6 +7,8 @@ metrics:
 model-index:
 - name: 22_12_13_luther_blocks_larger_fp16_20ep
   results: []
+language:
+- de
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -14,22 +16,22 @@ should probably proofread and complete it, then remove this comment. -->
 
 # 22_12_13_luther_blocks_larger_fp16_20ep
 
-This model is a fine-tuned version of [stefan-it/german-gpt2-larger](https://huggingface.co/stefan-it/german-gpt2-larger) on an unknown dataset.
+This model is a fine-tuned version of [stefan-it/german-gpt2-larger](https://huggingface.co/stefan-it/german-gpt2-larger) on a dataset of texts by Martin Luther.
 It achieves the following results on the evaluation set:
 - Loss: 3.5847
 - Accuracy: 0.3168
 
 ## Model description
 
-More information needed
+This is a language model used to generate New Year's wishes for the readers of "reformiert", a journal in Switzerland (https://www.reformiert.info).
 
 ## Intended uses & limitations
 
-More information needed
+This model was built to test the capabilities of the GPT-2 transformer architecture.
 
 ## Training and evaluation data
 
-More information needed
+An automatic split of an edited and "cleaned" version of parts of Luther's writings. Cleaning here refers to removing para-texts such as page numbers, footnotes, etc.
 
 ## Training procedure
 
@@ -70,4 +72,4 @@ The following hyperparameters were used during training:
 - Transformers 4.26.0.dev0
 - Pytorch 1.13.0
 - Datasets 2.7.1
-- Tokenizers 0.12.1
+- Tokenizers 0.12.1
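
For context, a minimal sketch of trying the fine-tuned model with the `transformers` text-generation pipeline. The hub ID `thodel/22_12_13_luther_blocks_larger_fp16_20ep` is an assumption inferred from the committer name and the model name in this card; substitute the actual repository ID.

```python
# Minimal sketch: generating German text with the fine-tuned model.
# Assumption: the model is hosted on the Hugging Face Hub under
# "thodel/22_12_13_luther_blocks_larger_fp16_20ep" (inferred, not confirmed).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="thodel/22_12_13_luther_blocks_larger_fp16_20ep",
)

# Prompt in German, matching the model's Luther-style training data
# ("Ein gesegnetes neues Jahr" = "A blessed new year").
result = generator("Ein gesegnetes neues Jahr", max_new_tokens=50, do_sample=True)
print(result[0]["generated_text"])
```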