# deep-haiku-gpt-2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the [haiku](https://huggingface.co/datasets/statworx/haiku) dataset.
## Model description
The model is a fine-tuned version of GPT-2 for the generation of [Haikus](https://en.wikipedia.org/wiki/Haiku). The model, data, and training procedure are inspired by a [blog post by Robert A. Gonsalves](https://towardsdatascience.com/deep-haiku-teaching-gpt-j-to-compose-with-syllable-patterns-5234bca9701). Instead of an 8-bit version of GPT-J 6B, we used vanilla GPT-2. From what we saw, the performance is comparable, but GPT-2 is much easier to fine-tune.

We used the same multitask training approach as in the post, but significantly extended the dataset (almost double the size of the original one). A prepared version of the dataset can be found [here](https://huggingface.co/datasets/statworx/haiku).
## Intended uses & limitations
The model is intended to generate Haikus. To do so, it was trained using a multitask learning approach (see [Caruana 1997](http://www.cs.cornell.edu/~caruana/mlj97.pdf)) with the following four tasks:
- topic2graphemes `(keywords = text)`
- topic2phonemes `<keyword_phonemes = text_phonemes>`
- graphemes2phonemes `[text = text_phonemes]`
- phonemes2graphemes `{text_phonemes = text}`
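
The four tasks above differ only in the delimiter pair that wraps the prompt. As a minimal sketch (the dictionary and helper names below are illustrative, not part of the released code), building a prompt for any task can look like this:

```python
# Illustrative mapping from task name to its delimiter pair,
# mirroring the task list above.
TASK_DELIMITERS = {
    "topic2graphemes": ("(", ")"),
    "topic2phonemes": ("<", ">"),
    "graphemes2phonemes": ("[", "]"),
    "phonemes2graphemes": ("{", "}"),
}

def make_prompt(task: str, source: str) -> str:
    """Build a generation prompt such as '(dog rain =' for the given task."""
    open_delim, _close_delim = TASK_DELIMITERS[task]
    return f"{open_delim}{source} ="

print(make_prompt("topic2graphemes", "dog rain"))  # (dog rain =
```

The model is then expected to complete the prompt with the target text and the matching closing delimiter.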

To use the model, use an appropriate prompt like `(dog rain =` and let the model generate a Haiku given the keywords.
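
A minimal generation sketch with the `transformers` text-generation pipeline could look like the following. The model id is an assumption (adjust it to the actual Hub repo), and the extraction helper assumes generations follow the `(keywords = text)` pattern from training:

```python
def generate_haiku(keywords: str, model_id: str = "fabianmmueller/deep-haiku-gpt-2") -> str:
    """Sample a Haiku for the given keywords. model_id is an assumed repo name."""
    # Imported lazily so the pure-string helper below works without transformers.
    from transformers import pipeline

    generator = pipeline("text-generation", model=model_id)
    prompt = f"({keywords} ="  # topic2graphemes prompt format
    out = generator(prompt, max_new_tokens=64, do_sample=True, top_p=0.95)
    return extract_haiku(out[0]["generated_text"])

def extract_haiku(generated: str) -> str:
    """Keep only the text between '=' and the first closing ')'."""
    body = generated.split("=", 1)[1] if "=" in generated else generated
    return body.split(")", 1)[0].strip()
```

Calling `generate_haiku("dog rain")` downloads the model on first use and returns one sampled Haiku; since sampling is enabled, repeated calls give different results.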
## Training and evaluation data