TristanBehrens committed 1e735e8 (parent: da2ec8e): Create README.md
---
tags:
- gpt2
- text-generation
- music-modeling
- music-generation
widget:
- text: PIECE_START
- text: >-
    PIECE_START PIECE_START TRACK_START INST=34 DENSITY=8
- text: >-
    PIECE_START TRACK_START INST=1
license: apache-2.0
---

# GPT-2 for Music

Language models such as GPT-2 can be used for music generation. The idea is to represent pieces of music as text, effectively reducing the task to language generation.
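The text representation can be sketched as follows. This is a minimal, hypothetical parser assuming a vocabulary in the spirit of the widget prompts above (`PIECE_START`, `TRACK_START`, and `KEY=VALUE` attribute tokens such as `INST=34`); the model's actual vocabulary is defined by its tokenizer.

```python
def parse_events(text: str):
    """Split a space-separated token string into (token, value) event tuples.

    Plain tokens such as PIECE_START map to (token, None); attribute tokens
    such as INST=34 are split into (key, value). Hypothetical sketch only.
    """
    events = []
    for token in text.split():
        if "=" in token:
            key, value = token.split("=", 1)
            events.append((key, value))
        else:
            events.append((token, None))
    return events

print(parse_events("PIECE_START TRACK_START INST=34 DENSITY=8"))
# → [('PIECE_START', None), ('TRACK_START', None), ('INST', '34'), ('DENSITY', '8')]
```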

This model is a rather small instance of GPT-2 trained on the [Lakhclean dataset](https://colinraffel.com/projects/lmd/). The model generates 4 bars at a time at 16th-note resolution in 4/4 meter.
22 |
+
|
23 |
+
If you want to contribute, if you want to say hello, if you want to know more, find me on [LinkedIn](https://www.linkedin.com/in/dr-tristan-behrens-734967a2/)
|
24 |
+
|
25 |
+
## Model description
|
26 |
+
|
27 |
+
The model is GPT-2 with 6 decoders and 8 attention-heads each. The context length is 2048. The embedding dimensions are 512 as well.
|
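The architecture described above can be expressed as a `GPT2Config`. This is a sketch, not the checkpoint itself: `vocab_size` is a placeholder, since the real value comes from this model's tokenizer.

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Sketch of the described architecture; vocab_size is a placeholder,
# not the model's real vocabulary size.
config = GPT2Config(
    vocab_size=1000,
    n_positions=2048,  # context length
    n_embd=512,        # embedding dimension
    n_layer=6,         # decoder blocks
    n_head=8,          # attention heads per block
)
model = GPT2LMHeadModel(config)
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
```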

## Intended uses & limitations

This model is a proof of concept. It shows that the Hugging Face ecosystem can be used to compose music.

### How to use

There is a notebook in the repo that you can use to generate symbolic music and then render it.
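The generation API can be sketched as below. This is not the notebook: it uses randomly initialized weights of the same shape as a stand-in, where you would load the real checkpoint with `from_pretrained` and encode a prompt such as `PIECE_START` with the model's own tokenizer; both the dummy ids and the placeholder vocabulary size are assumptions.

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Randomly initialized stand-in; the real checkpoint would be loaded with
# GPT2LMHeadModel.from_pretrained(<repo id>) instead.
config = GPT2Config(vocab_size=1000, n_positions=2048, n_embd=512, n_layer=6, n_head=8)
model = GPT2LMHeadModel(config).eval()

# A prompt such as "PIECE_START TRACK_START INST=1" would be encoded by the
# model's tokenizer; dummy token ids stand in here.
prompt_ids = torch.tensor([[0, 1, 2, 3]])
with torch.no_grad():
    generated = model.generate(
        prompt_ids,
        max_length=32,
        do_sample=True,
        pad_token_id=0,
    )
print(generated.shape)  # the sampled ids would be decoded back into music tokens
```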

### Limitations and bias

Since this model has been trained on a very small corpus of music, it overfits heavily.
40 |
+
|
41 |
+
### Acknowledgements
|
42 |
+
|
43 |
+
This model has been created with support from NVIDIA. I am very grateful for the GPU compute they provided!
|