Update README.md
README.md CHANGED
@@ -6,9 +6,9 @@ license: apache-2.0
 GPyT is a GPT2 model trained from scratch (not fine-tuned) on Python code from GitHub. Overall, it was ~200GB of pure
 Python code; the current GPyT model is a mere 2 epochs through this data, so it may benefit greatly from continued training and/or fine-tuning.
 
-Newlines are replaced by "
+Newlines are replaced by "</N>"
 
-Input to the model is code, up to the context length of 1024, with newlines replaced by "
+Input to the model is code, up to the context length of 1024, with newlines replaced by "</N>"
 
 Here's a quick example of using this model:
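The newline substitution the README describes can be sketched in plain Python. This is a minimal illustration of the preprocessing idea, not code from the GPyT repository — the helper names `encode_newlines` and `decode_newlines` are invented here for clarity:

```python
# Token the README says stands in for real newlines in model input/output.
NEWLINE_TOKEN = "</N>"

def encode_newlines(source: str) -> str:
    """Replace literal newlines with the placeholder token before feeding
    Python source to the model."""
    return source.replace("\n", NEWLINE_TOKEN)

def decode_newlines(text: str) -> str:
    """Restore literal newlines in text generated by the model."""
    return text.replace(NEWLINE_TOKEN, "\n")

snippet = "import os\nprint(os.getcwd())"
encoded = encode_newlines(snippet)
print(encoded)  # import os</N>print(os.getcwd())

# Round-trip: decoding the encoded text recovers the original source.
assert decode_newlines(encoded) == snippet
```

In actual use, the encoded string (truncated to the model's 1024-token context length) would be tokenized and passed to the model, and the generated text decoded back the same way.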