---
language: code
thumbnail: https://doesnotexist.codes/messlab.png
tags:
- programming
- gpt2
- causal-lm
license: cc0-1.0
---
This is a GPT-2 774M (GPT-2 Large) model trained on the C/C++ source code of the 10,000 most popular packages in Debian, according to the Debian Popularity Contest. The model was originally trained with NVIDIA's Megatron-LM and has since been converted to the Hugging Face format. Note that the tokenizer is not the standard GPT-2 BPE vocabulary but one trained specifically for this dataset; the tokenizer is also available from this repository.

The source files were deduplicated using a process similar to the OpenWebText preprocessing: essentially, a locality-sensitive hash over each file is used to detect and drop near-duplicates.
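For intuition, here is a minimal sketch of what MinHash-style near-duplicate detection looks like. The exact scheme and parameters used for this dataset are not documented here, so the shingle size, number of permutations, and LSH banding below are illustrative assumptions, not the actual preprocessing code:

```python
# Sketch of near-duplicate detection via MinHash + LSH banding.
# All parameters are assumptions for illustration, not the values
# actually used to build this dataset.
import hashlib
import re
from itertools import combinations

NUM_PERM = 128       # hash permutations per signature (assumed)
BANDS, ROWS = 32, 4  # 32 bands of 4 rows each (assumed)


def shingles(text: str, n: int = 5) -> set[bytes]:
    """Token n-grams ("shingles") of a source file."""
    tokens = re.findall(r"\w+|[^\w\s]", text)
    return {" ".join(tokens[i:i + n]).encode() for i in range(len(tokens) - n + 1)}


def minhash(shingle_set: set[bytes]) -> list[int]:
    """MinHash signature: for each seed, the minimum seeded hash over all shingles."""
    if not shingle_set:
        return [0] * NUM_PERM  # degenerate case: empty or tiny file
    sig = []
    for seed in range(NUM_PERM):
        seed_bytes = seed.to_bytes(4, "big")
        sig.append(min(
            int.from_bytes(hashlib.sha1(seed_bytes + s).digest()[:8], "big")
            for s in shingle_set
        ))
    return sig


def near_duplicates(files: dict[str, str]) -> set[tuple[str, str]]:
    """Bucket signatures band by band; files colliding in any band are
    candidate near-duplicate pairs."""
    sigs = {name: minhash(shingles(text)) for name, text in files.items()}
    buckets: dict[tuple, list[str]] = {}
    for name, sig in sigs.items():
        for b in range(BANDS):
            key = (b, tuple(sig[b * ROWS:(b + 1) * ROWS]))
            buckets.setdefault(key, []).append(name)
    return {pair for names in buckets.values() if len(names) > 1
            for pair in combinations(sorted(names), 2)}
```

The banding parameters trade precision against recall: more rows per band make collisions stricter, fewer make them looser, so any real pipeline would tune them against a target similarity threshold.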
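Since the model and its custom tokenizer have been converted to the Hugging Face format, the standard `transformers` loading pattern should apply. The repository id below is a placeholder, not the actual model id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# "user/gpt2-cpp" is a placeholder; substitute this repository's actual id.
# The custom tokenizer ships with the repo, so AutoTokenizer should pick it up.
tokenizer = AutoTokenizer.from_pretrained("user/gpt2-cpp")
model = AutoModelForCausalLM.from_pretrained("user/gpt2-cpp")

prompt = "#include <stdio.h>\n\nint main("
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0]))
```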