Natively pretrained Arabic GPT-2 models in three sizes (0.1B, 0.3B, and 0.8B parameters), trained on 20B+ Arabic tokens.
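
A minimal usage sketch, assuming the checkpoints are published on the Hugging Face Hub in the standard GPT-2 format; the repository ID below is a placeholder, not the project's actual model name:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo ID for the 0.1B model; substitute the real one.
MODEL_ID = "org-name/arabic-gpt2-0.1b"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Generate a short Arabic continuation from a prompt.
prompt = "اللغة العربية"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same snippet works for the 0.3B and 0.8B variants by swapping the repository ID, since all three share the GPT-2 architecture and tokenizer interface.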