---
language:
  - fa
  - en
  - ru
license: mit
tags:
  - gpt3
  - transformers
  - mgpt
---

# 🇮🇷 Persian mGPT 1.3B

Language model for Persian. The model has 1.3B parameters, as its name suggests.

Persian belongs to the Indo-European language family. It is a richly poetic language with approximately 110 million speakers. Here are some facts about it:

  1. It is also known as Farsi and is predominantly spoken in Iran, Afghanistan, and Tajikistan.
  2. Persian has a rich literary tradition with iconic poets like Rumi, Hafez, and Ferdowsi.
  3. It uses the Persian script, which is a variant of the Arabic script.

## Technical details

It is one of the models derived from the base mGPT-XL (1.3B) model (see the list below), which was originally trained on 61 languages from 25 language families using the Wikipedia and C4 corpora.

We found additional data for 23 languages, most of which are considered low-resource, and decided to further tune the base model. Persian mGPT 1.3B was trained for another 200 steps with batch_size=4 and a context window of 2048 tokens on a single A100.

The final perplexity of this model on the validation set is 33.44.
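Perplexity is simply the exponential of the mean cross-entropy loss (in nats), so the reported validation perplexity implies a corresponding loss value. A quick sketch of the standard conversion (only the 33.44 figure comes from this card):

```python
import math

# Perplexity = exp(mean cross-entropy loss), so loss = ln(perplexity).
val_perplexity = 33.44  # reported validation perplexity for Persian mGPT 1.3B

val_loss = math.log(val_perplexity)  # cross-entropy loss in nats
print(f"validation loss ≈ {val_loss:.2f}")  # ≈ 3.51
```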

Chart of the training loss and perplexity:
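A minimal usage sketch with the 🤗 Transformers library. The repository id `ai-forever/mGPT-1.3B-persian` is assumed from this card's location on the Hub, and the sampling parameters are illustrative, not prescribed by the authors:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Repo id assumed from this model card's location on the Hub.
model_id = "ai-forever/mGPT-1.3B-persian"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "شعر فارسی"  # "Persian poetry"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; generation settings here are illustrative.
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that loading the 1.3B checkpoint requires a few GB of memory; a GPU is recommended but not required for short generations.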

## Other mGPT-1.3B models

## Feedback

If you find a bug or have additional data to train the model on for your language, please give us feedback.

The model will be improved over time. Stay tuned!