
bert-base-bg-cs-pl-ru-cased

SlavicBERT [1] (Slavic: bg, cs, pl, ru; cased; 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on Russian news data and four Wikipedias: Bulgarian, Czech, Polish, and Russian. The subtoken vocabulary was built from this data, and Multilingual BERT was used as the initialization for SlavicBERT.
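
A minimal loading sketch using the Hugging Face transformers library. The model ID is the one on this page; the API calls are the standard transformers loading pattern, not code from the original card:

```python
# Minimal sketch: load SlavicBERT and extract contextual embeddings.
# Assumes `pip install transformers torch`; not part of the original card.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/bert-base-bg-cs-pl-ru-cased")
model = AutoModel.from_pretrained("DeepPavlov/bert-base-bg-cs-pl-ru-cased")

# Encode a Russian sentence (illustrative example) and run the encoder.
inputs = tokenizer("Пример предложения на русском языке.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```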

08.11.2021: uploaded the model with MLM and NSP heads.
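
Because the uploaded checkpoint includes the MLM head, it can be exercised directly with the standard transformers fill-mask pipeline. A hedged sketch; the Czech example sentence is illustrative and not from the original card:

```python
# Sketch: exercise the MLM head via the fill-mask pipeline.
# Assumes the checkpoint uses the standard BERT [MASK] token.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="DeepPavlov/bert-base-bg-cs-pl-ru-cased")

# Predict the masked token in a Czech sentence (illustrative example).
for pred in fill_mask("Praha je hlavní město [MASK] republiky."):
    print(pred["token_str"], round(pred["score"], 3))
```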

[1]: Arkhipov, M., Trofimova, M., Kuratov, Y., and Sorokin, A. (2019). Tuning Multilingual Transformers for Language-Specific Named Entity Recognition. In Proceedings of the 7th Workshop on Balto-Slavic Natural Language Processing. ACL Anthology: W19-3712.
