This is a MicroBERT model for Maltese.

  • Its suffix is -m, which means that it was pretrained using supervision from masked language modeling.
  • The unlabeled Maltese data was taken from a February 2022 dump of Maltese Wikipedia, totaling 2,113,223 tokens.
  • The UD treebank UD_Maltese-MUDT, v2.9, totaling 44,162 tokens, was used for labeled data.

Please see the repository and the paper for more details.
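
As a minimal usage sketch, the model can be loaded with the Hugging Face transformers library for masked-token prediction; the model identifier and the example sentence below are placeholders, not taken from the repository.

```python
# Minimal masked-LM usage sketch with the transformers library.
# The model ID below is a placeholder; substitute this model's actual Hub ID.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "your-namespace/microbert-maltese-m"  # placeholder ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Predict the masked token in a Maltese sentence
# ("Malta is a [MASK] in the Mediterranean").
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask(f"Malta hija {tokenizer.mask_token} fil-Mediterran."):
    print(prediction["token_str"], prediction["score"])
```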
