
Model Card for dappyx/QazDistilbertFast-tokenizer

A DistilBERT fast tokenizer trained on the KazQAD dataset for Kazakh.

Model Details

Model Description

  • Model type: DistilBERT
  • Language(s) (NLP): Kazakh
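A tokenizer like this can be loaded with the standard `transformers` API. The sketch below assumes the repository ID `dappyx/QazDistilbertFast-tokenizer` shown on this page and requires network access to the Hugging Face Hub:

```python
# Minimal usage sketch (assumes the repo ID from this page; downloads from the Hub).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dappyx/QazDistilbertFast-tokenizer")

# Tokenize a short Kazakh phrase.
tokens = tokenizer.tokenize("Қазақстан Республикасы")
print(tokens)
```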

Training Details

Training Data

KazQAD: https://github.com/IS2AI/KazQAD/

Environmental Impact

  • Hardware Type: TPUv2
  • Hours used: less than one minute
  • Cloud Provider: Google Colab

Dataset used to train dappyx/QazDistilbertFast-tokenizer: KazQAD (https://github.com/IS2AI/KazQAD/)