---
language: ti
widget:
- text: "ዓቕሚ ደቂኣንስትዮ [MASK] ብግብሪ ተራእዩ"
---

# BERT Base for Tigrinya Language

We pretrain a BERT base-uncased model for Tigrinya on a dataset of 40 million tokens, training for 40 epochs.

This repo contains the original pretrained Flax model, which was trained on a TPU v3-8, along with its corresponding PyTorch version.
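
The model can be queried for masked-token prediction, as in the widget example above. A minimal sketch using the `transformers` fill-mask pipeline follows; the repo id is a placeholder, not this model's actual Hub path:

```python
from transformers import pipeline

# Placeholder repo id -- substitute this model's actual Hugging Face Hub path.
fill_mask = pipeline("fill-mask", model="your-org/tigrinya-bert-base")

# The widget example from the front matter: a Tigrinya sentence with one masked token.
predictions = fill_mask("ዓቕሚ ደቂኣንስትዮ [MASK] ብግብሪ ተራእዩ")
for p in predictions:
    print(p["token_str"], p["score"])
```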

## Hyperparameters

The hyperparameters for the BASE model size are as follows:

| Model Size | L  | AH | HS  | FFN  | P    | Seq  |
|------------|----|----|-----|------|------|------|
| BASE       | 12 | 12 | 768 | 3072 | 110M | 512  |

(L = number of layers; AH = number of attention heads; HS = hidden size; FFN = feedforward network dimension; P = number of parameters; Seq = maximum sequence length.)
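
As a cross-check, these hyperparameters map onto a standard `transformers` `BertConfig` as sketched below. The vocabulary size is not stated in the table, so the BERT default is kept here as an assumption:

```python
from transformers import BertConfig

# BASE hyperparameters from the table above; vocab_size is not given
# in the table and is left at the BertConfig default (an assumption).
config = BertConfig(
    num_hidden_layers=12,        # L
    num_attention_heads=12,      # AH
    hidden_size=768,             # HS
    intermediate_size=3072,      # FFN
    max_position_embeddings=512  # Seq
)
```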