aristoBERTo is a pre-trained model for Ancient Greek, a low-resource language. We initialized the pre-training with weights from [GreekBERT](https://huggingface.co/nlpaueb/bert-base-greek-uncased-v1), a Greek version of BERT pre-trained on a large corpus of Modern Greek (~30 GB of text). We then continued pre-training on an Ancient Greek corpus of about 900 MB, which was scraped from the web and post-processed to remove duplicate texts and editorial punctuation.
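
A minimal sketch of how such a warm-started continued pre-training run can be set up with the `transformers` library. The corpus path and training details below are illustrative assumptions, not the exact configuration used:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Start from GreekBERT's weights and vocabulary instead of a random initialization.
tokenizer = AutoTokenizer.from_pretrained("nlpaueb/bert-base-greek-uncased-v1")
model = AutoModelForMaskedLM.from_pretrained("nlpaueb/bert-base-greek-uncased-v1")

# From here, continue masked-language-model pre-training on the Ancient Greek
# corpus (e.g. with transformers' Trainer plus a DataCollatorForLanguageModeling);
# the hyperparameters for that step are not documented here.
```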
Applied to the processing of Ancient Greek, aristoBERTo outperforms xlm-roberta-base and mdeberta in most downstream tasks, such as the labeling of POS, MORPH, DEP, and LEMMA.
aristoBERTo is provided by the Diogenet project of the University of California, San Diego.
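
As a masked language model, aristoBERTo can also be queried directly for fill-mask predictions. A quick sketch with the `transformers` pipeline; the Hub id `Jacobo/aristoBERTo` and the example sentence are assumptions for illustration, so substitute this model's actual repository name:

```python
from transformers import pipeline

# Hub id assumed for illustration; replace with the model's actual repository name.
fill_mask = pipeline("fill-mask", model="Jacobo/aristoBERTo")

# Illustrative Ancient Greek sentence with one masked token.
for prediction in fill_mask("Πλάτων ὁ [MASK] ἦν."):
    print(prediction["token_str"], prediction["score"])
```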
## Intended uses