Update README.md
README.md CHANGED
@@ -1,8 +1,7 @@
 # Logion: Machine Learning for Greek Philology
+# (for the most recent model, see: https://huggingface.co/cabrooks/LOGION-50k_wordpiece)
 
-(
-
-The most advanced Ancient Greek BERT model trained to date! Read the paper on [arxiv](https://arxiv.org/abs/2305.01099) by Charlie Cowen-Breen, Creston Brooks, Johannes Haubold, and Barbara Graziosi.
+Read the paper on [arxiv](https://arxiv.org/abs/2305.01099) by Charlie Cowen-Breen, Creston Brooks, Johannes Haubold, and Barbara Graziosi.
 
 Originally based on the pre-trained weights and tokenizer made available by Pranaydeep Singh's [Ancient Greek BERT](https://huggingface.co/pranaydeeps/Ancient-Greek-BERT), we train on a corpus of over 70 million words of premodern Greek.
 
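The updated README points readers to the LOGION-50k_wordpiece checkpoint on the Hugging Face Hub. As a minimal sketch (not part of this commit), assuming that repository hosts a standard BERT masked-language-model checkpoint, it could be loaded and queried with the `transformers` library roughly like this; the example sentence and the top-k display are illustrative only:

```python
# Minimal sketch: loading the checkpoint referenced in the README.
# Assumption: "cabrooks/LOGION-50k_wordpiece" is a standard BERT masked-LM
# repo on the Hugging Face Hub; mask-token conventions may differ in practice.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("cabrooks/LOGION-50k_wordpiece")
model = AutoModelForMaskedLM.from_pretrained("cabrooks/LOGION-50k_wordpiece")
model.eval()

# Fill a masked token in a (hypothetical) premodern Greek sentence.
text = f"τοῦτο δὲ τὸ {tokenizer.mask_token} ἐστιν."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and print the five highest-scoring replacements.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_pos].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```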