# Logion: Machine Learning for Greek Philology
> **Note:** a more recent model is available at [cabrooks/LOGION-50k_wordpiece](https://huggingface.co/cabrooks/LOGION-50k_wordpiece).
The most advanced Ancient Greek BERT model trained to date! Read the paper on [arxiv](https://arxiv.org/abs/2305.01099) by Charlie Cowen-Breen, Creston Brooks, Johannes Haubold, and Barbara Graziosi.
Starting from the pre-trained weights and tokenizer made available by Pranaydeep Singh's [Ancient Greek BERT](https://huggingface.co/pranaydeeps/Ancient-Greek-BERT), we train on a corpus of over 70 million words of premodern Greek.
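Since the model is a BERT-style masked-language model hosted on the Hugging Face Hub, it can be loaded with the standard `transformers` API. The sketch below assumes the newer model ID linked in the note above; the Greek example sentence is an arbitrary illustration, not from the paper.

```python
# Minimal sketch: masked-token prediction with the Logion model.
# Model ID taken from the README note; example text is illustrative only.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "cabrooks/LOGION-50k_wordpiece"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# fill-mask pipeline returns the top-k candidate tokens for the [MASK] slot
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
predictions = fill(f"ἐν ἀρχῇ ἦν ὁ {tokenizer.mask_token}")

for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict with the candidate token, its score, and the completed sequence, which is the usual starting point for philological tasks such as flagging or restoring corrupted words.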