Update README.md
README.md CHANGED
@@ -3,10 +3,7 @@
<img src="https://huggingface.co/UBC-NLP/AraT5-base/resolve/main/AraT5_CR_new.png" alt="AraT5" width="45%" height="35%" align="right"/>
-This is the repository accompanying our paper [AraT5: Text-to-Text Transformers for Arabic Language Understanding and Generation](https://arxiv.org/abs/2109.12068). In this repository we:
-* Introduce **AraT5<sub>MSA</sub>**, **AraT5<sub>Tweet</sub>**, and **AraT5**: three powerful Arabic-specific text-to-text Transformer-based models;
-* Introduce **ARGEN**: a new benchmark for Arabic language generation and evaluation covering seven Arabic NLP tasks, namely ```machine translation```, ```summarization```, ```news title generation```, ```question generation```, ```paraphrasing```, ```transliteration```, and ```code-switched translation```;
-* Evaluate ```AraT5``` models on ```ARGEN``` and compare them against available language models.
+This is the repository accompanying our paper [AraT5: Text-to-Text Transformers for Arabic Language Understanding and Generation](https://arxiv.org/abs/2109.12068). In this repository we introduce **AraT5<sub>MSA</sub>**, **AraT5<sub>Tweet</sub>**, and **AraT5**: three powerful Arabic-specific text-to-text Transformer-based models.
---
@@ -67,4 +64,3 @@ If you use our models (Arat5-base, Arat5-msa-base, Arat5-tweet-base, Arat5-msa-s
## Acknowledgments
We gratefully acknowledge support from the Natural Sciences and Engineering Research Council of Canada, the Social Sciences and Humanities Research Council of Canada, the Canadian Foundation for Innovation, [ComputeCanada](https://www.computecanada.ca), and [UBC ARC-Sockeye](https://doi.org/10.14288/SOCKEYE). We also thank the [Google TensorFlow Research Cloud (TFRC)](https://www.tensorflow.org/tfrc) program for providing us with free TPU access.
-(TFRC)](https://www.tensorflow.org/tfrc) program for providing us with free TPU access.
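For context, the checkpoints referenced above are published on the Hugging Face Hub. Below is a minimal, hypothetical usage sketch (not part of this commit), assuming the `UBC-NLP/AraT5-base` model ID visible in the image URL above and standard `transformers` seq2seq loading; the input text and generation settings are illustrative only:

```python
# Hypothetical sketch (not part of this commit): load an AraT5 checkpoint
# with Hugging Face transformers. The model ID is taken from the image URL
# in the README; inputs and generation settings are illustrative assumptions,
# not the paper's exact configuration.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "UBC-NLP/AraT5-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# AraT5 follows the T5 text-to-text interface: encode a source string,
# generate, then decode the generated token IDs back to text.
inputs = tokenizer("...", return_tensors="pt")  # replace "..." with Arabic input text
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```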