This repository contains the translation model for ar-en trained with HPLT data.
You can also read our deliverable report [here](https://hplt-project.org/HPLT_D5_1___Translation_models_for_select_language_pairs.pdf) for more details.

### Usage

*Note* that for quality considerations, we recommend using [`HPLT/translate-ar-en-v1.0-hplt_opus`](https://huggingface.co/HPLT/translate-ar-en-v1.0-hplt_opus) instead of this model.

The model has been trained with Marian. To run inference, refer to the [Inference/Decoding/Translation](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0#inferencedecodingtranslation) section of our GitHub repository.
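For orientation, a hypothetical `marian-decoder` invocation is sketched below. The file names `model.npz`, `vocab.spm`, `input.ar`, and `output.en` are placeholders, not the actual release artifacts; refer to the linked GitHub section for the exact commands and configuration used for these models.

```shell
# Sketch only: adjust paths to the released checkpoint and vocabulary.
# marian-decoder ships with Marian (https://marian-nmt.github.io/).
marian-decoder \
  --models model.npz \
  --vocabs vocab.spm vocab.spm \
  --beam-size 6 \
  --normalize 1 \
  < input.ar > output.en
```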
The model can be used with the Hugging Face framework if the weights are converted to the Hugging Face format. We might provide this in the future; contributions are also welcome.
## Benchmarks
| testset | BLEU | chrF++ | COMET22 |
|---------|------|--------|---------|

This project has received funding from the European Union's Horizon Europe research and innovation programme under grant agreement No 101070350 and from UK Research and Innovation (UKRI) under the UK government's Horizon Europe funding guarantee [grant number 10052546].
Brought to you by researchers from the University of Edinburgh, Charles University in Prague, and the whole HPLT consortium.