---
language:
- en
- sw
tags:
- translation
license: cc-by-4.0
---
### Translation model for en-sw (HPLT v1.0)
This repository contains the model weights for translation models trained with Marian for the HPLT project. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.
* Source language: en
* Target language: sw
* Dataset: HPLT only
* Model: transformer-base
* Tokenizer: SentencePiece (Unigram); see the tokenization sketch after this list
* Cleaning: We used OpusCleaner to clean the corpus. Details about the rules used can be found in the filter files on [GitHub](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/en-sw/raw/v0)
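As a quick illustration of the SentencePiece (Unigram) tokenization step, the sketch below encodes a sentence with the `sentencepiece` Python package. The vocabulary filename is an assumption; replace it with the `.spm` file shipped in this repository.

```python
import sentencepiece as spm

# Load the SentencePiece Unigram model shipped with the checkpoint.
# NOTE: "model.en-sw.spm" is a placeholder filename, not the actual artifact name.
sp = spm.SentencePieceProcessor(model_file="model.en-sw.spm")

# Encode a source sentence into subword pieces and ids.
pieces = sp.encode("The weather is nice today.", out_type=str)
ids = sp.encode("The weather is nice today.", out_type=int)
print(pieces)
print(ids)

# Decode the ids back into a detokenized string.
print(sp.decode(ids))
```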
To run inference with Marian, refer to the [Inference/Decoding/Translation](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0#inferencedecodingtranslation) section of our GitHub repository.
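For a quick local test, a minimal sketch is shown below that pipes raw source text through the `marian-decoder` command-line tool from Python. The model and vocabulary filenames are assumptions, and the decoding settings (beam size, threads) are illustrative only; use the decoder configuration from the HPLT-MT-Models repository for reproducible results.

```python
import subprocess

# Placeholder file names: replace with the artifacts shipped in this repository.
MODEL = "model.npz"
VOCAB = "model.en-sw.spm"

def translate(sentences):
    """Translate English sentences to Swahili with marian-decoder.

    Assumes marian-decoder is on PATH and that the vocabulary is a
    SentencePiece model, so Marian tokenizes and detokenizes internally.
    """
    proc = subprocess.run(
        [
            "marian-decoder",
            "--models", MODEL,
            "--vocabs", VOCAB, VOCAB,  # shared source/target vocabulary (assumption)
            "--beam-size", "6",
            "--cpu-threads", "4",      # drop this and add --devices 0 to decode on a GPU
        ],
        input="\n".join(sentences),
        capture_output=True,
        text=True,
        check=True,
    )
    return proc.stdout.strip().split("\n")

print(translate(["The weather is nice today."]))
```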
## Benchmarks
| test set  | BLEU | chrF | COMET  |
| --------- | ---- | ---- | ------ |
| flores200 | 28.4 | 54.6 | 0.8316 |
| ntrex     | 30.5 | 55.2 | 0.8221 |
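The sketch below shows how corpus-level BLEU and chrF scores of this kind can be computed with `sacrebleu`; the input file names are placeholders, and the exact COMET checkpoint behind the scores above is not stated here. The evaluation scripts in the HPLT-MT-Models repository are the authoritative setup.

```python
import sacrebleu

# Placeholder inputs: system outputs and references, one sentence per line.
with open("hyp.sw") as f:
    hypotheses = [line.strip() for line in f]
with open("ref.sw") as f:
    references = [line.strip() for line in f]

bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])
print(f"BLEU: {bleu.score:.1f}")
print(f"chrF: {chrf.score:.1f}")

# COMET (the checkpoint used for the scores above is not specified here;
# Unbabel/wmt22-comet-da is a common choice and only an assumption):
# from comet import download_model, load_from_checkpoint
# comet_model = load_from_checkpoint(download_model("Unbabel/wmt22-comet-da"))
# data = [{"src": s, "mt": h, "ref": r} for s, h, r in zip(sources, hypotheses, references)]
# print(comet_model.predict(data, batch_size=8, gpus=0).system_score)
```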