bhavitvyamalik committed on
Commit 824b874
1 Parent(s): e05931d

Upload README.md with huggingface_hub

Files changed (1):
README.md +4 -7
README.md CHANGED
@@ -2,9 +2,6 @@
  tags:
  - translation
  license: cc-by-4.0
- language:
- - en
- - zh
  ---

  ### Translation model for en-zh_hant OPUS_HPLT v1.0
@@ -16,14 +13,14 @@ This repository contains the model weights for translation models trained with M
  * Dataset: All of OPUS including HPLT
  * Model: transformer-base
  * Tokenizer: SentencePiece (Unigram)
- * Cleaning: We use OpusCleaner for cleaning the corpus. Details about rules used can be found in the filter files in [Github](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/en-zh_hant/raw/v2)
+ * Cleaning: We use OpusCleaner for cleaning the corpus. Details about rules used can be found in the filter files in [Github](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/zh_hant-en/raw/v2)

  To run inference with Marian, refer to the [Inference/Decoding/Translation](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0#inferencedecodingtranslation) section of our GitHub repository.


  ## Benchmarks

- | testset | BLEU | chr-F | comet |
+ | testset | BLEU | chr-F | COMET-22 |
  | -------------------------------------- | ---- | ----- | ----- |
- | flores200 | 23.4 | 18.5 | 0.8364 |
- | ntrex | 19.6 | 20.4 | 0.7934 |
+ | flores200 | 23.4 | 18.5 | 0.7896 |
+ | ntrex | 19.6 | 20.4 | 0.735 |
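For orientation, the inference step referenced in the README can also be scripted. The sketch below is a minimal Python wrapper around the `marian-decoder` CLI, assuming the binary is on PATH; the checkpoint and SentencePiece vocabulary file names are placeholders, not the actual file names in this repository (see the linked Inference/Decoding/Translation instructions for those).

```python
# Illustrative only: pipe one source sentence through marian-decoder.
# MODEL and VOCAB are assumed placeholder names, not this repository's files.
import subprocess

MODEL = "model.npz"              # assumed Marian checkpoint name
VOCAB = "model.en-zh_hant.spm"   # assumed shared SentencePiece vocabulary

def translate(sentence: str) -> str:
    """Run marian-decoder on a single sentence and return the translation."""
    result = subprocess.run(
        [
            "marian-decoder",
            "--models", MODEL,
            "--vocabs", VOCAB, VOCAB,  # same vocab used for source and target
            "--beam-size", "6",
        ],
        input=sentence,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    print(translate("Hello, world!"))
```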