AudreyVM committed
Commit 0bd6ac8
1 Parent(s): de399a5

Update README.md

Files changed (1)
  1. README.md +4 -3
README.md CHANGED
@@ -117,9 +117,10 @@ We use the BLEU score for evaluation on the Flores test set: [Flores-101](https:
 Below are the evaluation results on the machine translation from Catalan to Italian compared to [Softcatalà](https://www.softcatala.org/) and [Google Translate](https://translate.google.es/?hl=es):
 | Test set | SoftCatalà | Google Translate |mt-aina-it-ca|
 |----------------------|------------|------------------|---------------|
-| Flores 101 dev | 25,4 | **30,4** | 26,6 |
-| Flores 101 devtest |26,6 | **31,2** | 27,2 |
-| Average | 26,0 | **30,8** | 29,6 |
+| Flores 101 dev | 25,4 | **30,4** | 27,5 |
+| Flores 101 devtest |26,6 | **31,2** | 27,7 |
+| NTREX | 29,3 | **33,5** | 30,7 |
+| Average | 27,1 | **31,7** | 28,6 |
 ## Additional information
 ### Author
 Language Technologies Unit (LangTech) at the Barcelona Supercomputing Center (langtech@bsc.es)
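For context, a minimal sketch of how BLEU scores like those in the table above could be computed with sacrebleu; the file names and the translation step are assumptions for illustration and are not part of this commit.

```python
# Sketch (assumed setup, not from this commit): score hypothetical model output
# against Flores devtest references with sacrebleu. File names are placeholders;
# the translations themselves would be produced separately (e.g. with fairseq).
import sacrebleu

# One sentence per line, hypotheses aligned with references.
with open("hypotheses.txt", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("flores101.devtest.ref.txt", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# corpus_bleu takes the hypothesis list and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.1f}")
```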