
### opus-mt-en-de

* source languages: en
* target languages: de
* OPUS readme: [en-de](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/en-de/README.md)

* dataset: opus
* model: transformer-align (see the usage sketch after this list)
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-02-26.zip](https://object.pouta.csc.fi/OPUS-MT-models/en-de/opus-2020-02-26.zip)
* test set translations: [opus-2020-02-26.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-de/opus-2020-02-26.test.txt)
* test set scores: [opus-2020-02-26.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-de/opus-2020-02-26.eval.txt)
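
For quick use in Python, the converted checkpoint can be loaded with the Hugging Face `transformers` Marian classes. This is a minimal sketch, assuming the model is available on the Hub under the id `Helsinki-NLP/opus-mt-en-de` and that `transformers` with a PyTorch backend is installed.

```python
# Minimal usage sketch (assumptions: transformers + PyTorch installed,
# checkpoint published on the Hub as "Helsinki-NLP/opus-mt-en-de").
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"  # assumed Hub id for this card
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

src_texts = ["Machine translation is useful."]
batch = tokenizer(src_texts, return_tensors="pt", padding=True)
translated = model.generate(**batch)  # generation with the model's default settings
print(tokenizer.batch_decode(translated, skip_special_tokens=True))  # German output
```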

## Benchmarks

| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newssyscomb2009.en.de | 23.5 | 0.540 |
| news-test2008.en.de | 23.5 | 0.529 |
| newstest2009.en.de | 22.3 | 0.530 |
| newstest2010.en.de | 24.9 | 0.544 |
| newstest2011.en.de | 22.5 | 0.524 |
| newstest2012.en.de | 23.0 | 0.525 |
| newstest2013.en.de | 26.9 | 0.553 |
| newstest2015-ende.en.de | 31.1 | 0.594 |
| newstest2016-ende.en.de | 37.0 | 0.636 |
| newstest2017-ende.en.de | 29.9 | 0.586 |
| newstest2018-ende.en.de | 45.2 | 0.690 |
| newstest2019-ende.en.de | 40.9 | 0.654 |
| Tatoeba.en.de | 47.3 | 0.664 |
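
The BLEU and chr-F figures above are corpus-level scores on the linked test sets. As an illustration of how such numbers are typically computed, here is a minimal sketch using the `sacrebleu` package; the file names are hypothetical placeholders, and the exact options (tokenization, chrF scale) used for the official OPUS-MT scores may differ.

```python
# Hedged sketch: corpus-level BLEU and chrF with sacrebleu.
# "hyps.de" and "refs.de" are hypothetical file names, one sentence per line.
import sacrebleu

with open("hyps.de", encoding="utf-8") as f:
    hyps = [line.rstrip("\n") for line in f]
with open("refs.de", encoding="utf-8") as f:
    refs = [line.rstrip("\n") for line in f]

bleu = sacrebleu.corpus_bleu(hyps, [refs])  # corpus BLEU
chrf = sacrebleu.corpus_chrf(hyps, [refs])  # corpus chrF
# Note: the chrF scale (0-1 vs. 0-100) depends on the sacrebleu version.
print(f"BLEU: {bleu.score:.1f}  chr-F: {chrf.score:.3f}")
```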