Paper Title
Bi-SimCut: A Simple Strategy for Boosting Neural Machine Translation
Paper Authors
Paper Abstract
We introduce Bi-SimCut: a simple but effective training strategy to boost neural machine translation (NMT) performance. It consists of two procedures: bidirectional pretraining and unidirectional finetuning. Both procedures utilize SimCut, a simple regularization method that enforces consistency between the output distributions of the original and the cutoff sentence pairs. Without leveraging extra datasets via back-translation or integrating large-scale pretrained models, Bi-SimCut achieves strong translation performance across five translation benchmarks (data sizes ranging from 160K to 20.2M): BLEU scores of 31.16 for en -> de and 38.37 for de -> en on the IWSLT14 dataset, 30.78 for en -> de and 35.15 for de -> en on the WMT14 dataset, and 27.17 for zh -> en on the WMT17 dataset. SimCut is not a new method, but a version of Cutoff (Shen et al., 2020) simplified and adapted for NMT, and it can be considered a perturbation-based method. Given the universality and simplicity of SimCut and Bi-SimCut, we believe they can serve as strong baselines for future NMT research.
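To make the consistency idea in the abstract concrete, below is a minimal PyTorch-style sketch of a SimCut-like loss: a second forward pass is run on cutoff-perturbed source and target embeddings, and a KL term pulls the perturbed output distribution toward the original one. This is an illustration of the general technique, not the paper's reference implementation; `model`, its calling convention, `alpha`, and the cutoff probability `p` are hypothetical placeholders, and the exact loss composition and hyperparameters in the paper may differ.

```python
import torch
import torch.nn.functional as F

def token_cutoff(emb: torch.Tensor, p: float = 0.05) -> torch.Tensor:
    """Zero out each token's embedding independently with probability p
    (the token-level flavor of Cutoff; span/feature cutoff are analogous)."""
    keep = (torch.rand(emb.shape[:2], device=emb.device) > p).float()
    return emb * keep.unsqueeze(-1)

def simcut_like_loss(model, src_emb, tgt_emb, labels, alpha=3.0, p=0.05):
    """Cross-entropy on the original sentence pair plus a KL consistency
    term between the original and cutoff output distributions (sketch)."""
    logits = model(src_emb, tgt_emb)                # (batch, tgt_len, vocab)
    logits_cut = model(token_cutoff(src_emb, p),    # second forward pass on
                       token_cutoff(tgt_emb, p))    # the perturbed pair
    ce = F.cross_entropy(logits.transpose(1, 2), labels)
    # KL(p_orig || p_cut); the original branch is detached here so only the
    # perturbed branch is regularized toward it (one possible design choice).
    kl = F.kl_div(F.log_softmax(logits_cut, dim=-1),
                  F.softmax(logits, dim=-1).detach(),
                  reduction="batchmean")
    return ce + alpha * kl
```

Under Bi-SimCut, a loss of this shape would first be applied while pretraining a single bidirectional model (training on both en -> de and de -> en pairs), and then again while finetuning that model in the single target direction.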