Paper title
Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer
Paper authors
Paper abstract
We exploit the pre-trained seq2seq model mBART for multilingual text style transfer. Using machine-translated data as well as gold-aligned English sentences yields state-of-the-art results in the three target languages we consider. Moreover, in view of the general scarcity of parallel data, we propose a modular approach for multilingual formality transfer, consisting of two training strategies that target adaptation to both language and task. Our approach achieves competitive performance without monolingual task-specific parallel data and can be applied to other style transfer tasks as well as to other languages.