Paper Title


Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement

Paper Authors

Alireza Mohammadshahi, James Henderson

Abstract


We propose the Recursive Non-autoregressive Graph-to-Graph Transformer architecture (RNGTr) for the iterative refinement of arbitrary graphs through the recursive application of a non-autoregressive Graph-to-Graph Transformer and apply it to syntactic dependency parsing. We demonstrate the power and effectiveness of RNGTr on several dependency corpora, using a refinement model pre-trained with BERT. We also introduce Syntactic Transformer (SynTr), a non-recursive parser similar to our refinement model. RNGTr can improve the accuracy of a variety of initial parsers on 13 languages from the Universal Dependencies Treebanks, English and Chinese Penn Treebanks, and the German CoNLL2009 corpus, even improving over the new state-of-the-art results achieved by SynTr, significantly improving the state-of-the-art for all corpora tested.
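The core idea of the abstract, recursively applying a non-autoregressive refiner to a dependency graph until it stops changing, can be illustrated with a minimal sketch. The function and variable names below (`refine_until_converged`, `toy_refiner`) are hypothetical illustrations, not the authors' code; the real refiner is a BERT-initialized Graph-to-Graph Transformer rather than the toy rule used here.

```python
# Minimal sketch of RNGTr-style iterative refinement (hypothetical names).
# A refinement function is applied repeatedly to a dependency graph until
# it reaches a fixed point or a step budget is exhausted.
def refine_until_converged(initial_graph, refine, max_steps=3):
    """Recursively apply a non-autoregressive refiner to a graph."""
    graph = initial_graph
    for _ in range(max_steps):
        new_graph = refine(graph)  # predicts all arcs at once (non-autoregressive)
        if new_graph == graph:     # converged: no edge changed
            break
        graph = new_graph
    return graph

# Toy refiner standing in for the Transformer: heads[i] is the head of
# token i+1; any out-of-range head is repointed to the root (index 0).
def toy_refiner(heads):
    return tuple(h if h <= len(heads) else 0 for h in heads)

print(refine_until_converged((2, 99, 0), toy_refiner))  # → (2, 0, 0)
```

A real instantiation would replace `toy_refiner` with a full parser whose predictions at step *t* condition on the graph produced at step *t − 1*, which is what lets a non-autoregressive model correct its own earlier mistakes.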
