Paper Title

Pretrained Language Encoders are Natural Tagging Frameworks for Aspect Sentiment Triplet Extraction

Paper Authors

Yanjie Gou, Yinjie Lei, Lingqiao Liu, Yong Dai, Chunxu Shen, Yongqi Tong

Paper Abstract

Aspect Sentiment Triplet Extraction (ASTE) aims to extract the spans of aspects and opinions together with their sentiment relations as sentiment triplets. Existing works usually formulate span detection as a 1D token tagging problem and model sentiment recognition with a 2D tagging matrix over token pairs. Moreover, by leveraging the token representations of Pretrained Language Encoders (PLEs) like BERT, they can achieve better performance. However, they simply use PLEs as feature extractors to build their modules and never take a deep look at what specific knowledge PLEs contain. In this paper, we argue that instead of further designing modules to capture the inductive bias of ASTE, PLEs themselves contain "enough" features for both 1D and 2D tagging: (1) The token representation contains the contextualized meaning of the token itself, so this level of feature carries the necessary information for 1D tagging. (2) The attention matrices of different PLE layers further capture the multi-level linguistic knowledge that exists in token pairs, which benefits 2D tagging. (3) Furthermore, with simple transformations, these two features can be easily converted into a 2D tagging matrix and a 1D tagging sequence, respectively, which further boosts the tagging results. By doing so, PLEs become natural tagging frameworks and achieve a new state of the art, as verified by extensive experiments and in-depth analyses.
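
To make the abstract's core idea concrete, below is a minimal sketch (not the authors' released code) of how a pretrained encoder's two feature levels map onto the two tagging problems: the token representations feed a 1D tagging head, and the attention matrices from every layer and head are stacked into token-pair features for a 2D tagging head. The tag-set sizes and the plain linear heads are illustrative assumptions, not the paper's exact design.

# Minimal sketch: a PLE as a tagging framework for ASTE.
# The tag counts and linear heads below are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

NUM_1D_TAGS = 5   # e.g., BIO-style aspect/opinion span tags (assumed)
NUM_2D_TAGS = 4   # e.g., sentiment relation classes for token pairs (assumed)

hidden_size = encoder.config.hidden_size  # 768 for bert-base
pair_feat_dim = (encoder.config.num_hidden_layers
                 * encoder.config.num_attention_heads)  # 12 * 12 = 144

tagger_1d = nn.Linear(hidden_size, NUM_1D_TAGS)    # span detection head
tagger_2d = nn.Linear(pair_feat_dim, NUM_2D_TAGS)  # token-pair relation head

inputs = tokenizer("The pizza was great but the service was slow.",
                   return_tensors="pt")
outputs = encoder(**inputs)

# (1) Token representations -> 1D tagging sequence.
token_reprs = outputs.last_hidden_state        # [batch, seq, hidden]
logits_1d = tagger_1d(token_reprs)             # [batch, seq, NUM_1D_TAGS]

# (2) Attention matrices from every layer and head -> 2D tagging matrix.
#     outputs.attentions is a tuple of [batch, heads, seq, seq] tensors,
#     one per layer; stack them into per-token-pair feature vectors.
attn = torch.stack(outputs.attentions, dim=1)  # [batch, layers, heads, seq, seq]
b, l, h, s, _ = attn.shape
pair_feats = attn.permute(0, 3, 4, 1, 2).reshape(b, s, s, l * h)
logits_2d = tagger_2d(pair_feats)              # [batch, seq, seq, NUM_2D_TAGS]

print(logits_1d.shape, logits_2d.shape)

The point of the sketch is that no extra ASTE-specific encoder module is introduced: both tagging heads read features the PLE already computes. The cross-feature transformations the abstract mentions in point (3) would add simple mappings between the two feature levels on top of this skeleton.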
