Paper Title

Yseop at SemEval-2020 Task 5: Cascaded BERT Language Model for Counterfactual Statement Analysis

Authors

Hanna Abi Akl, Dominique Mariko, Estelle Labidurie

Abstract

In this paper, we explore strategies to detect and evaluate counterfactual sentences. We describe our system for SemEval-2020 Task 5: Modeling Causal Reasoning in Language: Detecting Counterfactuals. We use a BERT base model for the classification task and build a hybrid BERT Multi-Layer Perceptron system to handle the sequence identification task. Our experiments show that while introducing syntactic and semantic features does little in improving the system in the classification task, using these types of features as cascaded linear inputs to fine-tune the sequence-delimiting ability of the model ensures it outperforms other similar-purpose complex systems like BiLSTM-CRF in the second task. Our system achieves an F1 score of 85.00% in Task 1 and 83.90% in Task 2.
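The abstract's key design point is feeding hand-crafted syntactic/semantic features as cascaded linear inputs alongside BERT representations into a Multi-Layer Perceptron head. A minimal numpy sketch of that cascading step is shown below; the dimensions (768-d embeddings, 10 extra features, 3 output labels), the random stand-ins for BERT output, and the `mlp_head` helper are all illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for real model outputs (hypothetical shapes):
bert_emb = rng.normal(size=(4, 768))    # batch of 4 BERT [CLS] embeddings
extra_feats = rng.normal(size=(4, 10))  # hand-crafted syntactic/semantic features

def mlp_head(x, hidden=64, n_labels=3, seed=1):
    """Two-layer perceptron head applied to the cascaded input (sketch only)."""
    r = np.random.default_rng(seed)
    w1 = r.normal(scale=0.02, size=(x.shape[1], hidden))
    w2 = r.normal(scale=0.02, size=(hidden, n_labels))
    h = np.maximum(x @ w1, 0)  # ReLU hidden layer
    logits = h @ w2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)  # per-token/sentence label probabilities

# "Cascading" here means concatenating the extra features as additional
# linear inputs to the MLP, rather than altering the BERT encoder itself.
cascaded = np.concatenate([bert_emb, extra_feats], axis=1)
probs = mlp_head(cascaded)
print(probs.shape)  # (4, 3)
```

In the real system a fine-tuned BERT encoder would produce the embeddings and the MLP would be trained jointly; this sketch only illustrates the feature-cascading interface.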
