Annual Meeting of the Association for Computational Linguistics

Attention Is (not) All You Need for Commonsense Reasoning

Abstract

The recently introduced BERT model exhibits strong performance on several language understanding benchmarks. In this paper, we describe a simple re-implementation of BERT for commonsense reasoning. We show that the attentions produced by BERT can be directly utilized for tasks such as the Pronoun Disambiguation Problem and Winograd Schema Challenge. Our proposed attention-guided commonsense reasoning method is conceptually simple yet empirically powerful. Experimental analysis on multiple datasets demonstrates that our proposed system performs remarkably well on all cases while outperforming the previously reported state of the art by a margin. While results suggest that BERT seems to implicitly learn to establish complex relationships between entities, solving commonsense reasoning tasks might require more than unsupervised models learned from huge text corpora.
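The method the abstract describes scores answer candidates by the attention BERT assigns from the ambiguous pronoun to each candidate noun. The following is a minimal sketch of that idea, not the authors' exact computation: the model name, the aggregation rule, and the attention_score helper are illustrative assumptions, and it relies on the Hugging Face transformers library.

import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

def attention_score(sentence, pronoun, candidate):
    # Attention flowing from the pronoun token to the candidate's tokens,
    # summed over the candidate's word pieces, then maximized over all
    # layers and heads. This is a simplification, not the paper's exact
    # Maximum Attention Score (MAS).
    enc = tokenizer(sentence, return_tensors="pt")
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    with torch.no_grad():
        out = model(**enc)
    # out.attentions: one (batch, heads, seq, seq) tensor per layer
    att = torch.stack(out.attentions).squeeze(1)  # (layers, heads, seq, seq)
    p_idx = tokens.index(pronoun.lower())
    cand_pieces = set(tokenizer.tokenize(candidate))
    cand_idx = [i for i, t in enumerate(tokens) if t in cand_pieces]
    return att[:, :, p_idx, cand_idx].sum(dim=-1).max().item()

# Winograd-style example: which noun does "it" refer to?
sentence = "The trophy doesn't fit in the suitcase because it is too big."
for candidate in ["trophy", "suitcase"]:
    print(candidate, attention_score(sentence, "it", candidate))

Under this heuristic, the candidate receiving the larger attention mass is taken as the referent; the system reported in the paper additionally masks and normalizes the attention maps across candidates before comparing them.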