Conference of the European Chapter of the Association for Computational Linguistics

ENPAR: Enhancing Entity and Entity Pair Representations for Joint Entity Relation Extraction

Abstract

Current state-of-the-art systems for joint entity relation extraction (Luan et al., 2019; Wadden et al., 2019) usually adopt the multi-task learning framework. However, annotations for the additional tasks, such as coreference resolution and event extraction, are equally hard (or even harder) to obtain. In this work, we propose a pre-training method, ENPAR, to improve joint extraction performance. ENPAR requires only additional entity annotations, which are much easier to collect. Unlike most existing work, which only considers incorporating entity information into the sentence encoder, we further utilize entity pair information. Specifically, we devise four novel objectives, i.e., masked entity typing, masked entity prediction, adversarial context discrimination, and permutation prediction, to pre-train an entity encoder and an entity pair encoder. Comprehensive experiments show that the proposed pre-training method achieves significant improvement over BERT on ACE05, SciERC, and NYT, and outperforms the current state of the art on ACE05.
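
To make one of these objectives concrete, here is a minimal, hypothetical sketch of masked entity typing: the entity mention is masked in the input, the sentence is encoded, the masked span is pooled, and a classifier predicts the entity's type. The encoder, vocabulary size, dimensions, and all names below are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class MaskedEntityTyping(nn.Module):
    """Toy masked-entity-typing objective (hypothetical, not ENPAR's code)."""

    def __init__(self, vocab_size=30522, hidden_size=256, num_entity_types=7):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        # Stand-in for a BERT-style sentence encoder (assumption).
        layer = nn.TransformerEncoderLayer(d_model=hidden_size, nhead=8,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.type_head = nn.Linear(hidden_size, num_entity_types)
        self.loss_fn = nn.CrossEntropyLoss()

    def forward(self, input_ids, spans, type_labels):
        # Entity tokens are assumed to be replaced by [MASK] ids upstream.
        hidden = self.encoder(self.embed(input_ids))               # (B, T, H)
        # Mean-pool each masked entity span into a single entity vector.
        pooled = torch.stack([hidden[b, s:e + 1].mean(dim=0)
                              for b, (s, e) in enumerate(spans)])  # (B, H)
        logits = self.type_head(pooled)                            # (B, types)
        return self.loss_fn(logits, type_labels)

# Toy usage: a batch of 2 sentences with one masked entity span each.
model = MaskedEntityTyping()
input_ids = torch.randint(0, 30522, (2, 16))
loss = model(input_ids, spans=[(3, 5), (7, 9)],
             type_labels=torch.tensor([2, 5]))
loss.backward()
```

Because the typing head sees only the pooled masked span, the encoder is pushed to infer the entity's type from its surrounding context, which is the intuition behind pre-training an entity encoder with entity annotations alone.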
