International Conference on Computational Linguistics

Learning to Prune Dependency Trees with Rethinking for Neural Relation Extraction


Abstract

Dependency trees have been shown to be effective in capturing long-range relations between target entities. Nevertheless, how to selectively emphasize target-relevant information and remove irrelevant content from the tree is still an open problem. Existing approaches that employ predefined rules to eliminate noise may not always yield optimal results, given the complexity and variability of natural language. In this paper, we present a novel architecture named Dynamically Pruned Graph Convolutional Network (DP-GCN), which learns to prune the dependency tree with rethinking in an end-to-end scheme. In each layer of DP-GCN, a selection module uses a set of binary gates to concentrate on nodes expressing the target relation, and the pruned tree is then augmented with a pruned semantic graph to ensure connectivity. After that, we introduce a rethinking mechanism that guides and refines the pruning operation by repeatedly feeding back the high-level learned features. Extensive experimental results demonstrate that our model achieves impressive performance compared to strong competitors.
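To make the gating idea concrete, the sketch below shows one plausible form of a gated graph-convolution layer: each node receives a sigmoid relevance score, scores are thresholded into binary keep/drop gates, and dropped nodes are cut out of the adjacency so they no longer propagate messages. This is only an illustrative sketch with assumed shapes and a hand-rolled scoring function (`gate_nodes`, `w_g`, and the hard threshold are hypothetical); the paper's actual model learns the gates end-to-end and additionally applies the rethinking feedback, which is not reproduced here.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: degree-normalized neighborhood aggregation."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # inverse degree matrix
    return np.tanh(D_inv @ A_hat @ H @ W)      # aggregate, project, activate

def gate_nodes(H, w_g, threshold=0.5):
    """Score each node and derive binary keep/drop gates (hypothetical scoring)."""
    scores = 1.0 / (1.0 + np.exp(-(H @ w_g)))  # sigmoid relevance score per node
    return (scores > threshold).astype(float)  # hard 0/1 gates

def pruned_gcn_layer(A, H, W, w_g):
    """Mask the adjacency with node gates so pruned nodes stop exchanging messages."""
    g = gate_nodes(H, w_g)
    A_pruned = A * np.outer(g, g)              # keep edges only between retained nodes
    return gcn_layer(A_pruned, H, W), g
```

A gated node still keeps its self-loop here, so its own representation survives even when all of its edges are pruned; a trainable version would replace the hard threshold with a differentiable relaxation (e.g. a straight-through or Gumbel-style estimator) so the gates can be learned end-to-end.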
