Venue: Annual Meeting of the Association for Computational Linguistics

Hierarchical Transformers for Multi-Document Summarization


Abstract

In this paper, we develop a neural summarization model which can effectively process multiple input documents and distill abstractive summaries. Our model augments a previously proposed Transformer architecture (Liu et al., 2018) with the ability to encode documents in a hierarchical manner. We represent cross-document relationships via an attention mechanism which allows to share information as opposed to simply concatenating text spans and processing them as a flat sequence. Our model learns latent dependencies among textual units, but can also take advantage of explicit graph representations focusing on similarity or discourse relations. Empirical results on the WikiSum dataset demonstrate that the proposed architecture brings substantial improvements over several strong baselines.
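The following is a minimal, illustrative sketch (not the authors' released code) of the hierarchical encoding idea described in the abstract: a token-level Transformer encodes each paragraph of each source document independently, and a second, paragraph-level attention layer lets the pooled paragraph representations exchange information across documents rather than processing one flat concatenated sequence. All module names, hyper-parameters, and the mean-pooling step are assumptions made for illustration.

```python
# Illustrative sketch of hierarchical (two-level) encoding for multi-document input.
# Local self-attention runs within each paragraph; a global attention layer over
# pooled paragraph vectors models cross-document relationships.
# Hypothetical names and sizes; not the architecture's exact configuration.

import torch
import torch.nn as nn


class HierarchicalEncoder(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 256, n_heads: int = 8,
                 n_local_layers: int = 4, n_global_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Local (token-level) Transformer, applied to each paragraph independently.
        local_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.local_encoder = nn.TransformerEncoder(local_layer, n_local_layers)
        # Global (paragraph-level) Transformer: attention over one pooled vector
        # per paragraph, allowing paragraphs from different documents to interact.
        global_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.global_encoder = nn.TransformerEncoder(global_layer, n_global_layers)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, n_paragraphs, n_tokens) of token ids drawn from
        # several input documents.
        b, p, t = tokens.shape
        x = self.embed(tokens).view(b * p, t, -1)   # treat each paragraph separately
        x = self.local_encoder(x)                   # token-level encoding
        para = x.mean(dim=1).view(b, p, -1)         # pool each paragraph to one vector
        para = self.global_encoder(para)            # cross-document attention
        return para                                 # (batch, n_paragraphs, d_model)


if __name__ == "__main__":
    enc = HierarchicalEncoder(vocab_size=1000)
    dummy = torch.randint(0, 1000, (2, 6, 40))      # 2 examples, 6 paragraphs, 40 tokens each
    print(enc(dummy).shape)                         # torch.Size([2, 6, 256])
```

In this sketch, the paragraph-level representations produced by the global encoder would feed a decoder that generates the abstractive summary; explicit graph structure (e.g. similarity or discourse relations between paragraphs) could be injected as an attention bias at that second level.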
