
Hierarchical Transformers for Multi-Document Summarization


Abstract

In this paper, we develop a neural summarization model which can effectively process multiple input documents and distill abstractive summaries. Our model augments a previously proposed Transformer architecture (Liu et al., 2018) with the ability to encode documents in a hierarchical manner. We represent cross-document relationships via an attention mechanism which allows information to be shared across documents, as opposed to simply concatenating text spans and processing them as a flat sequence. Our model learns latent dependencies among textual units, but can also take advantage of explicit graph representations focusing on similarity or discourse relations. Empirical results on the WikiSum dataset demonstrate that the proposed architecture brings substantial improvements over several strong baselines.
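
The hierarchical encoding sketched in the abstract can be illustrated as follows: each paragraph is first encoded locally at the token level, the resulting paragraph representations attend to one another so that information is shared across documents, and the shared context is fed back to the token level. The snippet below is a minimal, hypothetical PyTorch sketch of this idea, not the authors' implementation; the class name HierarchicalEncoder, the mean-pooling step, and all hyperparameters are assumptions made purely for illustration.

# Hypothetical sketch of hierarchical multi-document encoding (not the paper's code).
import torch
import torch.nn as nn


class HierarchicalEncoder(nn.Module):
    """Encode each paragraph locally, then let pooled paragraph vectors
    attend to one another so information is shared across documents."""

    def __init__(self, vocab_size=32000, d_model=256, nhead=8,
                 num_local_layers=2, num_global_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        local_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.local_encoder = nn.TransformerEncoder(local_layer, num_local_layers)
        global_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.global_encoder = nn.TransformerEncoder(global_layer, num_global_layers)

    def forward(self, token_ids):
        # token_ids: (num_paragraphs, tokens_per_paragraph)
        x = self.embed(token_ids)                 # (P, T, d_model)
        local = self.local_encoder(x)             # token-level encoding within each paragraph
        para = local.mean(dim=1)                  # (P, d_model) pooled paragraph vectors (assumed pooling)
        shared = self.global_encoder(para.unsqueeze(0)).squeeze(0)  # cross-paragraph attention
        # Broadcast the shared, cross-document context back to every token.
        return local + shared.unsqueeze(1)


if __name__ == "__main__":
    enc = HierarchicalEncoder()
    paragraphs = torch.randint(0, 32000, (4, 50))  # 4 paragraphs of 50 tokens each
    print(enc(paragraphs).shape)                   # torch.Size([4, 50, 256])

The two-level design is the point of the sketch: the flat alternative would concatenate all paragraphs into one long sequence, whereas here cross-document interaction happens only through the paragraph-level attention layer.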
