Abstractive Sentence Summarization with Attentive Recurrent Neural Networks

Abstract

Abstractive Sentence Summarization generates a shorter version of a given sentence while attempting to preserve its meaning. We introduce a conditional recurrent neural network (RNN) which generates a summary of an input sentence. The conditioning is provided by a novel convolutional attention-based encoder which ensures that the decoder focuses on the appropriate input words at each step of generation. Our model relies only on learned features and is easy to train in an end-to-end fashion on large data sets. Our experiments show that the model significantly outperforms the recently proposed state-of-the-art method on the Giga-word corpus while performing competitively on the DUC-2004 shared task.
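The core mechanism the abstract describes is a decoder that, at each generation step, attends over convolutional features of the input sentence and uses the resulting context to pick the next word. A minimal numpy sketch of one such attention step is shown below; the function name, the single 1-D filter, and the random parameters are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_attention_context(x, h_dec, conv_w):
    """One attention step of a (hypothetical) convolutional attention encoder.

    x:      (T, d) input word embeddings
    h_dec:  (d,)   current decoder hidden state
    conv_w: (k,)   1-D convolution filter over positions (odd width k)
    """
    T, d = x.shape
    k = conv_w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))  # zero-pad the time axis
    # Convolutional features: each position mixes a window of neighbors,
    # so attention scores depend on local context, not a single word.
    z = np.stack([(conv_w[:, None] * xp[t:t + k]).sum(axis=0) for t in range(T)])
    scores = z @ h_dec                     # (T,) alignment scores
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                   # softmax attention weights
    context = alpha @ x                    # weighted sum of input embeddings
    return context, alpha

# Toy usage: 6 input words, embedding size 8, filter width 3.
x = rng.normal(size=(6, 8))
h = rng.normal(size=8)
w = rng.normal(size=3)
ctx, alpha = conv_attention_context(x, h, w)
print(alpha.sum())  # attention weights form a distribution over input words
```

The decoder would feed `ctx` into its RNN cell to condition the next-word distribution; because every operation here is differentiable, the whole encoder-decoder can be trained end-to-end, as the abstract notes.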
