
Abstractive Sentence Summarization with Attentive Recurrent Neural Networks



Abstract

Abstractive Sentence Summarization generates a shorter version of a given sentence while attempting to preserve its meaning. We introduce a conditional recurrent neural network (RNN) which generates a summary of an input sentence. The conditioning is provided by a novel convolutional attention-based encoder which ensures that the decoder focuses on the appropriate input words at each step of generation. Our model relies only on learned features and is easy to train in an end-to-end fashion on large data sets. Our experiments show that the model significantly outperforms the recently proposed state-of-the-art method on the Gigaword corpus while performing competitively on the DUC-2004 shared task.
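The abstract describes a decoder that, at each generation step, attends over the encoder's representations of the input words and conditions on a weighted summary of them. A minimal NumPy sketch of one such attention step is below. This is illustrative only: it uses simple dot-product scoring over generic per-word encoder states, not the paper's convolutional attention-based encoder, and all names are placeholders.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_context(enc_states, dec_hidden):
    """One attention step of an attentive encoder-decoder (sketch).

    enc_states: (T, d) array, one row per input word.
    dec_hidden: (d,) current decoder hidden state.

    Scores each encoder state against the decoder state (dot product,
    for simplicity), normalises the scores into a distribution over
    input words, and returns the weighted context vector the decoder
    would condition on when emitting the next summary word.
    """
    scores = enc_states @ dec_hidden      # (T,) one score per input word
    weights = softmax(scores)             # attention distribution over inputs
    context = weights @ enc_states        # (d,) weighted sum of encoder states
    return context, weights

# Usage: 5 input words with 8-dimensional encoder states.
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))
h = rng.normal(size=(8,))
ctx, w = attention_context(enc, h)
```

The attention weights let the decoder focus on different input words at each step; in the paper this conditioning signal, together with the previous output word, drives the RNN that generates the summary.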


