Top-down Tree Long Short-Term Memory Networks

Abstract

Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks. In this paper we develop Tree Long Short-Term Memory (TreeLSTM), a neural network model based on LSTM, which is designed to predict a tree rather than a linear sequence. TreeLSTM defines the probability of a sentence by estimating the generation probability of its dependency tree. At each time step, a node is generated based on the representation of the generated subtree. We further enhance the modeling power of TreeLSTM by explicitly representing the correlations between left and right dependents. Application of our model to the MSR sentence completion challenge achieves results beyond the current state of the art. We also report competitive results on dependency parsing reranking.
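As a rough sketch of the factorization the abstract describes (the notation below is illustrative and not taken from the paper), the probability of a sentence S is identified with the generation probability of its dependency tree T(S), produced top-down one node at a time:

P(S) \;=\; P\big(T(S)\big) \;=\; \prod_{t=1}^{|T(S)|} P\big(w_t \mid \mathbf{h}_{t-1}\big),

where \mathbf{h}_{t-1} stands for an LSTM-based representation of the subtree generated before step t, and the left and right dependents of each head would be modeled by separate but explicitly correlated components of that representation.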