
Top-down Tree Long Short-Term Memory Networks

Abstract

Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks. In this paper we develop Tree Long Short-Term Memory (TreeLSTM), a neural network model based on LSTM, which is designed to predict a tree rather than a linear sequence. TreeLSTM defines the probability of a sentence by estimating the generation probability of its dependency tree. At each time step, a node is generated based on the representation of the generated subtree. We further enhance the modeling power of TreeLSTM by explicitly representing the correlations between left and right dependents. Application of our model to the MSR sentence completion challenge achieves results beyond the current state of the art. We also report results on dependency parsing reranking achieving competitive performance.
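To make the generative scheme described above concrete, here is a minimal PyTorch sketch, not the authors' implementation, of scoring a dependency tree top-down: each dependent is predicted from an LSTM state that summarizes the subtree generated so far, with separate recurrences for the left and right dependent sequences as a stand-in for the paper's explicit modeling of left/right correlations. The class and function names, hyperparameters, and the toy tree are illustrative assumptions; the actual model's gating and state composition differ in detail.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TreeLSTMSketch(nn.Module):
    """Scores a sentence as the generation probability of its dependency tree (illustrative sketch)."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Separate recurrences for the left and right dependents of a head.
        self.left_cell = nn.LSTMCell(embed_dim, hidden_dim)
        self.right_cell = nn.LSTMCell(embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def _side_log_prob(self, deps, cell, state):
        # Sum log P(dependent | subtree generated so far) along one side of a head.
        total = torch.zeros(())
        h, c = state
        for word, subtree in deps:
            total = total + F.log_softmax(self.out(h), dim=-1)[0, word]
            # Feed the generated dependent back in, then descend into its own subtree.
            h, c = cell(self.embed(torch.tensor([word])), (h, c))
            total = total + self.tree_log_prob(word, subtree)
        return total

    def tree_log_prob(self, head, subtree):
        # subtree = (left_deps, right_deps); each side is a list of (word_id, subtree) pairs.
        left_deps, right_deps = subtree
        x = self.embed(torch.tensor([head]))
        zeros = torch.zeros(1, self.left_cell.hidden_size)
        left_state = self.left_cell(x, (zeros, zeros))
        right_state = self.right_cell(x, (zeros, zeros))
        return (self._side_log_prob(left_deps, self.left_cell, left_state)
                + self._side_log_prob(right_deps, self.right_cell, right_state))

# Toy usage: a root (id 5) with one left dependent (id 3) and one right dependent (id 7).
model = TreeLSTMSketch(vocab_size=100)
tree = ([(3, ([], []))], [(7, ([], []))])
print(model.tree_log_prob(5, tree).item())
```

In this sketch the sentence probability is the product of per-node generation probabilities, so a sentence completion candidate can be ranked by its tree's total log probability, which is the kind of scoring the abstract describes for the MSR challenge and for parse reranking.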
