International Conference on Artificial Neural Networks

Dynamic Cortex Memory: Enhancing Recurrent Neural Networks for Gradient-Based Sequence Learning



Abstract

In this paper a novel recurrent neural network (RNN) model for gradient-based sequence learning is introduced. The presented dynamic cortex memory (DCM) is an extension of the well-known long short-term memory (LSTM) model. The main innovation of the DCM is the enhancement of the inner interplay between the gates and the error carousel through several new, trainable connections. These connections enable a direct signal transfer from one gate to another. With this novel enhancement the networks converge faster during training with back-propagation through time (BPTT) than LSTMs under the same training conditions. Furthermore, DCMs yield better generalization results than LSTMs. This behaviour is shown for different supervised problem scenarios, including storing precise values, addition, and learning a context-sensitive grammar.
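The core idea, direct trainable connections between the gates of an LSTM cell, can be illustrated with a minimal single-step sketch. This is an assumption-laden reconstruction, not the paper's exact formulation: the connection topology, the choice of diagonal (per-unit) gate-to-gate weights, and all parameter names (`w_fi`, `w_oi`, etc.) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dcm_cell_step(x, h_prev, c_prev, gates_prev, p):
    """One step of a DCM-style cell: a standard LSTM step plus
    hypothetical trainable gate-to-gate connections (the paper's exact
    wiring may differ). `gates_prev` holds the (input, forget, output)
    gate activations from the previous time step."""
    i_prev, f_prev, o_prev = gates_prev

    # Standard LSTM gate pre-activations from input and recurrent state.
    i_pre = p["Wi"] @ x + p["Ri"] @ h_prev + p["bi"]
    f_pre = p["Wf"] @ x + p["Rf"] @ h_prev + p["bf"]
    o_pre = p["Wo"] @ x + p["Ro"] @ h_prev + p["bo"]

    # DCM-style addition: direct signal transfer between gates via
    # trainable per-unit (diagonal) weights from the previous gates.
    i_pre += p["w_fi"] * f_prev + p["w_oi"] * o_prev
    f_pre += p["w_if"] * i_prev + p["w_of"] * o_prev
    o_pre += p["w_io"] * i_prev + p["w_fo"] * f_prev

    i, f, o = sigmoid(i_pre), sigmoid(f_pre), sigmoid(o_pre)
    z = np.tanh(p["Wz"] @ x + p["Rz"] @ h_prev + p["bz"])

    c = f * c_prev + i * z       # the "error carousel" cell state
    h = o * np.tanh(c)
    return h, c, (i, f, o)
```

Because the extra connections are ordinary multiplicative weights, gradients flow through them under BPTT just like any other LSTM parameter; only the gate activations from the previous step need to be carried along with `h` and `c`.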

