Journal: Evolutionary Computation

Neuroevolution of a Modular Memory-Augmented Neural Network for Deep Memory Problems



Abstract

We present Modular Memory Units (MMUs), a new class of memory-augmented neural network. The MMU builds on the gated neural architectures of Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM) networks to incorporate an external memory block, similar to a Neural Turing Machine (NTM). The MMU interacts with the memory block using independent read and write gates that serve to decouple the memory from the central feedforward operation. This allows for regimented memory access and update, giving our network the ability to choose when to read from memory, update it, or simply ignore it. This capacity to act in detachment allows the network to shield the memory from noise and other distractions, while simultaneously using it to effectively retain and propagate information over an extended period of time. We train the MMU using both neuroevolution and gradient descent, and perform experiments on two deep-memory benchmarks. Results demonstrate that the MMU performs significantly faster and more accurately than traditional LSTM-based methods, and is robust to dramatic increases in the sequence depth of these memory benchmarks.
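The independent read and write gates described in the abstract can be sketched as follows. This is a minimal illustrative cell, not the paper's exact formulation: the single-vector memory, the weight names, and the convex-combination write rule are assumptions made for illustration. The key property it demonstrates is that when the write gate saturates near zero the memory block is left untouched, and when the read gate saturates near zero the feedforward path ignores memory entirely.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ModularMemoryCell:
    """Illustrative gated external-memory cell (hypothetical, simplified).

    A single memory vector m is accessed through independent sigmoid
    read and write gates, decoupling memory from the feedforward path.
    """

    def __init__(self, input_size, hidden_size, memory_size, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        def W(rows, cols):
            return rng.normal(0.0, 0.1, size=(rows, cols))
        self.Wr = W(memory_size, input_size + hidden_size)  # read-gate weights
        self.Ww = W(memory_size, input_size + hidden_size)  # write-gate weights
        self.Wh = W(hidden_size, input_size + memory_size)  # feedforward weights
        self.Wc = W(memory_size, hidden_size)               # candidate-content weights
        self.m = np.zeros(memory_size)                      # external memory block

    def step(self, x, h_prev):
        xh = np.concatenate([x, h_prev])
        r = sigmoid(self.Wr @ xh)   # read gate: how much memory enters the computation
        w = sigmoid(self.Ww @ xh)   # write gate: how much memory is overwritten
        # Feedforward path sees only the gated read of memory (r ~ 0 ignores it).
        h = np.tanh(self.Wh @ np.concatenate([x, r * self.m]))
        c = np.tanh(self.Wc @ h)    # candidate memory content
        # Gated write: w ~ 0 shields the memory from this step entirely.
        self.m = (1.0 - w) * self.m + w * c
        return h
```

Because memory updates pass only through the write gate, noisy or irrelevant inputs can be shut out while stored information is retained across long sequences, which is the decoupling the abstract attributes to the MMU.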
