
Gated Orthogonal Recurrent Units: On Learning to Forget

Abstract

We present a novel recurrent neural network (RNN)-based model that combines the remembering ability of unitary evolution RNNs with the ability of gated RNNs to effectively forget redundant or irrelevant information in its memory. We achieve this by extending restricted orthogonal evolution RNNs with a gating mechanism similar to gated recurrent unit RNNs with a reset gate and an update gate. Our model is able to outperform long short-term memory, gated recurrent units, and vanilla unitary or orthogonal RNNs on several long-term-dependency benchmark tasks. We empirically show that both orthogonal and unitary RNNs lack the ability to forget. This ability plays an important role in RNNs. We provide competitive results along with an analysis of our model on many natural sequential tasks, including question answering, speech spectrum prediction, character-level language modeling, and synthetic tasks that involve long-term dependencies such as algorithmic, denoising, and copying tasks.
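To make the gating scheme concrete, below is a minimal PyTorch sketch of a GORU-style cell. The reset gate, update gate, and modReLU-style candidate state follow the description in the abstract; the orthogonal parameterization used here (matrix exponential of a skew-symmetric parameter) and the names GORUCell, orthogonal_U, and modrelu are illustrative assumptions, not the paper's exact construction.

    import torch
    import torch.nn as nn

    class GORUCell(nn.Module):
        """Illustrative sketch of a gated orthogonal recurrent unit cell."""

        def __init__(self, input_size, hidden_size):
            super().__init__()
            self.hidden_size = hidden_size
            # Unconstrained parameter; skew-symmetrized before the matrix
            # exponential, so the recurrent matrix U is orthogonal by construction.
            self.A = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))
            self.W = nn.Linear(input_size, hidden_size)            # candidate input map
            self.gates_x = nn.Linear(input_size, 2 * hidden_size)  # gate input terms
            self.gates_h = nn.Linear(hidden_size, 2 * hidden_size) # gate recurrent terms
            self.b = nn.Parameter(torch.zeros(hidden_size))        # modReLU bias

        def orthogonal_U(self):
            # exp(A - A^T) is orthogonal because A - A^T is skew-symmetric.
            return torch.matrix_exp(self.A - self.A.t())

        def modrelu(self, h):
            # modReLU: thresholds the magnitude of each unit, keeps its sign.
            return torch.relu(torch.abs(h) + self.b) * torch.sign(h)

        def forward(self, x, h_prev):
            # Reset gate r and update gate z, computed as in a GRU.
            r, z = torch.sigmoid(self.gates_x(x) + self.gates_h(h_prev)).chunk(2, dim=-1)
            # Candidate state: the reset gate modulates the orthogonal recurrence.
            h_tilde = self.modrelu(r * (h_prev @ self.orthogonal_U().t()) + self.W(x))
            # Update gate interpolates between keeping and overwriting memory,
            # which is the "learning to forget" mechanism the abstract describes.
            return z * h_prev + (1.0 - z) * h_tilde

Running the cell over a toy sequence:

    cell = GORUCell(input_size=10, hidden_size=32)
    h = torch.zeros(1, 32)
    for x_t in torch.randn(5, 1, 10):   # length-5 sequence, batch size 1
        h = cell(x_t, h)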

Bibliographic record

  • Source
    Neural Computation | 2019, Issue 4 | pp. 765-783 | 19 pages
  • Author affiliations

    MIT, 77 Massachusetts Ave, Cambridge, MA 02139 USA;

    Univ Montreal, Montreal, PQ H3T 1J4, Canada;

  • Indexed in: Science Citation Index (SCI), USA; Chemical Abstracts (CA), USA
  • Format: PDF
  • Language: eng
  • CLC classification
  • Keywords
