IEEE International Conference on Machine Learning and Applications

Using Multi-task and Transfer Learning to Solve Working Memory Tasks

Abstract

We propose a new architecture called the Memory-Augmented Encoder-Solver (MAES) that enables transfer learning to solve complex working memory tasks adapted from cognitive psychology. It uses dual recurrent neural network controllers, one inside the encoder and one inside the solver, that interface with a shared memory module; the architecture is fully differentiable end to end. We systematically study different types of encoders and demonstrate a unique advantage of multi-task learning in obtaining the best possible encoder. We show through extensive experimentation that trained MAES models achieve task-size generalization: given appropriately large memory modules, they can handle sequential inputs 50 times longer than those seen during training. MAES far outperforms existing, well-known models such as the LSTM, NTM, and DNC across the entire suite of tasks.
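
The abstract describes the architecture only at a high level. The sketch below illustrates the encoder-solver pattern it names: two LSTM controllers sharing one external memory, with soft content-based addressing so everything stays differentiable. All specifics here (the class name MAESSketch, a single read/write head per controller, the memory sizes, cosine-similarity addressing) are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MAESSketch(nn.Module):
    """Hypothetical sketch of a Memory-Augmented Encoder-Solver (MAES).

    Not the authors' code: two LSTM controllers (encoder and solver)
    share one external memory matrix, and all memory access is soft
    (content-based attention), so the model is differentiable end to end.
    """

    def __init__(self, input_size, output_size, hidden_size=64,
                 mem_slots=128, mem_width=20):
        super().__init__()
        self.encoder = nn.LSTMCell(input_size, hidden_size)
        self.solver = nn.LSTMCell(mem_width, hidden_size)
        # One head per controller: a key for addressing, plus a write
        # vector for the encoder (assumed; head layout is not in the abstract).
        self.enc_key = nn.Linear(hidden_size, mem_width)
        self.enc_write = nn.Linear(hidden_size, mem_width)
        self.sol_key = nn.Linear(hidden_size, mem_width)
        self.out = nn.Linear(hidden_size, output_size)
        self.mem_slots, self.mem_width = mem_slots, mem_width

    def _address(self, memory, key):
        # Content-based addressing: cosine similarity -> softmax weights.
        sim = F.cosine_similarity(memory, key.unsqueeze(1), dim=-1)
        return F.softmax(sim, dim=-1)  # (batch, mem_slots)

    def forward(self, inputs, out_steps):
        batch = inputs.size(0)
        memory = inputs.new_zeros(batch, self.mem_slots, self.mem_width)
        h = c = inputs.new_zeros(batch, self.encoder.hidden_size)
        # Encoding phase: the encoder controller writes the input
        # sequence into the shared memory.
        for t in range(inputs.size(1)):
            h, c = self.encoder(inputs[:, t], (h, c))
            w = self._address(memory, self.enc_key(h))            # write weights
            add = torch.tanh(self.enc_write(h))                   # write vector
            memory = memory + w.unsqueeze(-1) * add.unsqueeze(1)  # soft write
        # Solving phase: the solver controller reads the same memory
        # and emits the answer sequence.
        h2 = c2 = inputs.new_zeros(batch, self.solver.hidden_size)
        read = inputs.new_zeros(batch, self.mem_width)
        outputs = []
        for _ in range(out_steps):
            h2, c2 = self.solver(read, (h2, c2))
            w = self._address(memory, self.sol_key(h2))           # read weights
            read = (w.unsqueeze(-1) * memory).sum(dim=1)          # soft read
            outputs.append(self.out(h2))
        return torch.stack(outputs, dim=1)

if __name__ == "__main__":
    model = MAESSketch(input_size=8, output_size=8)
    x = torch.randn(2, 10, 8)        # batch of 2, sequence length 10
    y = model(x, out_steps=10)
    print(y.shape)                   # torch.Size([2, 10, 8])
```

Because addressing is a softmax over memory slots rather than a hard index, gradients flow from the solver's outputs back through memory into the encoder; under this reading, an encoder trained on one task can be reused by a new solver, and enlarging mem_slots at test time is plausibly the lever behind the abstract's task-size generalization claim.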