IEEE Transactions on Neural Networks and Learning Systems

Augmenting Recurrent Neural Networks Resilience by Dropout

Abstract

This brief discusses the simple idea that dropout regularization can be used to efficiently induce resiliency to missing inputs at prediction time in a generic neural network. We show how the approach can be effective on tasks where imputation strategies often fail, namely, involving recurrent neural networks and scenarios where whole sequences of input observations are missing. The experimental analysis provides an assessment of the accuracy-resiliency tradeoff in multiple recurrent models, including reservoir computing methods, and comprising real-world ambient intelligence and biomedical time series.
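The abstract describes the idea only at a high level; as a rough illustration, the PyTorch sketch below applies dropout to whole input channels during training so that the trained recurrent model degrades gracefully when an entire input sequence (e.g., one sensor channel) is missing at prediction time. The names ChannelDropout and ResilientRNN, the LSTM backbone, and all hyperparameters are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class ChannelDropout(nn.Module):
    # Drops whole input channels for the entire sequence (inverted dropout),
    # mimicking sensors whose full observation sequence is missing.
    def __init__(self, p=0.3):
        super().__init__()
        self.p = p

    def forward(self, x):  # x: (batch, time, features)
        if not self.training or self.p == 0.0:
            return x
        # One Bernoulli mask per (batch, channel), broadcast over time.
        keep = (torch.rand(x.size(0), 1, x.size(2), device=x.device) > self.p)
        return x * keep.float() / (1.0 - self.p)


class ResilientRNN(nn.Module):
    def __init__(self, n_features, hidden_size, n_outputs, p_drop=0.3):
        super().__init__()
        self.input_dropout = ChannelDropout(p_drop)
        self.rnn = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, n_outputs)

    def forward(self, x):
        x = self.input_dropout(x)        # active only in train mode
        out, _ = self.rnn(x)
        return self.readout(out[:, -1])  # predict from the final hidden state


# At prediction time a missing channel is zero-filled for the whole sequence;
# a model trained with channel dropout has already seen such inputs.
model = ResilientRNN(n_features=8, hidden_size=32, n_outputs=1)
model.eval()
x = torch.randn(4, 50, 8)
x[:, :, 2] = 0.0  # channel 2 unavailable for the entire sequence
with torch.no_grad():
    y = model(x)
print(y.shape)  # torch.Size([4, 1])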
