Journal: IEEE Transactions on Neural Networks and Learning Systems

Convolutional Neural Networks With Dynamic Regularization



Abstract

Regularization is commonly used to alleviate overfitting in machine learning. For convolutional neural networks (CNNs), regularization methods such as DropBlock and Shake-Shake have demonstrated improvements in generalization performance. However, these methods lack self-adaptive ability throughout training. That is, the regularization strength follows a predefined schedule, and manual adjustments are required to adapt it to different network architectures. In this article, we propose a dynamic regularization method for CNNs. Specifically, we model the regularization strength as a function of the training loss. As the training loss changes, our method dynamically adjusts the regularization strength during training, thereby balancing underfitting and overfitting. With dynamic regularization, a large-scale model is automatically regularized with a strong perturbation, and vice versa. Experimental results show that the proposed method improves the generalization capability of off-the-shelf network architectures and outperforms state-of-the-art regularization methods.
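To make the mechanism concrete, below is a minimal sketch of a loss-dependent regularization strength. The abstract only states that the strength is a function of the training loss; the class name, the moving-average smoothing, and the specific mapping from the smoothed loss to a strength value are illustrative assumptions, not the formula from the paper.

# Sketch: regularization strength as a function of the training loss,
# instead of a fixed, predefined schedule. The mapping below is an
# assumption for illustration only.
class DynamicRegularizationStrength:
    def __init__(self, max_strength=0.1, momentum=0.9):
        self.max_strength = max_strength  # upper bound on the perturbation strength
        self.momentum = momentum          # smoothing factor for the running loss
        self.running_loss = None          # exponential moving average of the training loss
        self.initial_loss = None          # loss level at the start of training

    def update(self, batch_loss):
        """Update the running loss and return the current regularization strength."""
        if self.running_loss is None:
            self.running_loss = batch_loss
            self.initial_loss = batch_loss
        else:
            self.running_loss = (self.momentum * self.running_loss
                                 + (1.0 - self.momentum) * batch_loss)
        # While the training loss is still high (underfitting), keep the strength low;
        # as the loss drops (growing risk of overfitting), raise it toward the cap.
        progress = 1.0 - self.running_loss / self.initial_loss
        progress = min(max(progress, 0.0), 1.0)
        return self.max_strength * progress

In a training loop, the value returned by update() could then serve as, for example, the drop probability of a DropBlock layer or the perturbation range of Shake-Shake branches, so the effective regularization grows as the training loss falls and shrinks when the model is still underfitting.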
