International Conference on Machine Learning

Regularization of Neural Networks using DropConnect



Abstract

We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing it to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.
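The distinction the abstract draws can be sketched in a few lines of NumPy: Dropout masks a layer's output activations, while DropConnect masks individual entries of the weight matrix, so each output unit sees a random subset of the previous layer's units. This is a minimal illustrative sketch of the masking idea only (the keep probability `p` and layer sizes are arbitrary choices, and the paper's inference-time averaging is not shown).

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_layer(x, W, p=0.5):
    """Dropout: zero a randomly selected subset of output activations.

    Each activation is kept independently with probability p."""
    a = x @ W
    mask = rng.random(a.shape) < p  # per-activation keep mask
    return a * mask

def dropconnect_layer(x, W, p=0.5):
    """DropConnect: zero a randomly selected subset of weights instead.

    Each weight is kept independently with probability p, so every
    unit receives input from a random subset of the previous layer."""
    mask = rng.random(W.shape) < p  # per-weight keep mask
    return x @ (W * mask)

x = rng.standard_normal((4, 8))    # batch of 4 examples, 8 input units
W = rng.standard_normal((8, 16))   # fully connected layer, 8 -> 16 units

out_do = dropout_layer(x, W)       # shape (4, 16), some entries exactly 0
out_dc = dropconnect_layer(x, W)   # shape (4, 16), computed with a masked W
```

Note that DropConnect's output entries are generally nonzero even though weights were dropped, because each output is a sum over many surviving weighted inputs; Dropout, by contrast, zeroes the outputs themselves.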
