A novel softplus linear unit for deep convolutional neural networks

Abstract

Recent improvements in the performance of deep neural networks are partly due to the introduction of rectified linear units (ReLUs). A ReLU activation function outputs zero for negative inputs, which kills some neurons and shifts the bias of the outputs, causing oscillations that impede learning. Following the theory that "zero mean activations improve learning ability", a softplus linear unit (SLU) is proposed as an adaptive activation function that can speed up learning and improve performance in deep convolutional neural networks. First, to reduce the bias shift, negative inputs are processed with the softplus function, and a general form of the SLU function is proposed. Second, the parameters of the positive component are fixed to control vanishing gradients. Third, update rules for the parameters of the negative component are established to meet back-propagation requirements. Finally, we designed deep auto-encoder networks and ran several unsupervised-learning experiments with them on the MNIST dataset; for supervised learning, we designed deep convolutional neural networks and ran several experiments with them on the CIFAR-10 dataset. The experiments show that SLU-based networks converge faster and achieve better image-classification performance than networks with rectified activation functions.
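
The abstract states the design constraints (softplus on negative inputs, a fixed positive component, a learnable negative component) but not the exact functional form. Below is a minimal PyTorch sketch of what such an activation could look like, under explicitly assumed choices: the positive branch is a fixed linear map alpha * x, the negative branch is beta * softplus(x) - gamma with beta learned by back-propagation, and gamma is tied to beta so the function stays continuous at zero. The class name SLU, the parameter names, and all initial values are illustrative, not taken from the paper.

    import math

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SLU(nn.Module):
        """Sketch of a softplus linear unit (assumed form, not the paper's exact one).

        f(x) = alpha * x                   for x >= 0   (fixed, avoids vanishing gradients)
        f(x) = beta * softplus(x) - gamma  for x <  0   (beta is learned)

        Tying gamma = beta * log(2) keeps f continuous at 0, since softplus(0) = log(2).
        """

        def __init__(self, alpha: float = 1.0, beta_init: float = 2.0):
            super().__init__()
            self.alpha = alpha                                  # positive branch: fixed parameter
            self.beta = nn.Parameter(torch.tensor(beta_init))   # negative branch: trained by back-propagation

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            gamma = self.beta * math.log(2.0)   # recomputed each step so f stays continuous as beta changes
            negative = self.beta * F.softplus(x) - gamma
            return torch.where(x >= 0.0, self.alpha * x, negative)

For very negative inputs, softplus(x) approaches 0 and the output saturates near -gamma, so the unit produces negative activations that pull a layer's mean output toward zero, in the spirit of the "zero mean activations improve learning ability" idea quoted above. With the illustrative values beta_init = 2 and alpha = 1, the derivative is also continuous at zero at initialization, since the negative branch's slope there is beta * sigmoid(0) = beta / 2.

    # Hypothetical usage: drop SLU in wherever ReLU would appear.
    x = torch.linspace(-3.0, 3.0, steps=7)
    print(SLU()(x))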

