IFAC PapersOnLine

Weight Initialization Possibilities for Feedforward Neural Network with Linear Saturated Activation Functions

Abstract: The choice of initial weights is an important aspect of the training mechanism for feedforward neural networks. This paper deals with a particular topology of a feedforward neural network in which symmetric linear saturated activation functions are used in the hidden layer. Training such a topology is a tricky procedure, since the activation functions are not differentiable everywhere. A proper initialization method is therefore even more important in this case than for neural networks with sigmoid activation functions. Several initialization possibilities are examined and tested here, and particular initialization methods are recommended for application according to the class of task to be solved.
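The abstract does not spell out which initialization schemes were compared, but the difficulty it describes can be illustrated in a few lines. The sketch below (NumPy, with hypothetical function names and a simple fan-in-scaled uniform initialization) shows a symmetric linear saturated activation and one way to scale initial weights so that pre-activations start inside the linear, differentiable region rather than in the flat saturated region. It is an assumption-laden illustration, not the method recommended in the paper.

```python
import numpy as np

def symmetric_saturated_linear(x, limit=1.0):
    """Symmetric linear saturated activation: identity on [-limit, limit],
    clipped to +/-limit outside that range (not differentiable at the
    saturation points)."""
    return np.clip(x, -limit, limit)

def init_weights(n_in, n_out, limit=1.0, rng=None):
    """Hypothetical uniform initialization: scale weights by 1/sqrt(n_in)
    so that, for inputs roughly in [-1, 1], pre-activations tend to stay
    inside the linear region and gradients are not zeroed out by
    saturation. Illustrative only, not the paper's scheme."""
    rng = np.random.default_rng() if rng is None else rng
    bound = limit / np.sqrt(n_in)
    W = rng.uniform(-bound, bound, size=(n_out, n_in))
    b = np.zeros(n_out)
    return W, b

# Forward pass of a single hidden layer with the saturated activation.
rng = np.random.default_rng(0)
W, b = init_weights(n_in=8, n_out=4, rng=rng)
x = rng.uniform(-1.0, 1.0, size=8)
h = symmetric_saturated_linear(W @ x + b)
print(h)
```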
