Systems and Computers in Japan

On the Statistical Behavior of the Learning Error of Layered Neural Networks

Abstract

Certain statistical properties of layered neural networks that use the sigmoidal function as the characteristic function are still poorly understood. This study compares such networks, theoretically and numerically, with networks that use the Heaviside function as the characteristic function, in terms of the learning error. In the comparison, a Heaviside function with a ramp is also considered as a characteristic function; it has properties intermediate between the other two and can be regarded as a limit of the sigmoidal function. The comparison suggests a similarity from the viewpoint of statistical properties. It is also shown that there is no significant difference in learning error between the Heaviside function with and without a ramp. This implies that layered neural networks with the sigmoidal function and with the Heaviside function, each having a single hidden-layer unit, have similar properties, which differ from those of the conventional linear model.
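As an informal illustration of the three characteristic functions compared in the abstract, the short Python sketch below defines a sigmoid, the Heaviside function, and a Heaviside function with a linear ramp, and checks numerically that the sigmoid approaches the step function as its gain grows. The gain values and the ramp width are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def sigmoid(x, beta=1.0):
    """Sigmoidal characteristic function with gain beta (assumed form)."""
    return 1.0 / (1.0 + np.exp(-beta * x))

def heaviside(x):
    """Heaviside (step) characteristic function."""
    return np.where(x >= 0.0, 1.0, 0.0)

def heaviside_with_ramp(x, width=1.0):
    """Heaviside-type characteristic function with a linear ramp of the
    given width around the origin: 0 below -width/2, 1 above width/2,
    linear in between. The width is an illustrative choice."""
    return np.clip(x / width + 0.5, 0.0, 1.0)

# The mean absolute gap between the sigmoid and the Heaviside function over
# a grid shrinks as the gain grows, illustrating the limiting relation
# mentioned in the abstract.
x = np.linspace(-3.0, 3.0, 601)
for beta in (1.0, 4.0, 50.0):
    gap = np.mean(np.abs(sigmoid(x, beta) - heaviside(x)))
    print(f"beta = {beta:5.1f}   mean |sigmoid - Heaviside| = {gap:.4f}")

print("ramp values on [-1, 1]:", heaviside_with_ramp(np.linspace(-1.0, 1.0, 5)))
```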
