A comparison of stochastic global optimization methods: Estimating neural network weights.

Abstract

Scope and method of study. The general objective of this study was to determine the speed and accuracy of global optimization methods relative to a local algorithm for estimating the weights of neural networks. The specific objective was to compare nine stochastic global optimization algorithms against a local optimization algorithm on this task. The algorithms were compared by performing multiple estimations from random starting values on six function approximation problems and analyzing the running time and the distribution of the final objective function values across those estimations.

Findings and conclusions. The results indicated that no single algorithm dominated all others across the training data sets. More importantly, with respect to the research objectives of this study, the local optimization algorithm was not consistently dominated by any of the stochastic global optimization algorithms. On average, the global algorithms marginally outperformed the local algorithm in reaching a lower local minimum; however, they required more computational resources. Consequently, for the same computational budget a local algorithm can perform a greater number of restarts, which improves its relative performance. The results therefore provide little evidence that, among the specific algorithms studied, a global algorithm should be preferred over a more traditional local optimization routine for training neural networks. Further, the results indicated that a large number of local minima exist for all of the neural network training data sets considered in this study. Therefore, neural networks should not be estimated from a single set of starting values, whether a global or a local optimization method is used.
