Journal: IEEE Transactions on Neural Networks and Learning Systems

Concurrent Subspace Width Optimization Method for RBF Neural Network Modeling



Abstract

Radial basis function neural networks (RBFNNs) are widely used in nonlinear function approximation. One of the challenges in RBFNN modeling is determining how to effectively optimize width parameters to improve approximation accuracy. To solve this problem, a width optimization method, concurrent subspace width optimization (CSWO), is proposed based on a decomposition and coordination strategy. This method decomposes the large-scale width optimization problem into several subspace optimization (SSO) problems, each of which has a single optimization variable and smaller training and validation data sets so as to greatly simplify optimization complexity. These SSOs can be solved concurrently, thus computational time can be effectively reduced. With top-level system coordination, the optimization of SSOs can converge to a consistent optimum, which is equivalent to the optimum of the original width optimization problem. The proposed method is tested with four mathematical examples and one practical engineering approximation problem. The results demonstrate the efficiency and robustness of CSWO in optimizing width parameters over the traditional width optimization methods.
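The abstract describes CSWO only at a high level. As a rough, non-authoritative illustration of the decomposition-and-coordination idea, the Python sketch below assumes Gaussian basis functions with fixed centers, least-squares output weights, a simple per-width grid search as each subspace optimization (SSO), and a Jacobi-style coordination sweep; it also reuses one training/validation split for every SSO, whereas the paper assigns each SSO smaller data sets. The toy data, SSO formulation, and coordination rule are placeholders, not the paper's actual algorithm.

```python
# Minimal sketch of the CSWO idea under the assumptions stated above.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(0)

def rbf_design(X, centers, widths):
    # Gaussian RBF design matrix: phi[i, j] = exp(-||x_i - c_j||^2 / (2 * w_j^2)).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

def validation_error(widths, X_tr, y_tr, X_val, y_val, centers):
    # Fit output weights by least squares on training data, score MSE on validation data.
    Phi = rbf_design(X_tr, centers, widths)
    w, *_ = np.linalg.lstsq(Phi, y_tr, rcond=None)
    pred = rbf_design(X_val, centers, widths) @ w
    return float(np.mean((pred - y_val) ** 2))

def solve_sso(j, widths, grid, data):
    # Subspace optimization: tune only width j (all others held fixed) by 1-D grid search.
    best_w, best_err = widths[j], np.inf
    for cand in grid:
        trial = widths.copy()
        trial[j] = cand
        err = validation_error(trial, *data)
        if err < best_err:
            best_w, best_err = cand, err
    return j, best_w

# Toy 1-D approximation problem (illustrative only, not from the paper).
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(2.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)
centers = np.linspace(-3.0, 3.0, 10)[:, None]   # fixed centers; only widths are optimized
data = (X[:150], y[:150], X[150:], y[150:], centers)
widths = np.ones(10)
grid = np.linspace(0.1, 3.0, 30)                # candidate widths for each SSO

# Top-level coordination: run all SSOs concurrently against the current widths,
# apply their results together, and repeat until the widths stop changing.
for sweep in range(10):
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda j: solve_sso(j, widths, grid, data),
                                range(len(widths))))
    new_widths = widths.copy()
    for j, w_j in results:
        new_widths[j] = w_j
    if np.allclose(new_widths, widths):
        break
    widths = new_widths

print("optimized widths:", np.round(widths, 2))
print("validation MSE:", validation_error(widths, *data))
```

Each SSO here is a one-variable search, so the sweep is trivially parallel; the outer loop plays the role of the top-level coordination that drives the subspace solutions toward a consistent set of widths.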
