Memoryless Modified Symmetric Rank-One Method for Large-Scale Unconstrained Optimization
American Journal of Applied Sciences | Science Publications

Abstract

> Problem statement: Memoryless quasi-Newton (QN) methods, which can be viewed as one-step limited-memory QN methods, are regarded as effective techniques for solving large-scale problems. In this study, we present a scaled memoryless modified Symmetric Rank-One (SR1) algorithm and investigate its numerical performance on large-scale unconstrained optimization problems. Approach: The basic idea is to apply the modified quasi-Newton equations, which use both the gradients and the function values at two successive points, within the framework of the scaled memoryless SR1 update, in which the modified SR1 update is reset, at every iteration, to a positive multiple of the identity matrix. The scaling of the identity is chosen so that the positive definiteness of the memoryless modified SR1 update is preserved. Results: Under suitable conditions, global convergence and the rate of convergence are established. Computational results on a test set of 73 unconstrained optimization problems show that the proposed algorithm is very encouraging. Conclusion/Recommendations: In this study a memoryless QN method is developed for solving large-scale unconstrained optimization problems, in which an SR1 update based on the modified QN equation is applied. An important feature of the proposed method is that it preserves the positive definiteness of the updates. The presented method possesses global and R-linear convergence. Numerical results show that the proposed method compares favorably with the MMBFGS and FRCG methods.
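A memoryless SR1 update of the kind described above stores only a scaled identity plus one rank-one correction, so the search direction can be formed from a few vectors with no n-by-n matrix. The following is a minimal sketch, assuming the standard SR1 H-form H = gamma*I + v v^T / (v^T y) with v = s - gamma*y; the paper's particular rule for choosing the identity scaling is not reproduced here, so gamma is left as an input parameter:

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, gamma, eps=1e-10):
    """Search direction d = -H g for a memoryless SR1 update reset to gamma*I.

    g: current gradient, s = x_{k+1} - x_k, y = g_{k+1} - g_k,
    gamma: positive scaling of the identity (assumed given; the paper's
    specific scaling rule is not reproduced here).

    H = gamma*I + v v^T / (v^T y) with v = s - gamma*y has eigenvalues
    gamma (n-1 times) and gamma + ||v||^2 / (v^T y), so the rank-one
    correction is applied only when both are positive, preserving
    positive definiteness; otherwise H falls back to gamma*I.
    """
    v = s - gamma * y
    denom = v @ y
    d = -gamma * g                      # contribution of the gamma*I part
    if abs(denom) > eps and gamma + (v @ v) / denom > 0.0:
        d -= ((v @ g) / denom) * v      # rank-one SR1 correction applied to g
    return d
```

Because only inner products of n-vectors are involved, one iteration costs O(n) memory and time, which is what makes the memoryless variant attractive for large-scale problems.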

