IEEE Transactions on Neural Networks and Learning Systems

An Accelerated Linearly Convergent Stochastic L-BFGS Algorithm



Abstract

The limited-memory version of the Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) algorithm is the most popular quasi-Newton algorithm in machine learning and optimization. Recently, it was shown that the stochastic L-BFGS (sL-BFGS) algorithm with a variance-reduced stochastic gradient converges linearly. In this paper, we propose a new sL-BFGS algorithm by incorporating a suitable momentum term. We prove an accelerated linear convergence rate under mild conditions. Experiments on several data sets also verify this acceleration advantage.
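The abstract does not spell out the paper's exact update rule, but the ingredients it names — an SVRG-style variance-reduced gradient, an L-BFGS quasi-Newton direction, and a momentum term — can be combined into a minimal sketch. The code below is illustrative only: the function names, the heavy-ball form of the momentum, and all hyperparameters (`lr`, `beta`, memory size `m`, batch size) are assumptions, not the authors' settings, and it is demonstrated on a simple least-squares objective.

```python
import numpy as np

def two_loop_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns an approximation of
    H_k @ grad using the stored curvature pairs (s_i, y_i)."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:  # initial scaling H0 = (s'y / y'y) * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def momentum_sl_bfgs(A, b, n_epochs=20, inner=50, lr=0.1, beta=0.3,
                     m=5, batch=8, seed=0):
    """Sketch of a momentum-accelerated sL-BFGS loop on the least-squares
    objective f(w) = ||A w - b||^2 / (2n). Hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    grad_full = lambda v: A.T @ (A @ v - b) / n
    w = np.zeros(d)
    w_prev = w.copy()
    s_list, y_list = [], []
    for _ in range(n_epochs):
        w_snap = w.copy()            # SVRG snapshot point
        g_snap = grad_full(w_snap)   # full gradient at the snapshot
        for _ in range(inner):
            idx = rng.choice(n, size=batch, replace=False)
            Ai, bi = A[idx], b[idx]
            g_mini = lambda v: Ai.T @ (Ai @ v - bi) / batch
            # variance-reduced stochastic gradient (SVRG estimator)
            v_grad = g_mini(w) - g_mini(w_snap) + g_snap
            direction = two_loop_direction(v_grad, s_list, y_list)
            # quasi-Newton step plus heavy-ball momentum
            w_new = w - lr * direction + beta * (w - w_prev)
            w_prev, w = w, w_new
        # refresh curvature pairs once per epoch from full gradients
        s = w - w_snap
        y = grad_full(w) - g_snap
        if s @ y > 1e-10:  # keep only pairs satisfying the curvature condition
            s_list.append(s)
            y_list.append(y)
            if len(s_list) > m:
                s_list.pop(0)
                y_list.pop(0)
    return w
```

Updating the curvature pairs only at snapshot points (rather than every iteration) keeps the quasi-Newton matrix stable against mini-batch noise, which is a common design choice in stochastic L-BFGS variants.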

