IEEE Transactions on Neural Networks and Learning Systems

A Note on the Unification of Adaptive Online Learning



Abstract

In online convex optimization, adaptive algorithms, which can exploit second-order information from the (sub)gradients of the loss function, have shown improvements over standard gradient methods. This paper presents a framework, Follow the Bregman Divergence Leader, that unifies various existing adaptive algorithms and reveals new insights into them. Under the proposed framework, two simple adaptive online algorithms with improved performance guarantees are derived. Furthermore, a general equation derived from a matrix analysis generalizes adaptive learning to the nonlinear case via the kernel trick.
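To make the abstract's notion of "second-order information of the (sub)gradient" concrete, the sketch below shows a diagonal AdaGrad-style update, a well-known adaptive online method of the kind the paper unifies. This is an illustrative example on a toy quadratic loss, not the paper's own algorithm; the function names and the loss are assumptions.

```python
import numpy as np

def adagrad_step(w, grad, accum, lr=0.1, eps=1e-8):
    """One diagonal AdaGrad update: per-coordinate step sizes shrink
    with the accumulated squared (sub)gradients, a simple use of the
    second-order gradient information adaptive methods exploit."""
    accum = accum + grad ** 2
    w = w - lr * grad / (np.sqrt(accum) + eps)
    return w, accum

# Toy online loop: at each round the loss is f_t(w) = 0.5 * ||w - target||^2
# (a stand-in for the adversary's loss; purely illustrative).
target = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
accum = np.zeros(3)
for t in range(500):
    grad = w - target          # gradient of the toy quadratic loss
    w, accum = adagrad_step(w, grad, accum)
```

Coordinates with larger accumulated gradients automatically receive smaller steps, which is what yields the tighter regret bounds of adaptive methods compared with a single global step size.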

