Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)

Developing Learning Algorithms via Optimized Discretization of Continuous Dynamical Systems



Abstract

Most existing numerical optimization methods are based on a discretization of some ordinary differential equations. To solve convex and smooth optimization problems arising in machine learning, this paper develops efficient batch and online algorithms based on a new principle, namely the optimized discretization of continuous dynamical systems (ODCDSs). First, a batch-learning projected gradient dynamical system with Lyapunov stability and a monotonicity property is introduced; its dynamical behavior guarantees the accuracy of the discretization-based optimizer and the applicability of a line search strategy. Furthermore, under mild assumptions, a new online learning algorithm achieving regret $O(\sqrt{T})$ or $O(\log T)$ is obtained. By using the line search strategy, the proposed batch-learning ODCDS is insensitive to the step sizes and decreases the objective faster. With only a small number of line search steps, the proposed stochastic algorithm exhibits sufficient stability and approximate optimality. Experimental results demonstrate the correctness of our theoretical analysis and the efficiency of our algorithms.


