Source: IEEE Transactions on Automatic Control

Stochastic Gradient-Push for Strongly Convex Functions on Time-Varying Directed Graphs



Abstract

We investigate the convergence rate of the recently proposed subgradient-push method for distributed optimization over time-varying directed graphs. The subgradient-push method can be implemented in a distributed way without requiring knowledge of either the number of agents or the graph sequence; each node is only required to know its out-degree at each time. Our main result is a convergence rate of O((ln t)/t) for strongly convex functions with Lipschitz gradients even if only stochastic gradient samples are available; this is asymptotically faster than the O((ln t)/√t) rate previously known for (general) convex functions.
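The abstract's update rule can be sketched in a few lines: each node pushes its value and a push-sum weight to its out-neighbors, scaled by its own out-degree (the only quantity it must know), then takes a gradient step on the re-weighted estimate. Below is a minimal NumPy simulation of this scheme; the rotating-ring graph sequence, the quadratic local costs f_i(z) = (z - a_i)²/2, and the step size 1/t are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def subgradient_push(a, T=2000):
    """Sketch of subgradient-push on a time-varying directed graph.

    Each node i minimizes the average of the local costs
    f_i(z) = (z - a[i])**2 / 2, whose global minimizer is mean(a).
    """
    n = len(a)
    w = np.zeros(n)   # push-sum numerators
    y = np.ones(n)    # push-sum weights
    z = np.zeros(n)   # local estimates of the minimizer
    for t in range(1, T + 1):
        # Time-varying directed graph (an assumption for this sketch):
        # at time t, node i sends to node (i + t) % n plus a self-loop,
        # so every node's out-degree is 2 and is known locally.
        new_w = np.zeros(n)
        new_y = np.zeros(n)
        for i in range(n):
            for j in (i, (i + t) % n):   # out-neighbors of i (incl. self)
                new_w[j] += w[i] / 2.0   # divide by own out-degree
                new_y[j] += y[i] / 2.0
        z = new_w / new_y                # de-biased estimate
        grad = z - a                     # gradient of (z - a_i)^2 / 2
        w = new_w - (1.0 / t) * grad     # diminishing step, strongly convex case
        y = new_y
    return z

a = np.array([1.0, 4.0, 7.0, 10.0])
z = subgradient_push(a)
# All nodes should approach the global minimizer mean(a) = 5.5.
```

With exact gradients and a 1/t step size, every node's estimate contracts toward mean(a); the paper's O((ln t)/t) rate concerns the stochastic-gradient version of this scheme.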


