Youth Academic Annual Conference of Chinese Association of Automation

Federated Optimization Based on Compression and Event-triggered Communication



Abstract

Federated learning is regarded as a promising solution to large-scale machine learning problems: it enables multiple edge users to cooperatively train a global parameter model while guaranteeing users a basic level of privacy. However, despite the increasing interest, communication cost usually turns out to be a major bottleneck when scaling up distributed algorithms over unreliable or rate-limited network environments. In practice, users or clients synchronize models periodically regardless of whether the current model has changed significantly since the last synchronization, which wastes communication resources. Considering both how much information to transmit in each communication round and when to communicate, in this paper we propose FedCET, a compression and event-triggered communication algorithm for federated learning. We present a convergence analysis of the algorithm with rigorous proofs for smooth nonconvex, strongly convex (or PL-condition) and general convex objective functions, respectively, showing that this communication scheme is efficient without affecting the convergence properties of the algorithm. Further, we evaluate the proposed FedCET on several datasets to demonstrate its effectiveness compared with other methods.
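
The two ingredients named in the abstract, compressing the transmitted update and triggering communication only when the local model has drifted enough since the last transmission, can be illustrated with a minimal sketch. The Python code below is an illustrative assumption, not the paper's FedCET algorithm: the top-k compressor, the norm-based trigger, and the names top_k_compress, EventTriggeredClient, threshold and k are all hypothetical choices made for this example.

    import numpy as np

    def top_k_compress(delta, k):
        # A common sparsifying compressor: keep only the k largest-magnitude
        # entries of the update and zero out the rest.
        if k >= delta.size:
            return delta.copy()
        idx = np.argpartition(np.abs(delta), -k)[-k:]
        compressed = np.zeros_like(delta)
        compressed[idx] = delta[idx]
        return compressed

    class EventTriggeredClient:
        # One federated client that uploads a compressed update only when its
        # local model has changed enough since the last transmitted state.
        def __init__(self, dim, threshold=0.1, k=10):
            self.last_sent = np.zeros(dim)  # state the server last received
            self.threshold = threshold      # event-trigger threshold
            self.k = k                      # sparsification level

        def local_step(self, model, grad, lr=0.1):
            # Plain local SGD step on the client's own data.
            return model - lr * grad

        def maybe_upload(self, model):
            delta = model - self.last_sent
            # Event trigger: skip this communication round if the change is small.
            if np.linalg.norm(delta) < self.threshold:
                return None
            update = top_k_compress(delta, self.k)
            self.last_sent = self.last_sent + update
            return update

    def server_aggregate(global_model, updates):
        # Server side: average whatever (compressed) updates arrived this round.
        if not updates:
            return global_model
        return global_model + np.mean(updates, axis=0)

In this sketch a client that stays below the trigger threshold sends nothing at all in that round, and a client that does communicate sends only a sparse delta, so both the number of communication rounds and the payload per round are reduced, which is the behavior the abstract attributes to compression plus event-triggered communication.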

