IEEE Transactions on Neural Networks and Learning Systems

Clustered Federated Learning: Model-Agnostic Distributed Multitask Optimization Under Privacy Constraints



Abstract

Federated learning (FL) is currently the most widely adopted framework for collaborative training of (deep) machine learning models under privacy constraints. Despite its popularity, it has been observed that FL yields suboptimal results if the local clients' data distributions diverge. To address this issue, we present clustered FL (CFL), a novel federated multitask learning (FMTL) framework, which exploits geometric properties of the FL loss surface to group the client population into clusters with jointly trainable data distributions. In contrast to existing FMTL approaches, CFL does not require any modifications to the FL communication protocol, is applicable to general nonconvex objectives (in particular, deep neural networks), does not require the number of clusters to be known a priori, and comes with strong mathematical guarantees on the clustering quality. CFL is flexible enough to handle client populations that vary over time and can be implemented in a privacy-preserving way. As clustering is only performed after FL has converged to a stationary point, CFL can be viewed as a postprocessing method that will always achieve greater or equal performance than conventional FL by allowing clients to arrive at more specialized models. We verify our theoretical analysis in experiments with deep convolutional and recurrent neural networks on commonly used FL data sets.
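The grouping step described above can be illustrated with a minimal sketch. The abstract says CFL exploits geometric properties of the FL loss surface to split clients into jointly trainable clusters; in the paper this is done by comparing the directions of the clients' weight updates. The sketch below is an illustrative heuristic only: `cosine_similarity_matrix` and the greedy `bipartition` routine are assumptions for demonstration, not the paper's exact (optimal) bipartitioning algorithm.

```python
import numpy as np

def cosine_similarity_matrix(updates):
    # updates: list of flattened client weight-update vectors.
    # Normalize each update, then all pairwise cosine similarities
    # are given by the Gram matrix of the normalized vectors.
    U = np.stack([u / np.linalg.norm(u) for u in updates])
    return U @ U.T

def bipartition(sim):
    # Illustrative greedy split (a hypothetical stand-in for CFL's
    # bipartitioning): seed the two clusters with the least similar
    # pair of clients, then assign every remaining client to the
    # seed it is more similar to.
    n = sim.shape[0]
    i, j = divmod(int(np.argmin(sim)), n)
    c1, c2 = {i}, {j}
    for k in range(n):
        if k in (i, j):
            continue
        (c1 if sim[k, i] >= sim[k, j] else c2).add(k)
    return sorted(c1), sorted(c2)

# Toy example: clients 0-1 share one data distribution (updates point
# one way), clients 2-3 share another (updates point the opposite way).
v = np.array([1.0, 0.0])
updates = [v, 1.1 * v, -v, -0.9 * v]
groups = bipartition(cosine_similarity_matrix(updates))
print(groups)  # → ([0, 1], [2, 3])
```

In the actual framework this split is applied recursively, and only after FL has reached a stationary point, which is why CFL can act as a pure postprocessing step on top of standard federated averaging.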

