IEEE Transactions on Parallel and Distributed Systems

Mutual Information Driven Federated Learning


Abstract

Federated Learning (FL) is an emerging research field that yields a globally trained model from different local clients without violating data privacy. Existing FL techniques often ignore the distinction between local models and the aggregated global model during the client-side weight update, as well as the distinctions among local models during server-side aggregation. In this article, we propose a novel FL approach that resorts to mutual information (MI). Specifically, on the client side, the weight update is reformulated by minimizing the MI between the local and aggregated models and employing a Negative Correlation Learning (NCL) strategy. On the server side, we select the most effective models for aggregation based on the MI between each individual local model and the previous aggregated model. We also theoretically prove the convergence of our algorithm. Experiments conducted on the MNIST, CIFAR-10, ImageNet, and clinical MIMIC-III datasets demonstrate that our method outperforms state-of-the-art techniques in terms of both communication and testing performance.
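
To make the server-side step concrete, here is a minimal sketch of one way MI-based model selection could look. The abstract does not specify the MI estimator, the data the MI is computed on, or the ranking direction, so the probe batch, the discrete hard-label estimator (sklearn.metrics.mutual_info_score), the helper name select_clients_by_mi, and the keep-lowest-MI rule are all illustrative assumptions rather than the paper's actual method.

```python
# Illustrative server-side selection: rank local models by the MI between
# their hard-label predictions and those of the previous aggregated model
# on a shared probe batch, then keep k of them.
import numpy as np
from sklearn.metrics import mutual_info_score


def select_clients_by_mi(local_preds, global_preds, k):
    """Return indices of the k local models chosen for aggregation.

    local_preds  : list of 1-D integer arrays, each holding one local
                   model's predicted class labels on the probe batch.
    global_preds : 1-D integer array of the previous aggregated model's
                   predicted labels on the same probe batch.
    """
    scores = np.array(
        [mutual_info_score(global_preds, preds) for preds in local_preds]
    )
    # Assumed ranking direction: lower MI with the previous aggregated
    # model is treated as contributing more diverse information, mirroring
    # the client-side MI-minimization objective; keep the k lowest-MI clients.
    return np.argsort(scores)[:k]


# Toy usage: 5 simulated clients, a 200-sample probe batch, 10 classes.
rng = np.random.default_rng(0)
global_preds = rng.integers(0, 10, size=200)
local_preds = [rng.integers(0, 10, size=200) for _ in range(5)]
print(select_clients_by_mi(local_preds, global_preds, k=3))
```

Ranking by low MI is chosen here only because it mirrors the diversity-seeking, MI-minimizing client-side objective described above; the paper may rank in the opposite direction or estimate MI on model weights rather than predictions.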
