IEEE Transactions on Parallel and Distributed Systems

Accelerating Gossip-Based Deep Learning in Heterogeneous Edge Computing Platforms



Abstract

With the exponential growth of data created at the network edge, decentralized, Gossip-based training of deep learning (DL) models on edge computing (EC) platforms has gained tremendous research momentum, owing to its ability to learn from resource-constrained edge nodes with limited network connectivity. Today's edge devices are extremely heterogeneous in, e.g., their hardware and software stacks, which leads to high variation in training time and induces extra delay to synchronize and converge. The large body of prior art accelerates DL, whether via data or model parallelization, through a centralized server, e.g., the parameter-server scheme, which can easily become a system bottleneck or single point of failure. In this article, we propose EdgeGossip, a framework specifically designed to accelerate decentralized, Gossip-based DL training on heterogeneous EC platforms. EdgeGossip features: (i) low performance variation among multiple EC platforms during iterative training, and (ii) accuracy-aware training to quickly obtain the best possible model accuracy. We implement EdgeGossip on top of popular Gossip algorithms and demonstrate its effectiveness on real-world DL workloads: it reduces model training time by an average of 2.70x while incurring an accuracy loss of only 0.78 percent.
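
As a concrete illustration of the Gossip-based training pattern the abstract describes, below is a minimal sketch of decentralized pairwise model averaging. This is not the paper's EdgeGossip implementation; the ring topology, node count, `local_sgd_step` placeholder, and 0.5 mixing weight are all illustrative assumptions.

```python
# Minimal sketch of decentralized Gossip-based model averaging.
# NOT the EdgeGossip method: topology, node count, and the local
# training step are illustrative placeholders only.
import random
import numpy as np

def local_sgd_step(params, lr=0.01):
    # Stand-in for one local SGD step on a node's private data;
    # a random pseudo-gradient is used here for illustration.
    grad = np.random.randn(*params.shape)
    return params - lr * grad

def gossip_round(models, topology):
    """One Gossip round: every node trains locally, then averages
    its parameters with one randomly chosen neighbor."""
    models = [local_sgd_step(m) for m in models]
    for i, neighbors in topology.items():
        j = random.choice(neighbors)
        mixed = 0.5 * (models[i] + models[j])  # symmetric pairwise average
        models[i], models[j] = mixed, mixed.copy()
    return models

# Example: 4 edge nodes on a ring topology, 10 Gossip rounds.
topology = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
models = [np.zeros(8) for _ in topology]
for _ in range(10):
    models = gossip_round(models, topology)
```

Repeated pairwise mixing of this kind drives all node models toward consensus without any central parameter server, which is what removes the bottleneck and single point of failure the abstract attributes to parameter-server schemes.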
