NTT Technical Review

Research on Asynchronous Distributed Deep Learning Technology―Optimizing Machine Learning Models in the Age of Distributed Data Storage



Abstract

While modern deep learning often requires aggregating data into a single datacenter to train models, in the near future data will increasingly remain distributed due to growing data volumes and privacy-protection concerns. In this article, we spoke with Kenta Niwa, a distinguished researcher working on asynchronous distributed deep learning technology. This technology makes it possible to optimize machine learning models as if the data were aggregated in a single datacenter, even in an era of distributed data.
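To make the idea in the abstract concrete, the following is a minimal, hypothetical sketch of asynchronous decentralized training: several nodes each hold their own data shard, take local gradient steps at independent times, and occasionally average ("gossip") their model copies with a peer so that all copies drift toward the solution a centralized learner would find. This is only an illustrative stand-in, not the specific method discussed in the interview; all names and parameters below are assumptions.

```python
# Illustrative sketch only: a toy, single-process simulation of asynchronous
# decentralized SGD with gossip averaging on a linear-regression problem.
# Not the actual NTT/Niwa algorithm; just a generic example of the concept.
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth linear model that generates all data; each node sees only its shard.
w_true = np.array([2.0, -1.0])

def make_shard(n=200):
    X = rng.normal(size=(n, 2))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

nodes = [make_shard() for _ in range(4)]   # 4 nodes, disjoint local datasets
w = [np.zeros(2) for _ in nodes]           # one local model copy per node
lr, steps = 0.05, 2000

for t in range(steps):
    i = rng.integers(len(nodes))           # asynchronous: a random node wakes up
    X, y = nodes[i]
    b = rng.choice(len(y), size=16)        # local mini-batch gradient step
    grad = X[b].T @ (X[b] @ w[i] - y[b]) / len(b)
    w[i] -= lr * grad

    j = rng.integers(len(nodes))           # gossip: average with one random peer
    if j != i:
        avg = (w[i] + w[j]) / 2
        w[i], w[j] = avg.copy(), avg.copy()

print("per-node estimates:", np.round(w, 3))
print("centralized target (true weights):", w_true)
```

Running the sketch shows every node's local copy converging close to the weights a single centralized learner would obtain, even though no raw data ever leaves its node; only model parameters are exchanged.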
