
Optimally Compressed Nonparametric Online Learning: Tradeoffs between memory and consistency



Abstract

Batch training of machine learning models based on neural networks is well established, whereas, to date, streaming methods are largely based on linear models. To go beyond linear in the online setting, nonparametric methods are of interest due to their universality and ability to stably incorporate new information via convexity or Bayes's rule. Unfortunately, when applied online, nonparametric methods suffer a "curse of dimensionality," which precludes their use: their complexity scales at least with the time index. We survey online compression tools that bring their memory under control and attain approximate convergence. The asymptotic bias depends on a compression parameter that trades off memory and accuracy. Applications to robotics, communications, economics, and power are discussed as well as extensions to multiagent systems.
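The memory/accuracy tradeoff described in the abstract can be illustrated with a minimal sketch of budgeted online kernel regression. This is not the authors' algorithm: the class name, the coefficient-magnitude pruning rule, and the threshold `eps` are illustrative assumptions standing in for the compression tools the article surveys (e.g., matching-pursuit-based dictionary pruning).

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two points."""
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

class BudgetedOnlineKernelRegressor:
    """Functional stochastic gradient descent in an RKHS, with pruning.

    Without compression, the dictionary gains one kernel center per
    sample, so complexity grows with the time index (the "curse" noted
    in the abstract).  Here, centers whose coefficient magnitude falls
    below `eps` are discarded, capping memory at the cost of an
    asymptotic bias controlled by `eps`.
    """

    def __init__(self, eta=0.5, eps=1e-3, gamma=1.0):
        self.eta = eta      # step size
        self.eps = eps      # compression threshold (memory/accuracy knob)
        self.gamma = gamma  # kernel bandwidth
        self.points, self.weights = [], []

    def predict(self, x):
        """Evaluate the current kernel expansion at x."""
        return sum(w * rbf(p, x, self.gamma)
                   for p, w in zip(self.points, self.weights))

    def step(self, x, y):
        """One stochastic gradient step on the squared loss, then prune."""
        err = y - self.predict(x)
        self.points.append(np.asarray(x, dtype=float))
        self.weights.append(self.eta * err)
        # Greedy compression: drop centers with negligible coefficients.
        kept = [(p, w) for p, w in zip(self.points, self.weights)
                if abs(w) > self.eps]
        self.points = [p for p, _ in kept]
        self.weights = [w for _, w in kept]
        return err
```

Feeding this learner a stream of samples, the prediction error settles near `eps / eta` rather than zero, while the dictionary stops growing once new coefficients fall below the threshold: shrinking `eps` buys accuracy at the price of memory, which is the tradeoff the compression parameter governs.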

Bibliographic Record

  • Source
    IEEE Signal Processing Magazine, 2020, No. 3, pp. 61-70 (10 pages)
  • Author Affiliations

    US Army Research Laboratory, Computational and Information Sciences Directorate, Adelphi, MD 20783, USA;

    US Army Research Laboratory, Computational and Information Sciences Directorate, Adelphi, MD 20783, USA;

    Indian Institute of Technology Kanpur, Department of Electrical Engineering, Kanpur, Uttar Pradesh, India;

    Army Research Laboratory, Intelligent Systems, Adelphi, MD, USA | Army Research Laboratory, Adelphi, MD, USA;

  • Format: PDF
  • Language: English
