
Accelerating deep neural network training with inconsistent stochastic gradient descent

Abstract

Aspects of the present disclosure describe techniques for training a convolutional neural network using an inconsistent stochastic gradient descent (ISGD) algorithm. The training effort for each training batch used by the ISGD algorithm is dynamically adjusted according to the loss determined for that batch, which classifies the batch into one of two sub-states: well-trained or under-trained. The ISGD algorithm applies more iterations to under-trained batches while reducing iterations for well-trained ones.
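
The abstract only outlines the idea, so the following is a minimal sketch of how loss-driven, per-batch training effort might look in practice. It trains a toy logistic-regression model with NumPy and compares each batch's loss against a running control limit (mean plus three standard deviations of recent batch losses); a batch above the limit is treated as under-trained and receives extra gradient iterations. The control-limit rule, the window size, and the `max_extra` iteration budget are illustrative assumptions, not details taken from the patent text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: binary classification with a linear model (illustrative only).
X = rng.normal(size=(1000, 20))
true_w = rng.normal(size=20)
y = (X @ true_w + 0.1 * rng.normal(size=1000) > 0).astype(float)

w = np.zeros(20)
lr = 0.1

def batch_loss_and_grad(w, Xb, yb):
    """Logistic loss and gradient for one batch."""
    z = np.clip(Xb @ w, -30.0, 30.0)   # clip to avoid overflow in exp
    p = 1.0 / (1.0 + np.exp(-z))
    eps = 1e-12
    loss = -np.mean(yb * np.log(p + eps) + (1 - yb) * np.log(1 - p + eps))
    grad = Xb.T @ (p - yb) / len(yb)
    return loss, grad

recent_losses = []            # sliding window of recent batch losses
window, max_extra = 50, 4     # assumed hyperparameters, not from the patent

for epoch in range(5):
    for start in range(0, len(X), 50):
        Xb, yb = X[start:start + 50], y[start:start + 50]
        loss, grad = batch_loss_and_grad(w, Xb, yb)
        w -= lr * grad

        # Update the running statistics that define the control limit
        # separating the two sub-states (well-trained vs. under-trained).
        recent_losses.append(loss)
        recent_losses = recent_losses[-window:]
        limit = np.mean(recent_losses) + 3.0 * np.std(recent_losses)

        # Inconsistent part: an under-trained batch (loss above the limit)
        # receives extra iterations until it falls below the limit or the
        # iteration budget runs out; well-trained batches get none.
        extra = 0
        while loss > limit and extra < max_extra:
            loss, grad = batch_loss_and_grad(w, Xb, yb)
            w -= lr * grad
            extra += 1
```

Because the limit adapts to the recent loss distribution, most batches get the usual single iteration and only outliers trigger extra work, which is how the abstract's "more iterations for under-trained batches" can accelerate training overall.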
