Accelerating deep neural network training with inconsistent stochastic gradient descent
Abstract
Aspects of the present disclosure describe techniques for training a convolutional neural network using an inconsistent stochastic gradient descent (ISGD) algorithm. The training effort devoted to each batch is dynamically adjusted according to that batch's measured loss, which classifies the batch into one of two states: well-trained or under-trained. The ISGD algorithm applies more iterations to under-trained batches while reducing iterations for well-trained ones.
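The batch classification described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name `isgd_step`, the mean-plus-standard-deviation threshold, and the fixed extra-iteration count are all assumptions made for the example, using a toy one-parameter least-squares model.

```python
def isgd_step(params, batch, lr, loss_fn, grad_fn, loss_history,
              extra_iters=3, threshold=1.0):
    """One ISGD-style update: a batch whose loss is well above the
    running average is treated as under-trained and receives extra
    gradient-descent iterations (all details here are illustrative)."""
    loss = loss_fn(params, batch)
    if loss_history:
        mean = sum(loss_history) / len(loss_history)
        var = sum((l - mean) ** 2 for l in loss_history) / len(loss_history)
        std = var ** 0.5
    else:
        mean, std = loss, 0.0
    # Classify the batch: under-trained if its loss exceeds the
    # running mean by more than `threshold` standard deviations.
    under_trained = loss > mean + threshold * std
    iters = 1 + (extra_iters if under_trained else 0)
    for _ in range(iters):
        grads = grad_fn(params, batch)
        params = [p - lr * g for p, g in zip(params, grads)]
    loss_history.append(loss_fn(params, batch))
    return params, under_trained


# Toy model: fit y = w * x by least squares on a single parameter w.
def loss_fn(params, batch):
    w = params[0]
    return sum((w * x - y) ** 2 for x, y in batch) / len(batch)


def grad_fn(params, batch):
    w = params[0]
    return [sum(2 * (w * x - y) * x for x, y in batch) / len(batch)]


if __name__ == "__main__":
    params, history = [0.0], []
    batch = [(1.0, 2.0), (2.0, 4.0)]  # consistent with w = 2
    for _ in range(20):
        params, _ = isgd_step(params, batch, 0.1,
                              loss_fn, grad_fn, history)
    print(params[0])  # converges toward 2.0
```

The design point is that the schedule is decided per batch at run time from the loss signal, rather than giving every batch the same fixed number of iterations.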