International Conference on Artificial Intelligence IC-AI'2000, Vol. 3, Jun 26-29, 2000, Las Vegas, Nevada, USA

Selectively Attentive Learning of Multilayer Perceptron for Fast Speaker Adaptation


Abstract

We propose a selectively attentive learning method to improve the speed of the error backpropagation (EBP) algorithm for the multilayer perceptron. The speed-up is achieved not by decreasing the number of iterations but by lowering the computational cost per iteration. Three attention criteria are employed to determine which set of input patterns, or which portion of the network, should be attended to for effective learning. These criteria are based on the mean square error (MSE) at the output layer and the class-selective relevance of the hidden nodes. The effectiveness of the proposed method is demonstrated on a speaker adaptation task for an isolated word recognition system. The experimental results show that the proposed selective attention technique reduces the learning time by more than 60% on average.
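
The abstract describes the method only at a high level. As a rough illustration (not the authors' code), the sketch below shows one pattern-level attention criterion of the kind described: patterns whose output-layer MSE is already small are skipped during an epoch, so the cost per pass over the data drops as training proceeds. The network sizes, learning rate, threshold, and toy data are assumptions for illustration; the paper additionally attends to hidden nodes via class-selective relevance, which is not reproduced here.

```python
# Minimal sketch of pattern-selective attention in error backpropagation (EBP).
# All hyperparameters and data below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 patterns, 8 inputs, 3 classes (one-hot targets).
X = rng.normal(size=(100, 8))
T = np.eye(3)[rng.integers(0, 3, size=100)]

# One-hidden-layer MLP with sigmoid units.
W1 = rng.normal(scale=0.1, size=(8, 16))
W2 = rng.normal(scale=0.1, size=(16, 3))
lr, mse_threshold = 0.5, 0.01

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(50):
    skipped = 0
    for x, t in zip(X, T):
        h = sigmoid(x @ W1)            # hidden activations
        y = sigmoid(h @ W2)            # output activations
        e = y - t
        mse = np.mean(e ** 2)
        if mse < mse_threshold:        # attention criterion: pattern already learned well
            skipped += 1
            continue                   # skip the backward pass for this pattern
        # Standard EBP weight updates for the attended pattern.
        delta_out = e * y * (1.0 - y)
        delta_hid = (delta_out @ W2.T) * h * (1.0 - h)
        W2 -= lr * np.outer(h, delta_out)
        W1 -= lr * np.outer(x, delta_hid)
    # `skipped` grows as training proceeds; the saved backward passes are
    # where the reduction in per-iteration cost comes from.
```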
