IEEE Transactions on Neural Networks

Beyond Feedforward Models Trained by Backpropagation: A Practical Training Tool for a More Efficient Universal Approximator

Abstract

The cellular simultaneous recurrent neural network (SRN) has been shown to be a more powerful function approximator than the multilayer perceptron (MLP): for some problems the required MLP would be prohibitively large, while an SRN can realize the desired mapping within acceptable computational constraints. The speed of training of complex recurrent networks is crucial to their successful application. This work improves on previous results by training the network with an extended Kalman filter (EKF). We implemented a generic cellular SRN (CSRN) and applied it to two challenging problems: 2-D maze navigation and a subset of the connectedness problem. Compared with earlier results, the speed of convergence improved by several orders of magnitude for maze navigation, and superior generalization was demonstrated for connectedness. The implications of these improvements are discussed.
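The EKF training idea mentioned in the abstract can be sketched in its generic form: the network weights are treated as the filter's state vector and the network output as the measurement. The sketch below is a minimal illustration under assumed choices, not the paper's CSRN implementation — the model `f`, the noise covariances `Q` and `R`, the numerical Jacobian, and the linear toy target are all illustrative assumptions.

```python
import numpy as np

def ekf_train_step(w, P, x, y, f, Q, R):
    """One global-EKF update: weights w are the filter state,
    the network output f(w, x) is the (scalar) measurement."""
    eps = 1e-6
    y_hat = f(w, x)
    # Numerical Jacobian H = d f / d w, shape (1, n)
    H = np.array([[(f(w + eps * e, x) - y_hat) / eps
                   for e in np.eye(len(w))]])
    S = H @ P @ H.T + R            # innovation covariance, shape (1, 1)
    K = P @ H.T / S                # Kalman gain, shape (n, 1)
    w = w + (K * (y - y_hat)).ravel()
    P = P - K @ H @ P + Q          # weight-covariance update
    return w, P

# Toy demo (hypothetical): fit y = 2x + 1 with a linear "network"
# f(w, x) = w0*x + w1, so the EKF reduces to a standard Kalman filter.
rng = np.random.default_rng(0)
f = lambda w, x: w[0] * x + w[1]
w, P = np.zeros(2), 10.0 * np.eye(2)
Q, R = 1e-6 * np.eye(2), 0.01
for _ in range(200):
    x = rng.uniform(-1.0, 1.0)
    w, P = ekf_train_step(w, P, x, 2.0 * x + 1.0, f, Q, R)
print(w)  # close to [2, 1]
```

Because each update is a second-order (covariance-weighted) step rather than a fixed-rate gradient step, EKF training typically converges in far fewer presentations than plain backpropagation, which is the effect the abstract reports for the CSRN.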
