Neural Computation

A Mathematical Analysis of the Effects of Hebbian Learning Rules on the Dynamics and Structure of Discrete-Time Random Recurrent Neural Networks



Abstract

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule including passive forgetting and different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
