Source journal: IEEE Transactions on Neural Networks

Large Memory Capacity in Chaotic Artificial Neural Networks: A View of the Anti-Integrable Limit


Abstract

In the literature, it has been reported that chaotic artificial neural network models with sinusoidal activation functions possess a larger memory capacity and a more reliable ability to retrieve stored patterns than conventional chaotic models with only monotonic activation functions, such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, elucidates the mechanism behind the superiority of models with periodic activation functions, which include sinusoidal functions. In particular, by virtue of the anti-integrable limit technique, this paper shows that any finite-dimensional neural network model with periodic activation functions and properly selected parameters has far more abundant chaotic dynamics, which ultimately determine the model's memory capacity and pattern-retrieval ability. To some extent, this paper demonstrates, both mathematically and numerically, that an appropriate choice of activation functions and control scheme leads to a large memory capacity and better pattern-retrieval ability in artificial neural network models.
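The qualitative contrast the abstract draws can be illustrated with a toy sketch (this is not the paper's actual network model): a single-neuron discrete-time map driven by a periodic activation, x_{n+1} = sin(a·x_n), versus one driven by a monotonic sigmoidal activation, x_{n+1} = tanh(a·x_n). Estimating the largest Lyapunov exponent of each map shows the periodic activation sustaining chaos (positive exponent) while the monotonic one collapses to a stable fixed point (negative exponent). The gain value and function names here are illustrative choices, not parameters from the paper.

```python
import math

def lyapunov(f, df, x0, n_transient=1000, n_iter=20000):
    """Estimate the largest Lyapunov exponent of the scalar map x -> f(x)
    by averaging log|f'(x_n)| along an orbit (standard 1-D estimator)."""
    x = x0
    for _ in range(n_transient):      # discard the transient
        x = f(x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(df(x)) + 1e-300)  # tiny offset guards log(0)
        x = f(x)
    return total / n_iter

GAIN = 3.0  # coupling gain; illustrative value, not taken from the paper

# Periodic (sinusoidal) activation: x_{n+1} = sin(GAIN * x_n)
lam_sin = lyapunov(lambda x: math.sin(GAIN * x),
                   lambda x: GAIN * math.cos(GAIN * x), 0.1)

# Monotonic (sigmoid-like) activation: x_{n+1} = tanh(GAIN * x_n)
lam_tanh = lyapunov(lambda x: math.tanh(GAIN * x),
                    lambda x: GAIN * (1.0 - math.tanh(GAIN * x) ** 2), 0.1)

print(f"sinusoidal activation: lambda = {lam_sin:+.3f}")
print(f"sigmoidal  activation: lambda = {lam_tanh:+.3f}")
```

With the same gain, only the sinusoidal map keeps a positive exponent; the tanh orbit is attracted to a fixed point near ±1, where the local slope is small. This mirrors, in miniature, the abstract's point that periodic activations admit far richer chaotic dynamics than monotonic ones at comparable parameter settings.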
