Journal: Natural Computing

Finite memory loading in hairy neurons


Abstract

This paper presents a method to expand the basins of attraction of stable patterns in associative memory. It examines fully connected associative memory geometrically and translates the learning process into an algebraic optimization procedure. It finds that locating all the patterns at certain stable corners of the neurons' hypercube, as far from the decision hyperplanes as possible, produces excellent error tolerance. It then devises a method based on this finding to construct such hyperplanes. The paper further shows that this method leads to the hairy model, the deterministic analogue of the Gibbs free energy model. Simulations show that the method gives better error tolerance than the Hopfield model and the error-correction rule in both synchronous and asynchronous modes.
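The geometric idea in the abstract, storing each pattern at a hypercube corner that lies as far as possible from every neuron's decision hyperplane, can be illustrated with a generic margin-driven training loop. The sketch below is only an illustration under that reading; it is not the paper's optimization procedure or the hairy model itself, and the names `train_margin` and `recall`, the margin threshold, and the learning rate are all invented for the example.

```python
import numpy as np


def train_margin(patterns, margin=1.0, lr=0.1, epochs=100):
    """Store bipolar (+/-1) patterns as stable corners of the hypercube,
    reinforcing each one until its normalized distance from every neuron's
    decision hyperplane exceeds `margin` (perceptron-with-margin sketch)."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        updated = False
        for x in patterns:
            fields = W @ x                        # local fields h_i = w_i . x
            for i in range(N):
                # x_i * h_i / ||w_i|| is the signed distance of the pattern
                # from neuron i's hyperplane; push it outward while too small
                if x[i] * fields[i] <= margin * np.linalg.norm(W[i]) + 1e-12:
                    W[i] += lr * x[i] * x         # Hebbian-style correction
                    W[i, i] = 0.0                 # keep zero self-coupling
                    updated = True
        if not updated:                           # every margin satisfied
            break
    return W


def recall(W, probe, steps=50, rng=None):
    """Asynchronous recall: update one unit at a time until a fixed point."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = probe.copy()
    for _ in range(steps):
        changed = False
        for i in rng.permutation(len(x)):
            s = np.sign(W[i] @ x)
            if s != 0 and s != x[i]:
                x[i] = s
                changed = True
        if not changed:
            break
    return x


# Tiny demonstration: store three random patterns, corrupt one, try to recover it.
rng = np.random.default_rng(1)
patterns = rng.choice([-1.0, 1.0], size=(3, 64))
W = train_margin(patterns)
noisy = patterns[0].copy()
noisy[:10] *= -1                                  # flip 10 of the 64 bits
recovered = recall(W, noisy)
print(int((recovered == patterns[0]).sum()), "of 64 bits match after recall")
```

In this sketch the quantity x_i (w_i . x) / ||w_i|| plays the role of the pattern's distance from neuron i's hyperplane, and training continues until that distance exceeds the chosen margin for every stored pattern, which is what enlarges the basins around the stored corners.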