Journal: IEEE Transactions on Neural Networks and Learning Systems

On the Working Principle of the Hopfield Neural Networks and Its Equivalence to the GADIA in Optimization



Abstract

Hopfield neural networks (HNNs) are among the most well-known and widely used neural networks in optimization. In this article, the author focuses on building a deeper understanding of the working principle of the HNN during an optimization process. Our investigations yield several novel results that give important insights into the working principle of both continuous and discrete HNNs. This article shows that what the traditional HNN actually does as its energy function decreases is to divide the neurons into two classes in such a way that the sum of biased class volumes is minimized (or maximized), regardless of the type of optimization problem. By introducing neuron-specific class labels, the author concludes that the traditional discrete HNN is actually a special case of the greedy asynchronous distributed interference avoidance algorithm (GADIA) [17] of Babadi and Tarokh for 2-class optimization problems. The computer results confirm the findings.
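The energy-descent behavior discussed in the abstract can be illustrated with a minimal sketch (not the paper's code): an asynchronous discrete Hopfield update over a symmetric weight matrix with zero diagonal, where each greedy single-neuron flip cannot increase the energy E(s) = -1/2 s^T W s - b^T s. The weights, biases, and function names below are illustrative assumptions made for this sketch.

```python
import numpy as np

def energy(W, b, s):
    # Classical Hopfield energy: E(s) = -1/2 s^T W s - b^T s
    return -0.5 * s @ W @ s - b @ s

def hopfield_descent(W, b, s, max_sweeps=100, rng=None):
    # Asynchronous (one neuron at a time) greedy sign updates.
    # With symmetric W and zero diagonal, no single flip increases E.
    rng = np.random.default_rng() if rng is None else rng
    n = len(s)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):
            h_i = W[i] @ s + b[i]              # local field of neuron i
            new_state = 1 if h_i >= 0 else -1  # greedy update to sign(h_i)
            if new_state != s[i]:
                s[i] = new_state
                changed = True
        if not changed:                        # fixed point: local minimum of E
            break
    return s

# Toy example with random symmetric weights (illustrative only).
rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.standard_normal(n)
s = rng.choice([-1, 1], size=n).astype(float)

print("E before:", energy(W, b, s))
s = hopfield_descent(W, b, s, rng=rng)
print("E after :", energy(W, b, s))
```

In the abstract's terms, each greedy flip reassigns one neuron between the two classes, which is the sense in which the discrete HNN corresponds to the 2-class case of GADIA.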
