IEEE Transactions on Information Theory

On the number of memories that can be perfectly stored in a neural net with Hebb weights


Abstract

Let $(w_{ij})$ be the weights of the connections of a neural network with $n$ nodes, calculated from $m$ data vectors $v^1, \ldots, v^m$ in $\{1,-1\}^n$ according to the Hebb rule. The author proves that if $m$ is not too large relative to $n$ and the $v^k$ are random, then the $w_{ij}$ constitute, with high probability, a perfect representation of the $v^k$, in the sense that the $v^k$ are completely determined by the $w_{ij}$ up to their sign. The conditions under which this is established turn out to be less restrictive than those under which it has been shown that the $v^k$ can actually be recovered by letting the network evolve until equilibrium is attained. In the specific case where the entries of the $v^k$ are independent and equal to 1 or $-1$ with probability 1/2, the condition on $m$ is that $m$ should not exceed $n/(0.7 \log n)$.
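The following NumPy sketch (an illustration added here, not taken from the paper) shows the Hebb-rule construction the abstract refers to: $w_{ij} = \sum_{k=1}^m v_i^k v_j^k$ computed from $m$ random patterns in $\{1,-1\}^n$, with $m$ chosen near the stated bound $n/(0.7 \log n)$. The fixed-point check implements the stricter recovery-by-dynamics condition that the abstract contrasts with the representation result; the zero diagonal and the choice $n = 500$ are assumptions of this sketch.

```python
import numpy as np

def hebb_weights(V):
    # Hebb rule: w_ij = sum_k v_i^k v_j^k, with w_ii set to 0 (a common convention, assumed here).
    W = V.T @ V
    np.fill_diagonal(W, 0)
    return W

def recovered_by_dynamics(W, v):
    # The stricter condition mentioned in the abstract: v is a fixed point of the
    # threshold update sign(W v) (ties at zero broken toward +1 for simplicity).
    return np.array_equal(np.where(W @ v >= 0, 1, -1), v)

rng = np.random.default_rng(0)
n = 500                                 # assumed network size for this demonstration
m = int(n / (0.7 * np.log(n)))          # the abstract's bound on the number of memories
V = rng.choice([-1, 1], size=(m, n))    # m random patterns, entries +1 or -1 with probability 1/2
W = hebb_weights(V)
stable = sum(recovered_by_dynamics(W, v) for v in V)
print(f"{stable} of {m} stored patterns are fixed points of the network dynamics")
```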