IEEE Transactions on Neural Networks and Learning Systems

Pseudo-Orthogonalization of Memory Patterns for Associative Memory



Abstract

A new method for improving the storage capacity of associative memory models on a neural network is proposed. For random patterns, the storage capacity of the network grows in proportion to the network size, but in general the capacity suffers from correlation among memory patterns. Numerous solutions to this problem have been proposed, but their high computational cost limits their scalability. In this paper, we propose a novel and simple solution that is locally computable without any iteration. Our method masks each original memory pattern with a random pattern via XNOR, and concatenates the masked pattern with its mask. The resulting decorrelated patterns allow higher storage capacity at the cost of increased pattern length. This increase can be reduced through blockwise masking, at the cost of a small loss in capacity. Movie replay and image recognition are presented as examples to demonstrate the scalability of the proposed method.
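The masking step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes bipolar (±1) patterns, under which XNOR masking reduces to elementwise multiplication, and it concatenates each masked pattern with its own random mask to form the decorrelated pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

def pseudo_orthogonalize(patterns, rng):
    """Mask each bipolar (+/-1) pattern with its own random +/-1 mask,
    then concatenate (masked pattern, mask).  In +/-1 coding, XNOR of
    two bits is simply their product."""
    masks = rng.choice([-1, 1], size=patterns.shape)
    masked = patterns * masks               # XNOR masking
    return np.concatenate([masked, masks], axis=1)

# Two strongly correlated 1000-bit patterns: p2 differs from p1
# in exactly 100 positions, so their overlap is (900 - 100)/1000 = 0.8.
n = 1000
p1 = rng.choice([-1, 1], size=n)
p2 = p1.copy()
flip = rng.choice(n, size=n // 10, replace=False)
p2[flip] *= -1

P = np.stack([p1, p2])
Q = pseudo_orthogonalize(P, rng)

print(np.dot(p1, p2) / n)               # 0.8: strong correlation
print(np.dot(Q[0], Q[1]) / Q.shape[1])  # near 0: decorrelated
```

Because the two masks are independent random patterns, each term of the inner product between the transformed patterns is an independent ±1 variable, so the correlation concentrates around zero; the price is that the pattern length doubles, which is what the paper's blockwise masking mitigates.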
