Frontiers in Neuroscience

A Soft-Pruning Method Applied During Training of Spiking Neural Networks for In-memory Computing Applications

Abstract

Inspired by the computational efficiency of the biological brain, spiking neural networks (SNNs) emulate biological neural networks, neural codes, dynamics, and circuitry. SNNs show great potential for implementing unsupervised learning with in-memory computing. Here, we report an algorithmic optimization that improves the energy efficiency of online learning with SNNs on emerging non-volatile memory (eNVM) devices. We develop a pruning method for SNNs that exploits the output firing characteristics of neurons. Unlike previous approaches in the literature, which prune already-trained networks, our method can be applied during network training, preventing unnecessary updates of network parameters. This algorithmic optimization complements the energy efficiency of eNVM technology, which offers a unique in-memory computing platform for parallelizing neural network operations. Our SNN maintains ~90% classification accuracy on the MNIST dataset with up to ~75% pruning, significantly reducing the number of weight updates. The SNN and pruning scheme developed in this work can pave the way for eNVM-based neuro-inspired systems that perform energy-efficient online learning in low-power applications.
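
The abstract describes the mechanism only at a high level: each neuron's cumulative output firing activity determines whose weight updates can be skipped during training. The NumPy sketch below is a minimal illustration of that idea under stated assumptions; the Hebbian-style update rule, the pruning schedule, and all names (train_step, soft_prune, PRUNE_FRACTION, LEARNING_RATE) are hypothetical stand-ins, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical network dimensions (e.g. MNIST pixels -> output neurons).
n_inputs, n_neurons = 784, 100
weights = rng.random((n_inputs, n_neurons))

firing_counts = np.zeros(n_neurons)         # cumulative output spikes per neuron
trainable = np.ones(n_neurons, dtype=bool)  # False once a neuron is soft-pruned

PRUNE_FRACTION = 0.75  # abstract reports ~90% accuracy with up to ~75% pruning
LEARNING_RATE = 0.01   # assumed value, not from the paper

def train_step(input_rates, output_spikes):
    """One stand-in Hebbian-style update; soft-pruned neurons are skipped,
    so their weights receive no further (unnecessary) updates."""
    firing_counts[:] += output_spikes
    weights[:, trainable] += LEARNING_RATE * np.outer(
        input_rates, output_spikes[trainable])

def soft_prune():
    """Freeze the least-active neurons. 'Soft' pruning: their weights are
    kept for inference; only their updates stop."""
    k = int(PRUNE_FRACTION * n_neurons)
    trainable[np.argsort(firing_counts)[:k]] = False

# Toy training loop on synthetic spike data.
for step in range(1000):
    x = rng.random(n_inputs)                         # input firing rates
    y = (rng.random(n_neurons) < 0.1).astype(float)  # toy output spikes
    train_step(x, y)
    if step == 500:  # prune partway through training (schedule assumed)
        soft_prune()
```

Because the pruning is "soft", frozen weights still participate in inference; the saving comes from skipped updates, which on eNVM hardware would plausibly correspond to skipped device-programming operations.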