Journal of Low Power Electronics

Efficient and Low Overhead Memristive Activation Circuit for Deep Learning Neural Networks



Abstract

An efficient activation circuit based on a memristor MIN function is presented for memristive neuromorphic systems, using only two memristors and a comparator. The circuit approximates the ReLU activation function. Due to its simplicity and effectiveness in deep neural networks, ReLU significantly reduces the time and computational cost of training in neuromorphic systems. A multilayer neural network is simulated using this activation circuit together with traditional memristor crossbar arrays. The results show that the proposed circuit performs training effectively, with significant savings in time and area in memristor-crossbar-based neural networks.
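To make the MIN-based approximation concrete: the identity ReLU(x) = max(x, 0) = x − min(x, 0) shows how a MIN operation plus a comparator-style selection can realize ReLU. The Python sketch below is an illustrative behavioral model only, not the authors' circuit; the function names, layer sizes, and the differential-conductance weight scheme are assumptions for illustration.

```python
import numpy as np

def min_based_relu(x):
    # ReLU(x) = x - min(x, 0): a MIN operation followed by a subtraction,
    # analogous to a two-memristor MIN stage feeding a comparator
    # (illustrative identity only, not the paper's circuit).
    return x - np.minimum(x, 0.0)

def crossbar_mvm(G_pos, G_neg, v_in):
    # Crossbar-style matrix-vector multiply: each column current is the dot
    # product of input voltages with column conductances; signed weights are
    # modeled as the difference of two positive conductance arrays (a common
    # assumption, not specified by the abstract).
    return v_in @ (G_pos - G_neg)

# Hypothetical 2-layer network: 4 inputs -> 8 hidden -> 3 outputs.
rng = np.random.default_rng(0)
G1p, G1n = rng.uniform(0, 1, (4, 8)), rng.uniform(0, 1, (4, 8))
G2p, G2n = rng.uniform(0, 1, (8, 3)), rng.uniform(0, 1, (8, 3))

v = rng.uniform(-1, 1, 4)                       # input voltages
h = min_based_relu(crossbar_mvm(G1p, G1n, v))   # hidden layer with ReLU
y = crossbar_mvm(G2p, G2n, h)                   # output currents
print(y)
```

The identity is exact: for x > 0, min(x, 0) = 0 and the output is x; for x ≤ 0, min(x, 0) = x and the output is 0.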


