IEEE International Conference on Artificial Intelligence and Industrial Design

Improved Model Compression Method Based on Information Entropy



Abstract

The rapid development of deep learning has led to increasingly complex neural network models that demand high computing power. Even though researchers have proposed various lightweight network models such as MobileNet, SqueezeNet and ShuffleNet, the amount of computation remains large. Model compression is an effective means to further reduce the number of model parameters and the computational cost. Among compression techniques, channel pruning is the most direct and effective way to accelerate model computation and reduce model parameters. However, because it is a radical approach, the effect of pruning depends on the criterion used to judge channel importance, and accuracy cannot be guaranteed. Furthermore, when filters whose values fall below a set threshold are deleted outright, important parameters may be discarded. Therefore, this article proposes a channel pruning model compression method based on information entropy. The actual tests give convincing experimental results, which prove the effectiveness and practicability of the method.
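The abstract does not spell out the exact entropy criterion. The sketch below illustrates one plausible reading in PyTorch: each channel's importance is the Shannon entropy of its activation distribution over a calibration batch, and the lowest-entropy channels are masked out instead of being deleted by a raw magnitude threshold. All names (channel_entropy, prune_mask, NUM_BINS) and the histogram-based entropy estimate are illustrative assumptions, not the paper's implementation.

```python
# A minimal sketch of entropy-based channel importance, assuming channels are
# ranked by the Shannon entropy of their activations on a calibration batch.
import torch

NUM_BINS = 32  # histogram resolution for the entropy estimate (assumed)

def channel_entropy(feats: torch.Tensor) -> torch.Tensor:
    """Shannon entropy per channel of a (N, C, H, W) activation tensor."""
    n, c, h, w = feats.shape
    flat = feats.permute(1, 0, 2, 3).reshape(c, -1)  # (C, N*H*W)
    ent = torch.empty(c)
    for i in range(c):
        hist = torch.histc(flat[i], bins=NUM_BINS)
        p = hist / hist.sum()
        p = p[p > 0]                        # drop empty bins: 0*log(0) := 0
        ent[i] = -(p * p.log()).sum()
    return ent

def prune_mask(feats: torch.Tensor, keep_ratio: float = 0.7) -> torch.Tensor:
    """Boolean mask that keeps the highest-entropy channels."""
    ent = channel_entropy(feats)
    k = max(1, int(keep_ratio * ent.numel()))
    kept = ent.topk(k).indices
    mask = torch.zeros_like(ent, dtype=torch.bool)
    mask[kept] = True
    return mask

# Example: score the output of one conv layer on stand-in calibration data.
feats = torch.relu(torch.randn(8, 16, 14, 14))
print(prune_mask(feats, keep_ratio=0.5))
```

In this reading, a near-constant feature map carries little information (low entropy) and is a candidate for removal, which is one way to avoid the abstract's concern that a plain magnitude threshold can discard important parameters.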
