International Joint Conference on Neural Networks

Reducing SqueezeNet Storage Size with Depthwise Separable Convolutions



Abstract

Current research in the field of convolutional neural networks usually focuses on improving network accuracy, regardless of network size and inference time. In this paper, we investigate the effects of storage space reduction in SqueezeNet as it relates to inference time when processing single test samples. To reduce storage space, we propose adjusting SqueezeNet's Fire modules to use Depthwise Separable Convolutions (DSC). The resulting network, referred to as SqueezeNet-DSC, is compared against several convolutional neural networks: MobileNet, AlexNet, VGG19, and the original SqueezeNet itself. When analyzing the models, we consider accuracy, the number of parameters, parameter storage size, and processing time of a single test sample on the CIFAR-10 and CIFAR-100 datasets. SqueezeNet-DSC exhibited a considerable size reduction (37% of the size of SqueezeNet), at the cost of an accuracy loss of 1.07% on CIFAR-10 and 3.06% top-1 on CIFAR-100.
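The storage savings the abstract describes come from the parameter count of a depthwise separable convolution: a depthwise k×k convolution (one filter per input channel) followed by a 1×1 pointwise convolution, instead of a dense k×k convolution over all channel pairs. A minimal sketch of the arithmetic, using hypothetical channel sizes rather than the paper's exact Fire-module configuration:

```python
# Parameter-count comparison: standard convolution vs. depthwise separable
# convolution (DSC). Illustrative only; the layer sizes below are assumptions,
# not the Fire-module dimensions used in the paper.

def std_conv_params(c_in: int, c_out: int, k: int) -> int:
    """Weights in a standard k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

def dsc_params(c_in: int, c_out: int, k: int) -> int:
    """Depthwise k x k convolution (one k x k filter per input channel)
    followed by a 1 x 1 pointwise convolution mapping c_in -> c_out."""
    return k * k * c_in + c_in * c_out

if __name__ == "__main__":
    c_in, c_out, k = 64, 256, 3  # hypothetical layer sizes
    std = std_conv_params(c_in, c_out, k)  # 9 * 64 * 256 = 147456
    dsc = dsc_params(c_in, c_out, k)       # 9 * 64 + 64 * 256 = 16960
    print(f"standard: {std}, DSC: {dsc}, ratio: {dsc / std:.3f}")
```

The reduction factor is roughly 1/c_out + 1/k², so with 3×3 kernels the DSC layer needs a bit more than one ninth of the weights, which is consistent with the order of size reduction reported above.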
