Learning Internal Dense But External Sparse Structures of Deep Convolutional Neural Network



Abstract

Recent years have witnessed two seemingly opposite developments of deep convolutional neural networks (CNNs). On the one hand, increasing the density of CNNs (e.g., by adding cross-layer connections) achieves better performance on basic computer vision tasks. On the other hand, creating sparse structures (e.g., through pruning methods) yields a slimmer network. Inspired by modularity structures in the human brain, we bridge these two trends by proposing a new network structure with internally dense yet externally sparse connections. Experimental results demonstrate that our new structure obtains competitive performance on benchmark tasks (CIFAR10, CIFAR100, and ImageNet) while keeping the network structure slim.
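The abstract's central idea, dense connectivity inside each module combined with sparse wiring between modules, can be pictured with a short sketch. The following is a minimal, hypothetical PyTorch illustration and not the authors' implementation: each module is a DenseNet-style block whose layers are fully cross-connected internally, while connections between modules follow an explicit sparse adjacency list instead of full cross-layer wiring. The class names, the adjacency format, and all hyperparameters are assumptions made for illustration.

```python
# Illustrative sketch only: internally dense modules (DenseNet-style blocks)
# wired together by a sparse external adjacency list. Names and settings are
# hypothetical, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseBlock(nn.Module):
    """Internally dense: layer i consumes the concatenation of the block
    input and the outputs of layers 0..i-1."""

    def __init__(self, in_channels, growth_rate=12, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3,
                          padding=1, bias=False),
            ))
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)


class SparseModularNet(nn.Module):
    """Externally sparse: block j only receives the outputs listed in
    adjacency[j] (index 0 is the stem, index i + 1 is block i)."""

    def __init__(self, adjacency, width=64, in_channels=3, num_classes=10):
        super().__init__()
        self.adjacency = adjacency
        self.stem = nn.Conv2d(in_channels, width, kernel_size=3,
                              padding=1, bias=False)
        self.blocks = nn.ModuleList()
        self.projections = nn.ModuleList()
        for _ in adjacency:
            block = DenseBlock(width)
            self.blocks.append(block)
            # 1x1 conv maps the block's concatenated output back to `width`
            self.projections.append(
                nn.Conv2d(block.out_channels, width, kernel_size=1, bias=False))
        self.head = nn.Linear(width, num_classes)

    def forward(self, x):
        outputs = [self.stem(x)]
        for block, proj, preds in zip(self.blocks, self.projections,
                                      self.adjacency):
            # Sparse external wiring: sum only the selected predecessor outputs.
            block_input = sum(outputs[i] for i in preds)
            outputs.append(proj(block(block_input)))
        pooled = F.adaptive_avg_pool2d(outputs[-1], 1).flatten(1)
        return self.head(pooled)


# Example: four modules, each fed by at most two earlier outputs.
net = SparseModularNet(adjacency=[[0], [1], [1, 2], [3]])
logits = net(torch.randn(2, 3, 32, 32))  # CIFAR-sized input
print(logits.shape)  # torch.Size([2, 10])
```

In a full system the entries of `adjacency` would be chosen by pruning or architecture search rather than fixed by hand; it is hard-coded here only to keep the sketch short.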
