IEEE Conference on Computer Vision and Pattern Recognition Workshops

Tensor Contraction Layers for Parsimonious Deep Nets



Abstract

Tensors offer a natural representation for many kinds of data frequently encountered in machine learning. Images, for example, are naturally represented as third-order tensors, where the modes correspond to height, width, and channels. In particular, tensor decompositions are noted for their ability to discover multi-dimensional dependencies and to produce compact low-rank approximations of data. In this paper, we explore the use of tensor contractions as neural network layers and investigate several ways to apply them to activation tensors. Specifically, we propose the Tensor Contraction Layer (TCL), the first attempt to incorporate tensor contractions as end-to-end trainable neural network layers. Applied to existing networks, TCLs reduce the dimensionality of the activation tensors and thus the number of model parameters, and the resulting models remain trainable end-to-end. We evaluate the TCL on the task of image recognition, augmenting popular networks (AlexNet, VGG) on the CIFAR100 and ImageNet datasets, and study the effect of parameter reduction via tensor contraction on performance. We demonstrate significant model compression without significant impact on accuracy and, in some cases, improved performance.
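To make the idea concrete, here is a minimal numpy sketch of a tensor contraction layer as described in the abstract: each non-batch mode of the activation tensor is contracted with a small factor matrix, shrinking that mode's dimension. The class and function names, shapes, and initialization scheme below are illustrative assumptions, not the authors' implementation (which would use trainable weights in a deep learning framework).

```python
import numpy as np

def mode_n_product(tensor, matrix, mode):
    """Contract `matrix` (out_dim x in_dim) with `tensor` along axis `mode`."""
    result = np.tensordot(tensor, matrix, axes=(mode, 1))
    # tensordot appends the new mode as the last axis; move it back into place.
    return np.moveaxis(result, -1, mode)

class TensorContractionLayer:
    """Illustrative TCL: one factor matrix per non-batch mode shrinks the
    activation tensor (batch, H, W, C) -> (batch, H', W', C')."""
    def __init__(self, input_shape, output_shape, seed=0):
        rng = np.random.default_rng(seed)
        # Hypothetical random init; in practice these are learned end-to-end.
        self.factors = [rng.standard_normal((out_d, in_d)) * 0.01
                        for in_d, out_d in zip(input_shape, output_shape)]

    def forward(self, x):
        # x has shape (batch, *input_shape); contract modes 1..N in turn.
        for mode, factor in enumerate(self.factors, start=1):
            x = mode_n_product(x, factor, mode)
        return x

# Example: compress 32x32x64 activations to 16x16x32.
tcl = TensorContractionLayer((32, 32, 64), (16, 16, 32))
x = np.ones((8, 32, 32, 64))
y = tcl.forward(x)
print(y.shape)  # (8, 16, 16, 32)
```

Note the parameter economy this buys: the three factor matrices hold 16·32 + 16·32 + 32·64 = 3,072 weights, whereas a fully connected layer mapping the flattened 32·32·64 input to the 16·16·32 output would need over 500 million.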
