IEEE Transactions on Information Theory

Low-Tubal-Rank Tensor Completion Using Alternating Minimization


Abstract

The low-tubal-rank tensor model has been recently proposed for real-world multidimensional data. In this paper, we study the low-tubal-rank tensor completion problem, i.e., to recover a third-order tensor by observing a subset of its elements selected uniformly at random. We propose a fast iterative algorithm, called Tubal-AltMin, that is inspired by a similar approach for low-rank matrix completion. The unknown low-tubal-rank tensor is represented as the product of two much smaller tensors, with the low-tubal-rank property automatically incorporated, and Tubal-AltMin alternates between estimating those two tensors using tensor least squares minimization. First, we note that tensor least squares minimization is different from its matrix counterpart and nontrivial, because the circular convolution operator of the low-tubal-rank tensor model is intertwined with the sub-sampling operator. Second, the theoretical performance guarantee is challenging since Tubal-AltMin is iterative and nonconvex. We prove that 1) Tubal-AltMin generates a best rank-$r$ approximation up to any predefined accuracy $\epsilon$ at an exponential rate, and 2) for an $n \times n \times k$ tensor $M$ with tubal-rank $r \ll n$, the required sampling complexity is $O\left(nr^2k\|M\|_F^2\log^3 n \,/\, \bar{\sigma}_{rk}^2\right)$, where $\bar{\sigma}_{rk}$ is the $rk$-th singular value of the block-diagonal matrix representation of $M$ in the frequency domain, and the computational complexity is $O\left(n^2r^2k^3\log n\log(n/\epsilon)\right)$. Finally, on both synthetic data and real-world video data, evaluation results show that, compared with tensor nuclear-norm minimization using the alternating direction method of multipliers (TNN-ADMM), Tubal-AltMin-Simple (a simplified implementation of Tubal-AltMin) improves the recovery error by several orders of magnitude. In experiments, Tubal-AltMin-Simple is faster than TNN-ADMM by a factor of 5 for a $200 \times 200 \times 20$ tensor.
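The tubal-rank model underlying the abstract is built on the t-product: circular convolution along tensor tubes, which becomes an independent matrix product per frontal face after an FFT along the third dimension — this is the "block-diagonal matrix representation in the frequency domain" that the sampling bound refers to. A minimal NumPy sketch of the t-product and the resulting frequency-domain rank bound (function names are ours for illustration, not from the paper):

```python
import numpy as np

def tprod(X, Y):
    """t-product Z = X * Y of an n1 x n2 x k and an n2 x n3 x k tensor:
    circular convolution along tubes equals facewise matrix products
    after an FFT along the third (tube) axis."""
    Xf = np.fft.fft(X, axis=2)
    Yf = np.fft.fft(Y, axis=2)
    Zf = np.einsum('isk,sjk->ijk', Xf, Yf)  # matrix product per frequency
    return np.real(np.fft.ifft(Zf, axis=2))

# A tensor of tubal-rank <= r factors as the t-product of an n x r x k
# and an r x n x k tensor; every frontal face of its FFT then has matrix
# rank <= r, which is what the frequency-domain block-diagonal
# representation in the sampling bound captures.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2, 3))
Y = rng.standard_normal((2, 4, 3))
Z = tprod(X, Y)
Zf = np.fft.fft(Z, axis=2)
print([np.linalg.matrix_rank(Zf[:, :, i]) for i in range(3)])  # each rank <= 2
```

The FFT route is also why the algorithm's complexity carries $\log n$-type factors rather than the cost of explicit circular convolutions.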
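The abstract stresses that the tensor least squares step is nontrivial because circular convolution is intertwined with sub-sampling: each observed entry couples all $rk$ tube unknowns of the corresponding lateral slice of the factor being estimated. A hedged sketch of that half-step — fixing the left factor and fitting the right factor over the observed entries only, one lateral slice at a time (names and structure are ours; this illustrates the idea, not the paper's exact implementation):

```python
import numpy as np

def tprod(X, Y):
    """t-product via FFT along the tube axis (see the low-tubal-rank model)."""
    Zf = np.einsum('isk,sjk->ijk', np.fft.fft(X, axis=2), np.fft.fft(Y, axis=2))
    return np.real(np.fft.ifft(Zf, axis=2))

def ls_step(X, M, mask):
    """One tensor least squares half-step of an alternating scheme: given
    the left factor X (n x r x k), fit the right factor Y (r x m x k) so
    that (X * Y) matches M on the observed entries (mask == True).

    Because the t-product circularly convolves tubes, each observation
        M[i, j, l] = sum_{s,t} X[i, s, t] * Y[s, j, (l - t) % k]
    couples all r*k unknowns of lateral slice Y[:, j, :], so we solve one
    dense least squares system of size (#observations) x (r*k) per slice.
    """
    n, r, k = X.shape
    _, m, _ = M.shape
    Y = np.zeros((r, m, k))
    for j in range(m):
        rows, rhs = [], []
        for i in range(n):
            for l in range(k):
                if mask[i, j, l]:
                    row = np.zeros((r, k))
                    for t in range(k):
                        row[:, (l - t) % k] += X[i, :, t]
                    rows.append(row.ravel())
                    rhs.append(M[i, j, l])
        coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        Y[:, j, :] = coef.reshape(r, k)
    return Y

# Demo: with the true left factor and ~90% of entries observed, the fitted
# right factor reproduces M on the observed entries up to numerical noise.
rng = np.random.default_rng(1)
n, r, k = 6, 2, 3
X0 = rng.standard_normal((n, r, k))
M = tprod(X0, rng.standard_normal((r, n, k)))
mask = rng.random((n, n, k)) < 0.9
Yhat = ls_step(X0, M, mask)
print(np.max(np.abs((tprod(X0, Yhat) - M)[mask])))  # ~ 0
```

The full algorithm would alternate this step with the symmetric one for the left factor (via the tensor transpose), starting from a suitable initialization; the paper's guarantees concern that full iteration, not this single step.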
