IEEE Transactions on Neural Networks and Learning Systems

Efficient Kernel Sparse Coding Via First-Order Smooth Optimization


Abstract

We consider the problem of dictionary learning and sparse coding, where the task is to find a concise set of basis vectors that accurately represent the observation data with only a small number of active bases. Typically formulated as an L1-regularized least-squares problem, sparse coding incurs computational difficulty originating from its nondifferentiable objective. Recent approaches have therefore focused mainly on accelerating the learning algorithm. In this paper, we propose an even more efficient and scalable sparse coding algorithm based on a first-order smooth optimization technique. The algorithm finds theoretically guaranteed optimal sparse codes for the epsilon-approximate problem through a series of optimization subproblems, each of which admits an analytic solution, making it very fast and scalable to large-scale data. We further extend it to nonlinear sparse coding via the kernel trick by showing that the representer theorem holds for the kernel sparse coding problem. This allows us to apply dual optimization, which essentially yields the same linear sparse coding problem in the dual variables; this is highly beneficial compared with existing methods that suffer from local minima and restricted forms of the kernel function. The efficiency of our algorithms is demonstrated on natural stimuli data sets and several image classification problems.
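The abstract's central idea, replacing the nondifferentiable L1 term with a smooth epsilon-approximation so that fast first-order methods with analytic per-step updates apply, can be sketched as follows. This is a hypothetical illustration using Huber-type smoothing and plain gradient descent; the function name, dictionary dimensions, and the parameters `lam` and `mu` are assumptions for the sketch, not the paper's exact subproblem updates:

```python
import numpy as np

def smoothed_l1_sparse_code(D, x, lam=0.05, mu=1e-2, steps=1000):
    """Approximately solve min_c 0.5*||x - D c||^2 + lam*||c||_1 by
    replacing each |c_i| with a mu-smoothed (Huber-type) surrogate,
    so the objective becomes differentiable and gradient descent applies.
    Illustrative sketch of first-order smooth optimization only;
    not the paper's exact algorithm."""
    c = np.zeros(D.shape[1])
    # Lipschitz constant of the smoothed objective's gradient:
    # ||D^T D||_2 from the data term plus lam/mu from the smoothed L1 term.
    L = np.linalg.norm(D.T @ D, 2) + lam / mu
    for _ in range(steps):
        g = D.T @ (D @ c - x)                  # gradient of the data term
        g += lam * np.clip(c / mu, -1.0, 1.0)  # gradient of smoothed |c_i|
        c -= g / L                             # fixed step size 1/L
    return c

# Usage: recover a 2-sparse code over a random unit-norm dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)     # unit-norm dictionary atoms
c_true = np.zeros(50)
c_true[[3, 17]] = [1.0, -0.5]
x = D @ c_true
c_hat = smoothed_l1_sparse_code(D, x)
```

Because the smoothed objective is differentiable with an L-Lipschitz gradient, each fixed-size step decreases it monotonically; the paper's approach additionally obtains analytic subproblem solutions, giving the claimed speed and scalability.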
