Annual Conference on Learning Theory

Reducing Kernel Matrix Diagonal Dominance Using Semi-definite Programming



Abstract

Kernel-based learning methods revolve around the notion of a kernel or Gram matrix between data points. These square, symmetric, positive semi-definite matrices can informally be regarded as encoding pairwise similarity between all of the objects in a data set. In this paper we propose an algorithm for manipulating the diagonal entries of a kernel matrix using semi-definite programming. Kernel matrix diagonal dominance reduction attempts to deal with the problem of learning with almost orthogonal features, a phenomenon commonplace in kernel matrices derived from string kernels or from Gaussian kernels with a small width parameter. We show how this task can be formulated as a semi-definite programming optimization problem that can be solved with readily available optimizers. On the theoretical side, we give an analysis using Rademacher-based bounds that provides an alternative motivation for the 1-norm SVM, derived from kernel diagonal reduction. We assess the performance of the algorithm on standard data sets, with encouraging results in terms of approximation and prediction.
