Published in: IEEE Transactions on Neural Networks and Learning Systems

Efficient Dual Approach to Distance Metric Learning


Abstract

Distance metric learning is of fundamental interest in machine learning because the employed distance metric can significantly affect the performance of many learning methods. Quadratic Mahalanobis metric learning is a popular approach to the problem, but it typically requires solving a semidefinite programming (SDP) problem, which is computationally expensive. The worst-case complexity of solving an SDP involving a $D \times D$ matrix variable with $O(D)$ linear constraints is about $O(D^{6.5})$ using interior-point methods, where $D$ is the dimension of the input data. Thus, interior-point methods are practical only for problems with fewer than a few thousand variables. Because the number of variables is $D(D+1)/2$, this limits the problems that can practically be solved to around a few hundred dimensions. The complexity of the popular quadratic Mahalanobis metric learning approach thus limits the size of problem to which metric learning can be applied. Here, we propose a significantly more efficient and scalable approach to the metric learning problem, based on the Lagrange dual formulation of the problem. The proposed formulation is much simpler to implement, and therefore allows much larger Mahalanobis metric learning problems to be solved. The time complexity of the proposed method is roughly $O(D^{3})$, which is significantly lower than that of the SDP approach. Experiments on a variety of data sets demonstrate that the proposed method achieves an accuracy comparable with the state of the art, but is applicable to significantly larger problems. We also show that the proposed method can be applied to approximately solve more general Frobenius-norm-regularized SDP problems.
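The $O(D^{3})$ cost quoted in the abstract is characteristic of spectral operations on a $D \times D$ matrix, such as projecting a symmetric matrix onto the positive semidefinite (PSD) cone via an eigendecomposition; a single such projection replaces the full interior-point solve. The following is a minimal sketch of that projection step, not the authors' actual implementation:

```python
import numpy as np

def psd_projection(A):
    """Project a symmetric matrix onto the PSD cone.

    The eigendecomposition here costs O(D^3) for a D x D matrix,
    which is the dominant per-step cost of eigendecomposition-based
    dual approaches, versus roughly O(D^6.5) for an interior-point SDP solve.
    """
    # Symmetrize to guard against numerical asymmetry.
    A = 0.5 * (A + A.T)
    eigvals, eigvecs = np.linalg.eigh(A)
    # Clamp negative eigenvalues to zero and reassemble the matrix.
    return (eigvecs * np.clip(eigvals, 0.0, None)) @ eigvecs.T

# Example: a symmetric matrix with eigenvalues 5 and -1.
A = np.array([[2.0, -3.0],
              [-3.0, 2.0]])
X = psd_projection(A)
print(np.linalg.eigvalsh(X))  # all eigenvalues are now >= 0
```

The projection keeps the eigenvector basis and discards only the negative part of the spectrum, so `X` is the nearest PSD matrix to `A` in Frobenius norm.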
