International Conference on Machine Learning

Online Kernel Learning with a Near Optimal Sparsity Bound



Abstract

In this work, we focus on Online Sparse Kernel Learning, which aims to learn, in an online fashion, a kernel classifier with a bounded number of support vectors. Although many online learning algorithms have been proposed to learn a sparse kernel classifier, most of them fail to bound the number of support vectors used by the final solution, which is the average of the intermediate kernel classifiers generated by the online algorithm. The key idea of the proposed algorithm is to measure the difficulty of correctly classifying a training example by the derivative of a smooth loss function, and, via a sampling scheme, to give a difficult example a greater chance of becoming a support vector than an easy one. Our analysis shows that when the loss function is smooth, the proposed algorithm yields a performance guarantee similar to that of the standard online learning algorithm, but with a near optimal number of support vectors (up to a poly(ln T) factor). Our empirical study shows promising performance of the proposed algorithm compared to state-of-the-art algorithms for online sparse kernel learning.
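The sampling idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: the choice of a logistic loss as the smooth loss, an RBF kernel, and the parameters eta, gamma, and seed are all assumptions made here for illustration only.

```python
import numpy as np


def rbf_kernel(x, z, gamma=1.0):
    """Gaussian RBF kernel between two feature vectors."""
    return np.exp(-gamma * np.linalg.norm(x - z) ** 2)


def smooth_loss_derivative(margin):
    """Magnitude of the derivative of the logistic loss log(1 + exp(-margin)).

    It is close to 1 for hard examples (small or negative margin) and
    close to 0 for easy ones, so it serves as a difficulty score.
    """
    return 1.0 / (1.0 + np.exp(margin))


def online_sparse_kernel_learning(X, y, eta=0.5, gamma=1.0, seed=0):
    """Toy sketch: keep an incoming example as a support vector with
    probability equal to the loss derivative at its current margin,
    so hard examples are retained far more often than easy ones."""
    rng = np.random.default_rng(seed)
    support_x, support_coef = [], []  # retained support vectors and weights
    for x_t, y_t in zip(X, y):
        # Current prediction f(x_t) = sum_i coef_i * k(x_i, x_t).
        f_t = sum(c * rbf_kernel(x_i, x_t, gamma)
                  for x_i, c in zip(support_x, support_coef))
        p_t = smooth_loss_derivative(y_t * f_t)  # "difficulty" of (x_t, y_t)
        if rng.random() < p_t:
            # Accepted: a full gradient step would add eta * y_t * p_t to the
            # coefficient; dividing by the sampling probability p_t makes the
            # sampled update unbiased in expectation, leaving eta * y_t.
            # (Whether the paper uses exactly this scaling is an assumption.)
            support_x.append(np.asarray(x_t, dtype=float))
            support_coef.append(eta * y_t)
    return support_x, support_coef


if __name__ == "__main__":
    # Two synthetic Gaussian blobs as a quick smoke test.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-1.5, 1.0, (200, 2)),
                   rng.normal(1.5, 1.0, (200, 2))])
    y = np.array([-1] * 200 + [1] * 200)
    sx, sc = online_sparse_kernel_learning(X, y)
    print(f"kept {len(sx)} support vectors out of {len(X)} examples")
```

In this sketch the expected size of the support set is the sum of the sampling probabilities over the stream, which stays small when most examples are classified with a large margin; this loosely mirrors the sparsity behaviour the abstract claims, though the actual bound in the paper is proved for its own update and sampling rule.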
