IEEE Transactions on Information Theory

Finite-Dimensional Projection for Classification and Statistical Learning


Abstract

In this paper, a new method for the binary classification problem is studied. It relies on empirical minimization of the hinge risk over an increasing sequence of finite-dimensional spaces. A suitable dimension is picked by minimizing the regularized risk, where the regularization term is proportional to the dimension. An oracle-type inequality is established for the excess generalization risk (i.e., the regret relative to the Bayes risk) of the procedure, which ensures adequate convergence properties of the method. We suggest selecting the sequence of subspaces by applying kernel principal component analysis (KPCA). In this case, the asymptotic convergence rate of the method can be better than what is known for the support vector machine (SVM). Illustrative experiments on benchmark data sets show that the practical performance of the method is comparable to that of the SVM.
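The procedure the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: scikit-learn's `KernelPCA` supplies the increasing sequence of subspaces, `LinearSVC` with hinge loss stands in for the empirical hinge-risk minimizer, and the penalty constant `lam` is an arbitrary choice for demonstration (the paper's regularization term is only specified as proportional to the dimension).

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.svm import LinearSVC
from sklearn.metrics import hinge_loss

# Toy binary classification data with labels in {-1, +1}.
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
y = 2 * y - 1

lam = 0.01  # illustrative penalty per dimension
best = None  # (penalized risk, dimension)

for d in range(1, 21):
    # Project the data onto the first d kernel principal components.
    Z = KernelPCA(n_components=d, kernel="rbf", gamma=1.0).fit_transform(X)

    # Empirical hinge-risk minimization over the d-dimensional space.
    clf = LinearSVC(loss="hinge", C=1.0, max_iter=10000, random_state=0)
    clf.fit(Z, y)
    risk = hinge_loss(y, clf.decision_function(Z))

    # Regularized risk: empirical hinge risk plus a term
    # proportional to the dimension, as in the abstract.
    penalized = risk + lam * d
    if best is None or penalized < best[0]:
        best = (penalized, d)

print("selected dimension:", best[1])
```

The outer loop realizes the "increasing sequence of finite-dimensional spaces": each iteration enlarges the KPCA subspace by one component, and the dimension minimizing the penalized empirical risk is retained.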
