IEEE Transactions on Neural Networks

Global Convergence of SMO Algorithm for Support Vector Regression



Abstract

Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given $l$ training samples, SVR is formulated as a convex quadratic programming (QP) problem with $l$ pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for update in each step and subproblems are solved in a certain way, then the SMO algorithm always stops within a finite number of iterations after finding an optimal solution. Also, efficient implementation techniques for the SMO algorithm are presented and compared experimentally with other SMO algorithms.
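The pairwise-update scheme the abstract describes can be illustrated with a minimal sketch. This is not the paper's algorithm: for simplicity it assumes $\varepsilon = 0$ (dropping the nonsmooth $\varepsilon$-insensitive term), writes the dual in the single variables $\beta_i = \alpha_i - \alpha_i^*$, and selects the most violating pair by the gradient of the smooth objective. The function name and the stopping rule are illustrative choices, not from the source.

```python
import numpy as np

def smo_svr_pairwise(K, y, C=1.0, tol=1e-6, max_iter=1000):
    """Simplified SMO-style pairwise solver for the SVR dual with
    epsilon = 0, in terms of beta_i = alpha_i - alpha_i^*:
        min  0.5 * beta' K beta - y' beta
        s.t. sum(beta) = 0,  -C <= beta_i <= C.
    Each step updates one pair (i, j) analytically while keeping
    sum(beta) constant, which preserves the equality constraint.
    """
    n = len(y)
    beta = np.zeros(n)
    for _ in range(max_iter):
        g = K @ beta - y  # gradient of the smooth dual objective
        # indices whose beta can still move up / down inside the box
        up = [i for i in range(n) if beta[i] < C]
        down = [j for j in range(n) if beta[j] > -C]
        i = min(up, key=lambda k: g[k])    # most negative feasible gradient
        j = max(down, key=lambda k: g[k])  # most positive feasible gradient
        if g[j] - g[i] < tol:              # optimality condition satisfied
            break
        eta = K[i, i] + K[j, j] - 2 * K[i, j]
        if eta <= 0:
            eta = 1e-12  # guard for non-PD / degenerate pairs
        # exact minimizer along the direction e_i - e_j, then clip to box
        delta = (g[j] - g[i]) / eta
        delta = min(delta, C - beta[i], beta[j] + C)
        beta[i] += delta
        beta[j] -= delta
    return beta
```

On a tiny problem the solver drives the violation `g[j] - g[i]` below the tolerance in a handful of iterations; the paper's contribution is proving that this kind of iteration, with subproblems solved in a certain way, always terminates finitely at an optimum.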
