Source: Neural Information Processing, Part I (Lecture Notes in Computer Science, Vol. 4232)

A Novel Sequential Minimal Optimization Algorithm for Support Vector Regression



Abstract

A novel sequential minimal optimization (SMO) algorithm for support vector regression is proposed. The algorithm builds on Flake and Lawrence's SMO, which solves convex optimization problems in l variables, where l is the number of training samples, instead of the standard quadratic programming problems in 2l variables; however, its working-set selection strategy is quite different. Experimental results show that the proposed algorithm is much faster than Flake and Lawrence's SMO and comparable in speed to the fastest conventional SMO.
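The abstract does not spell out the l-variable reformulation, but it presumably refers to the usual ε-SVR rewriting in which each pair of dual variables is merged into a single one; the following is a sketch under that assumption (the symbols K, ε, C, y_i, α_i, α_i^* and β_i are standard SVR notation, not taken from the abstract).

% Standard ε-SVR dual: a QP in 2l variables (α_i, α_i^*)
\min_{\alpha,\alpha^*}\ \frac{1}{2}\sum_{i,j=1}^{l}(\alpha_i-\alpha_i^*)(\alpha_j-\alpha_j^*)K(x_i,x_j)
  + \varepsilon\sum_{i=1}^{l}(\alpha_i+\alpha_i^*) - \sum_{i=1}^{l}y_i(\alpha_i-\alpha_i^*)
\quad\text{s.t.}\quad \sum_{i=1}^{l}(\alpha_i-\alpha_i^*)=0,\ \ 0\le\alpha_i,\alpha_i^*\le C.

% Merging each pair via β_i = α_i - α_i^* (using α_i α_i^* = 0 at the optimum,
% so that α_i + α_i^* = |β_i|) yields a convex, piecewise-quadratic problem in only l variables:
\min_{\beta}\ \frac{1}{2}\sum_{i,j=1}^{l}\beta_i\beta_j K(x_i,x_j)
  + \varepsilon\sum_{i=1}^{l}|\beta_i| - \sum_{i=1}^{l}y_i\beta_i
\quad\text{s.t.}\quad \sum_{i=1}^{l}\beta_i=0,\ \ -C\le\beta_i\le C.

The |β_i| term keeps the merged problem convex but makes it only piecewise quadratic, which would explain why the abstract speaks of convex optimization problems rather than standard quadratic programs.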
