Journal of Software (《软件学报》)

Support Vector Regression Optimization Problem Without Bias

         

Abstract

To study the effect of the bias term on the generalization performance of support vector regression (SVR), the primal and dual optimization formulations of support vector regression without bias (NBSVR) are first proposed. A necessary condition for the global optimum of the NBSVR optimization problem is derived, and it is then proved that the dual problem of SVR can only yield a sub-optimal solution of the NBSVR dual problem. An active-set algorithm for the NBSVR dual problem is also proposed and proved to be linearly convergent. Experimental results on 21 benchmark datasets show that, in the solution space of the dual problem, SVR with bias obtains only sub-optimal solutions of NBSVR, and the root mean square error (RMSE) of NBSVR tends to be lower than that of SVR. The training time of NBSVR is not only shorter than that of SVR but also less sensitive to the kernel parameter.
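For context, the role of the bias can be sketched from the textbook ε-SVR dual; the exact NBSVR formulation in the paper may differ in detail, but in the standard setting the bias term is what induces the equality constraint in the dual:

```latex
% Standard \varepsilon-SVR dual (with bias b in f(x) = \sum_i (\alpha_i - \alpha_i^*) K(x_i, x) + b):
\max_{\alpha, \alpha^*} \;
  -\frac{1}{2} \sum_{i,j=1}^{n} (\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*) K(x_i, x_j)
  - \varepsilon \sum_{i=1}^{n} (\alpha_i + \alpha_i^*)
  + \sum_{i=1}^{n} y_i (\alpha_i - \alpha_i^*)
\quad \text{s.t.} \quad
  \sum_{i=1}^{n} (\alpha_i - \alpha_i^*) = 0, \qquad
  0 \le \alpha_i, \alpha_i^* \le C.

% Without bias (f(x) = \sum_i (\alpha_i - \alpha_i^*) K(x_i, x)), the equality
% constraint disappears and the dual reduces to a box-constrained QP:
\quad \text{s.t.} \quad 0 \le \alpha_i, \alpha_i^* \le C.
```

Dropping the equality constraint enlarges the feasible set of the dual, which is consistent with the abstract's claim that, in the dual solution space, SVR with bias attains only sub-optimal solutions of NBSVR.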
