When the L1-norm support vector machine and the L2-norm support vector machine are applied to datasets with small sample sizes, high dimensionality, and high correlation among some of the variables, neither performs satisfactorily. Combining the advantages of the two methods, an improved algorithm for the doubly regularized support vector machine is proposed. The inequality constraints and the non-differentiable norm, however, complicate the optimization. A positive-part function and a quadratic polynomial loss function are introduced to transform the problem into a differentiable, unconstrained optimization problem that can be solved easily with many standard optimization algorithms. Experimental results show that the improved algorithm achieves better classification accuracy.
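The model described above combines the L1 and L2 penalties (an elastic-net-style "doubly regularized" SVM) with a quadratic hinge-type loss built from the positive-part function. The sketch below is a minimal illustration of that objective, not the paper's exact algorithm: the function name, the proximal-gradient solver, and all hyperparameter values are assumptions for demonstration only.

```python
import numpy as np

def doubly_regularized_svm(X, y, lam1=0.01, lam2=0.01, lr=0.01, n_iter=2000):
    """Sketch of a doubly regularized (elastic-net) SVM.

    Smooth part: squared hinge loss (1/n) * sum_i max(0, 1 - y_i * w.x_i)^2,
    built from the positive-part function, plus the ridge term lam2 * ||w||^2.
    The non-differentiable L1 penalty lam1 * ||w||_1 is handled here by a
    soft-thresholding (proximal) step -- one standard way to cope with it.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = 1.0 - y * (X @ w)
        active = np.maximum(margins, 0.0)          # positive-part function
        # gradient of the smooth part (squared hinge + ridge)
        grad = -(2.0 / n) * (X.T @ (y * active)) + 2.0 * lam2 * w
        w = w - lr * grad
        # proximal (soft-threshold) step for the L1 penalty
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam1, 0.0)
    return w
```

On a small linearly separable toy set, e.g. `X = [[2,0],[3,1],[-2,0],[-3,-1]]` with labels `y = [1,1,-1,-1]`, the fitted `w` classifies all points correctly via `np.sign(X @ w)`; the soft-thresholding step can drive weights of uninformative variables exactly to zero, which is the variable-selection behavior the L1 penalty contributes.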