Study on ν-SVM for Classification Optimization Problem without Bias
doi: 10.3724/SP.J.1146.2010.01286
Abstract: In a high-dimensional feature space, the classification hyperplane tends to pass through the origin, so the bias (b) is not needed. To study whether ν-SVM for classification needs the bias (b), this paper proposes the dual optimization problem of ν-SVM without (b) and presents a method for solving it. The method uses an active set strategy to transform the dual optimization problem into equality-constrained sub-problems, which are then converted into systems of linear equations by the Lagrange multiplier method. Experimental results show that the presence of the bias (b) reduces the generalization ability of ν-SVM, and that ν-SVM with bias can only obtain a sub-optimal solution of the bias-free ν-SVM.
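For reference, the distinction made in the abstract can be sketched with the standard ν-SVM dual; this is only a sketch assuming the usual formulation of Schölkopf et al., and the paper's exact notation may differ. Dropping the bias b removes the stationarity condition ∂L/∂b = 0 and, with it, the equality constraint on the dual variables:

```latex
% Standard nu-SVM dual (with bias b), for labels y_i in {-1,+1} and kernel k:
\min_{\alpha}\; \frac{1}{2}\sum_{i,j}\alpha_i\alpha_j y_i y_j k(x_i,x_j)
\quad\text{s.t.}\quad 0 \le \alpha_i \le \tfrac{1}{l},\qquad
\sum_i \alpha_i y_i = 0,\qquad \sum_i \alpha_i \ge \nu .

% Bias-free nu-SVM dual: without b the constraint \sum_i \alpha_i y_i = 0 never
% arises, and the decision function becomes
% f(x) = \operatorname{sgn}\bigl(\sum_i \alpha_i y_i k(x_i,x)\bigr).
```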
Key words:
- ν-Support Vector Machine (ν-SVM)
- Bias
- Generalization ability
- Active set
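The solution strategy named in the abstract (active set strategy plus Lagrange multipliers) can be illustrated with a small sketch. The snippet below is an assumption-laden illustration, not the authors' algorithm: it treats the constraint on Σα_i as an equality, uses hypothetical names (solve_free_subproblem, the toy matrix Q), and shows only the inner step in which variables held at their bounds are fixed and the remaining equality-constrained sub-problem is reduced to one linear (KKT) system.

```python
import numpy as np

def solve_free_subproblem(Q, alpha, free, nu):
    """One inner step of an active-set scheme for a bias-free nu-SVM dual.

    With the variables at their bounds (0 or 1/l) held fixed, the remaining
    "free" variables solve an equality-constrained QP
        min_{a_F}  1/2 a_F^T Q_FF a_F + (Q_FB a_B)^T a_F
        s.t.       sum(a_F) = nu - sum(a_B),
    whose Lagrange conditions form a single linear (KKT) system.
    (Taking the nu-constraint as an equality is an assumption of this sketch.)
    """
    n = len(alpha)
    free = np.asarray(free)
    bound = np.setdiff1d(np.arange(n), free)
    Q_FF = Q[np.ix_(free, free)]
    # Linear term contributed by the variables fixed at their bounds.
    q = Q[np.ix_(free, bound)] @ alpha[bound] if bound.size else np.zeros(free.size)
    c = nu - alpha[bound].sum()          # remaining "mass" for the free variables

    m = free.size
    # KKT system: [Q_FF  -1] [a_F]   [-q]
    #             [1^T    0] [lam] = [ c]
    K = np.zeros((m + 1, m + 1))
    K[:m, :m] = Q_FF
    K[:m, m] = -1.0
    K[m, :m] = 1.0
    rhs = np.concatenate([-q, [c]])
    sol = np.linalg.solve(K, rhs)

    alpha_new = alpha.copy()
    alpha_new[free] = sol[:m]
    return alpha_new, sol[m]             # updated multipliers and the Lagrange multiplier


# Toy usage: a random positive-definite Q standing in for y_i y_j k(x_i, x_j).
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
Q = A @ A.T + 1e-3 * np.eye(6)
alpha0 = np.full(6, 0.1)                 # warm start; entries 0 and 2 held at their bound
alpha1, lam = solve_free_subproblem(Q, alpha0, free=[1, 3, 4, 5], nu=0.5)
print(alpha1, lam)
```

In a full active-set method this step would alternate with updates of the working set (moving variables between the free set and the bounds) until the KKT conditions hold for all variables.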