Tony Van Gestel, Johan A. K. Suykens, Jos De Brabanter, Bart De Moor, Joos Vandewalle
Support vector machine (SVM) classifiers construct a large-margin classifier in the feature space; a nonlinear decision boundary in the input space is obtained by mapping the inputs nonlinearly to a possibly infinite-dimensional feature space. Mercer's condition is applied to avoid an explicit expression for the nonlinear mapping, and the solution follows from a finite-dimensional quadratic programming problem. Recently, other classifier formulations, related to a regularized form of Fisher discriminant analysis, have been proposed in the feature space; practical expressions for these are obtained in a second step by applying the Mercer condition. In this paper, we relate these techniques to the least squares SVM (LS-SVM), whose solution follows from a linear Karush-Kuhn-Tucker system in the dual space. Based on the link with empirical linear discriminant analysis, the bias term can be adjusted to take prior information on the class distributions into account and to analyze unbalanced training sets.
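The abstract notes that the LS-SVM classifier's solution follows from a linear Karush-Kuhn-Tucker system in the dual space rather than a quadratic program. A minimal sketch of that system, assuming the standard LS-SVM classifier formulation with an RBF kernel (the kernel choice, toy data, and hyperparameter values `gamma` and `sigma` are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gram matrix of the RBF kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM classifier's linear KKT system in the dual space.

    The system is  [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    where Omega_kl = y_k * y_l * K(x_k, x_l) and gamma is the
    regularization constant.
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y          # first row: the bias constraint y^T alpha = 0
    A[1:, 0] = y          # first column: bias term enters each KKT row
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], np.ones(n)])
    sol = np.linalg.solve(A, rhs)   # one linear solve, no QP needed
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_new, sigma=1.0):
    # Decision function: sign( sum_k alpha_k y_k K(x, x_k) + b ).
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy usage on the XOR pattern, which needs a nonlinear boundary.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1.0, 1.0, 1.0, -1.0])
alpha, b = lssvm_train(X, y)
pred = lssvm_predict(X, y, alpha, b, X)
```

Adjusting the scalar `b` after training is exactly the bias correction for unbalanced classes that the link with discriminant analysis makes possible.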
Sheng Zheng, Yuqiu Sun, Jinwen Tian, Jain Liu
Changha Hwang, Sang-Il Choi, Jooyong Shim
Jooyong Shim, Jong-Sig Bae, Changha Hwang
Weiwu Yan, Mingguang Zhang, Chunkai Zhang, Shao Hui-he