Siwakon Suppalap, Rabian Wangkeeree
The support vector machine (SVM) with pinball loss (Pin-SVM) can handle noise sensitivity and instability under re-sampling, but it loses sparsity. To address this limitation, an SVM with a generalized pinball loss that incorporates an insensitive zone (GP-SVM) was proposed. The GP-SVM recovers sparsity by tuning the asymmetric spread of the insensitive zone. Despite these improvements, the unboundedness of these loss functions can still result in a lack of robustness to outliers. In this paper, we introduce a novel robust support vector classification method based on an $(\alpha _{1}, \alpha _{2})$ -asymmetric bounded loss function, the asymmetric truncated generalized pinball loss (denoted $L_{tgp}^{\alpha _{1}, \alpha _{2}}$ ). A key characteristic of the SVM with $L_{tgp}^{\alpha _{1}, \alpha _{2}}$ (ATGP-SVM) is its ability to balance generalization and sparsity while limiting the impact of outliers. However, since $L_{tgp}^{\alpha _{1}, \alpha _{2}}$ is non-convex, ATGP-SVM is difficult to solve directly. We therefore formulate ATGP-SVM using DC (difference of convex functions) programming and solve it with the DC algorithm (DCA). Experimental results on diverse benchmark datasets underscore the effectiveness of the proposed formulation compared with other state-of-the-art classification models.
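The two ideas in the abstract can be illustrated with a small sketch. The exact parameterization of the generalized pinball loss and its truncation are assumptions here (conventions vary across papers); the sketch uses an insensitive-zone loss with slopes `tau1`/`tau2`, zone widths `eps1`/`eps2`, and per-side caps `alpha1`/`alpha2`, and shows the DC decomposition `min(f, alpha) = f - max(f - alpha, 0)` that makes a DCA-style solver applicable.

```python
import numpy as np

def generalized_pinball(u, tau1=0.5, tau2=1.0, eps1=0.2, eps2=0.2):
    """Convex generalized-pinball-style loss with an asymmetric
    insensitive zone: zero inside the zone, growing linearly with
    slope tau1 on the right and tau2 on the left. (Illustrative form,
    not the paper's exact definition.)"""
    pos = np.maximum(tau1 * (u - eps1), 0.0)   # right-side penalty
    neg = np.maximum(tau2 * (-u - eps2), 0.0)  # left-side penalty
    return pos + neg

def truncated_generalized_pinball(u, tau1=0.5, tau2=1.0,
                                  eps1=0.2, eps2=0.2,
                                  alpha1=1.5, alpha2=2.0):
    """Asymmetric truncated (bounded) version: each side is capped at
    alpha1 / alpha2, so one outlier contributes at most that amount."""
    pos = np.minimum(np.maximum(tau1 * (u - eps1), 0.0), alpha1)
    neg = np.minimum(np.maximum(tau2 * (-u - eps2), 0.0), alpha2)
    return pos + neg

def dc_excess(u, tau1=0.5, tau2=1.0, eps1=0.2, eps2=0.2,
              alpha1=1.5, alpha2=2.0):
    """The convex 'excess' h with  truncated = generalized - h,
    using min(f, a) = f - max(f - a, 0) on each side. This exhibits
    the truncated loss as a difference of two convex functions."""
    pos = np.maximum(tau1 * (u - eps1), 0.0)
    neg = np.maximum(tau2 * (-u - eps2), 0.0)
    return np.maximum(pos - alpha1, 0.0) + np.maximum(neg - alpha2, 0.0)
```

With this decomposition, a DCA-style scheme would linearize `h` at the current iterate and solve the remaining convex subproblem at each step; the truncation is also what keeps the loss bounded, which is the source of the outlier robustness claimed for ATGP-SVM.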