JOURNAL ARTICLE

Stochastic Gradient Descent for Kernel-Based Maximum Correntropy Criterion

T. Li, Baobin Wang, Chaoquan Peng, Hong Yin

Year: 2024 | Journal: Entropy | Vol: 26 (12) | Pages: 1104 | Publisher: Multidisciplinary Digital Publishing Institute

Abstract

The maximum correntropy criterion (MCC) has become an important method in the machine learning and signal processing communities since it was successfully applied in various non-Gaussian noise scenarios. In contrast to the classical least squares (LS) method, which takes only the second-order moments of models into consideration and leads to a convex optimization problem, MCC captures the higher-order information of models that plays a crucial role in robust learning, usually at the cost of solving a non-convex optimization problem. Theoretical research on convex optimization has made significant achievements, while the theoretical understanding of non-convex optimization is still far from mature. Motivated by the popularity of stochastic gradient descent (SGD) for solving non-convex problems, this paper considers SGD applied to the kernel version of MCC, which has been shown to be robust to outliers and non-Gaussian data in nonlinear structural models. As existing theoretical results for the SGD algorithm applied to kernel MCC are not well established, we present a rigorous analysis of its convergence behavior and provide explicit convergence rates under standard conditions. Our work fills the gap between the optimization process and convergence during the iterations: the iterates need to converge to the global minimizer, while the obtained estimator cannot guarantee global optimality during the learning process.
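The SGD scheme the abstract describes can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact algorithm: the Gaussian correntropy bandwidth `sigma`, the RBF kernel parameter `gamma`, and the polynomially decaying step sizes `eta_t = eta0 * (t+1)^(-theta)` are assumed choices. The key feature of MCC appears in the update: the error is reweighted by `exp(-e^2 / (2*sigma^2))`, so large (outlier) residuals contribute little to the gradient step, unlike in least squares.

```python
import numpy as np

def gaussian_kernel(X, z, gamma=1.0):
    """RBF kernel K(x, z) = exp(-gamma * ||x - z||^2), vectorized over rows of X."""
    return np.exp(-gamma * np.sum((X - z) ** 2, axis=-1))

def kernel_mcc_sgd(X, y, sigma=1.0, gamma=1.0, eta0=0.5, theta=0.5):
    """One pass of SGD on the kernel MCC objective (illustrative sketch).

    The iterate f_t = sum_i alpha_i K(x_i, .) lives in the RKHS and is stored
    via its coefficients alpha. Each sample (x_t, y_t) is visited once.
    """
    n = X.shape[0]
    alpha = np.zeros(n)
    for t in range(n):
        # current prediction f_t(x_t) using the support points seen so far
        f_xt = alpha[:t] @ gaussian_kernel(X[:t], X[t], gamma) if t > 0 else 0.0
        e_t = y[t] - f_xt
        # correntropy-induced weight: downweights large (outlier) errors,
        # which is the source of MCC's robustness relative to least squares
        w_t = np.exp(-e_t ** 2 / (2 * sigma ** 2))
        eta_t = eta0 * (t + 1) ** (-theta)  # assumed polynomially decaying steps
        alpha[t] = eta_t * w_t * e_t
    return alpha

def predict(alpha, X_train, x_new, gamma=1.0):
    """Evaluate the learned function f(x_new) = sum_i alpha_i K(x_i, x_new)."""
    return alpha @ gaussian_kernel(X_train, x_new, gamma)
```

For example, running one pass over noisy samples of a nonlinear target (with a few heavy-tailed outliers mixed in) yields an estimator whose coefficients on outlier points are automatically shrunk by the weight `w_t`; setting `sigma` large recovers behavior close to least-squares SGD, since the weight then approaches 1 for all errors.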

Keywords:
Mathematical optimization; Stochastic gradient descent; Kernel methods; Outliers; Convergence; Convex optimization; Convexity; Smoothness; Optimization problems; Stochastic optimization; Estimators; Gaussian processes; Non-Gaussian noise

Metrics

Cited By: 2
FWCI (Field-Weighted Citation Impact): 1.44
References: 49
Citation Normalized Percentile: 0.72

Topics

Advanced Adaptive Filtering Techniques (Physical Sciences → Engineering → Computational Mechanics)
Sparse and Compressive Sensing Techniques (Physical Sciences → Engineering → Computational Mechanics)
Blind Source Separation Techniques (Physical Sciences → Computer Science → Signal Processing)

Related Documents

JOURNAL ARTICLE

Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion

Baobin Wang, Ting Hu

Journal: Entropy | Year: 2019 | Vol: 21 (7) | Pages: 644
JOURNAL ARTICLE

Kernel-based maximum correntropy criterion with gradient descent method

Ting Hu

Journal: Communications on Pure & Applied Analysis | Year: 2020 | Vol: 19 (8) | Pages: 4159-4177
JOURNAL ARTICLE

Robust Ellipse Fitting With Laplacian Kernel Based Maximum Correntropy Criterion

Chenlong Hu, Gang Wang, K. C. Ho, Junli Liang

Journal: IEEE Transactions on Image Processing | Year: 2021 | Vol: 30 | Pages: 3127-3141