Abstract

The Normalized Least Mean Square (NLMS) algorithm belongs to the gradient class of adaptive algorithms and addresses the slow convergence of the Least Mean Square (LMS) algorithm. Motivated by the recent exploration of the q-gradient in adaptive filtering, we develop a q-gradient-based NLMS algorithm. More specifically, we replace the conventional gradient with the q-gradient when deriving the NLMS weight-update recursion. We also provide a detailed mean-square-error (MSE) analysis of the proposed algorithm for both the transient and steady-state regimes, and we derive closed-form expressions for the MSE learning curve and the steady-state excess MSE (EMSE). Simulation results demonstrate the superiority of the proposed algorithm over the conventional NLMS algorithm and validate the theoretical analysis.
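To make the idea concrete, the following is a minimal sketch of a q-gradient NLMS update. It assumes a scalar q applied uniformly to all taps, and uses the fact that the Jackson q-derivative of the squared-error cost scales the conventional gradient by (q+1)/2; the exact recursion in the paper may differ (e.g., per-tap q values), so treat this as an illustrative approximation, with q = 1 recovering conventional NLMS.

```python
import numpy as np

def qnlms(x, d, M, mu=0.5, q=1.0, eps=1e-6):
    """Sketch of a q-gradient NLMS adaptive filter.

    x   : input signal (1-D array)
    d   : desired signal (1-D array, same length as x)
    M   : number of filter taps
    mu  : step size
    q   : q-gradient parameter (q = 1 gives conventional NLMS)
    eps : regularizer guarding against division by a tiny input norm

    The (q+1)/2 factor comes from applying the Jackson q-derivative
    to the quadratic cost -- an assumption of this sketch, not a
    claim about the authors' exact derivation.
    """
    w = np.zeros(M)
    e = np.zeros(len(x))
    for n in range(M, len(x)):
        u = x[n - M + 1 : n + 1][::-1]   # regressor, most recent sample first
        e[n] = d[n] - w @ u              # a priori estimation error
        # normalized q-gradient update
        w = w + mu * ((q + 1) / 2) * e[n] * u / (eps + u @ u)
    return w, e

# Usage: identify an unknown 8-tap FIR channel from noisy observations.
rng = np.random.default_rng(0)
w_true = rng.standard_normal(8)
x = rng.standard_normal(5000)
d = np.convolve(x, w_true)[: len(x)] + 1e-3 * rng.standard_normal(len(x))
w_hat, e = qnlms(x, d, M=8, mu=0.5, q=1.0)
```

Choosing q > 1 inflates the effective step size early in adaptation (faster convergence at the cost of higher steady-state EMSE), which is the trade-off the paper's transient and steady-state analysis quantifies.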

Keywords:
Algorithm, Computer science, Mathematics


Topics

Advanced Adaptive Filtering Techniques
Physical Sciences →  Engineering →  Computational Mechanics
Blind Source Separation Techniques
Physical Sciences →  Computer Science →  Signal Processing
Image and Signal Denoising Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Interference-Normalized Least Mean Square Algorithm

Jean-Marc Valin, Iain B. Collings

Journal: IEEE Signal Processing Letters, Year: 2007, Vol: 14 (12), Pages: 988-991

JOURNAL ARTICLE

A New Partial-normalized Least Mean Square Algorithm

Jirasak Tanpreeyachaya, Ichi Takumi, Masayasu Hata

Journal: IEEJ Transactions on Electronics Information and Systems, Year: 1996, Vol: 116 (1), Pages: 57-65