The Normalized Least Mean Square (NLMS) algorithm belongs to the gradient class of adaptive algorithms and addresses the slow convergence of the Least Mean Square (LMS) algorithm. Motivated by the recent use of the q-gradient in adaptive filtering, we develop a q-gradient-based NLMS algorithm. More specifically, we replace the conventional gradient with the q-gradient to derive the NLMS weight-update recursion. We also provide a detailed mean-square-error (MSE) analysis of the proposed algorithm for both the transient and steady-state regimes, and we derive closed-form expressions for the MSE learning curve and the steady-state excess MSE (EMSE). Simulation results demonstrate the superiority of the proposed algorithm over the conventional NLMS algorithm and validate the theoretical analysis.
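The abstract does not state the exact weight-update recursion, so the following is only a minimal sketch of the idea. It assumes the Jackson q-derivative, d_q f(x) = (f(qx) - f(x)) / ((q-1)x); for the squared error f(x) = x^2 this gives (q+1)x instead of the conventional 2x, so the q-gradient effectively scales the NLMS update by (q+1)/2. The function name `qnlms_identify` and all parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def qnlms_identify(q=1.5, mu=0.5, n_taps=8, n_samples=2000,
                   eps=1e-6, noise_std=1e-3, seed=0):
    """Sketch of a q-gradient NLMS filter on a toy system-identification
    task. Setting q = 1 recovers the conventional NLMS update.
    Returns the final weight-error norm ||w - w_true||."""
    rng = np.random.default_rng(seed)
    w_true = rng.standard_normal(n_taps)   # unknown system to identify
    w = np.zeros(n_taps)                   # adaptive filter weights
    x_buf = np.zeros(n_taps)               # tapped-delay-line input

    for _ in range(n_samples):
        # shift in a new white-Gaussian input sample
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = rng.standard_normal()
        # desired signal = system output plus measurement noise
        d = w_true @ x_buf + noise_std * rng.standard_normal()
        e = d - w @ x_buf                  # a-priori estimation error
        # q-gradient of e^2 yields the factor (q+1)/2 relative to the
        # conventional gradient (assumption: Jackson q-derivative, scalar q)
        w = w + mu * ((q + 1) / 2) * e * x_buf / (eps + x_buf @ x_buf)

    return float(np.linalg.norm(w - w_true))
```

Note that the effective step size becomes mu*(q+1)/2, so q > 1 speeds up initial convergence at the cost of a larger steady-state EMSE, which is consistent with the transient/steady-state trade-off the analysis in the paper quantifies.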