Guolong Su, Jian Jin, Yuantao Gu, Jian Wang
As one of the recently proposed algorithms for sparse system identification, the $l_0$ norm constraint Least Mean Square ($l_0$-LMS) algorithm modifies the cost function of the traditional method with a penalty on tap-weight sparsity. The performance of $l_0$-LMS is quite attractive compared with that of its various precursors. However, there has been no detailed study of its performance. This paper presents a comprehensive theoretical performance analysis of $l_0$-LMS for white Gaussian input data, based on some reasonable assumptions. Expressions for the steady-state mean square deviation (MSD) are derived and discussed with respect to the algorithm parameters and the system sparsity. A parameter selection rule is established for achieving the best performance. The instantaneous behavior is also derived using a Taylor series approximation. In addition, the relationship between $l_0$-LMS and several previous works is established, and sufficient conditions for $l_0$-LMS to accelerate convergence are derived. Finally, all of the theoretical results are compared with simulations and are shown to agree well over a wide range of parameter settings.
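As context for the abstract, a minimal sketch of the $l_0$-LMS update is shown below. It augments the standard LMS step with a zero-attraction term obtained from a first-order Taylor approximation of the gradient of the $l_0$-norm penalty $\sum_i (1 - e^{-\beta |w_i|})$. The function and parameter names (`l0_lms`, `mu`, `kappa`, `beta`) are illustrative choices, not notation taken from the paper:

```python
import numpy as np

def zero_attractor(w, beta):
    # Zero-attraction term: first-order Taylor approximation of the
    # (negative) gradient of the penalty sum_i (1 - exp(-beta*|w_i|)).
    # It is nonzero only for taps with |w_i| <= 1/beta, pulling small
    # coefficients toward zero while leaving large ones untouched.
    g = np.zeros_like(w)
    mask = np.abs(w) <= 1.0 / beta
    g[mask] = beta * beta * w[mask] - beta * np.sign(w[mask])
    return g

def l0_lms(x, d, num_taps, mu=0.01, kappa=1e-4, beta=5.0):
    """Illustrative l0-LMS adaptive filter (hypothetical parameter values).

    x : input signal, d : desired signal (aligned so d[n] depends on
    x[n], x[n-1], ..., x[n-num_taps+1]).
    Returns the final tap-weight estimate.
    """
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        xn = x[n - num_taps + 1:n + 1][::-1]  # regressor, most recent first
        e = d[n] - w @ xn                      # instantaneous error
        # Standard LMS correction plus sparsity-promoting zero attraction
        w = w + mu * e * xn + kappa * zero_attractor(w, beta)
    return w
```

For a sparse unknown system, the attractor drives the many near-zero taps toward exactly zero, which is the mechanism behind the convergence acceleration analyzed in the paper; setting `kappa = 0` recovers ordinary LMS.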