R.K. Martin, W.A. Sethares, R.C. Williamson and C.R. Johnson, Jr.
This paper studies a class of algorithms called Natural Gradient (NG) algorithms and their approximations, known as Approximate Natural Gradient (ANG) algorithms. The least mean square (LMS) algorithm is derived within the NG framework, and a family of LMS variants that exploit sparsity is obtained. A mean squared error (MSE) analysis of the family of ANG algorithms is provided, and it is shown that when the system is sparse, the new algorithms often converge faster for a given total asymptotic MSE. Simulations confirm the analysis. In addition, Bayesian priors matching the statistics of a database of real channels are given, and actual channels are identified both by LMS and by algorithms that exploit these priors, demonstrating a realistic application of the proposed algorithms.
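For orientation, the update recursions at issue can be sketched as follows: the standard LMS tap update, and one illustrative sparsity-exploiting variant in which each tap's effective step size grows with that tap's current magnitude. Here $d(k)$ is the desired signal, $x(k)$ the input, $\mu$ the step size, and $\varepsilon > 0$ a small regularizer; the specific weighting $|w_i(k)| + \varepsilon$ is an assumed example of an ANG-style update given only for illustration, not necessarily the exact form derived in the paper.

\begin{align*}
  e(k)      &= d(k) - \mathbf{w}^{T}(k)\,\mathbf{x}(k), \\
  w_i(k+1)  &= w_i(k) + \mu\, e(k)\, x(k-i) \quad \text{(standard LMS)}, \\
  w_i(k+1)  &= w_i(k) + \mu\, e(k)\, x(k-i)\,\bigl(|w_i(k)| + \varepsilon\bigr) \quad \text{(sparsity-weighted variant)}.
\end{align*}

Intuitively, the magnitude weighting concentrates adaptation energy on the few large taps of a sparse system while slowing the update of taps near zero, which is the mechanism behind the faster convergence claimed above.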