MS Thesis, Cornell University, May 2001

Exploiting Sparsity in Adaptive Filters

Richard K. Martin

Abstract

This thesis studies a class of algorithms called Natural Gradient (NG) algorithms and their approximations, known as Approximate Natural Gradient (ANG) algorithms. The Least Mean Square (LMS) algorithm is derived within the NG framework, and a family of LMS variants that exploit sparsity is then derived. The procedure is repeated for other algorithms and their families, namely the Constant Modulus Algorithm (CMA), Decision-Directed LMS (DD-LMS), and algorithms using an ARMA parameterization.
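As a point of reference, the sketch below contrasts a standard LMS tap update with a proportionate-style update that scales each tap's adaptation by its current magnitude, which is the flavor of sparsity exploitation in question. This is a minimal sketch, not the thesis's exact algorithm or notation: the step size mu, the regularizer eps, and the function names are illustrative assumptions.

    import numpy as np

    def lms_update(w, x, d, mu=0.01):
        # Standard LMS: w <- w + mu * e * x, with error e = d - w^T x.
        e = d - w @ x
        return w + mu * e * x, e

    def sparse_lms_update(w, x, d, mu=0.01, eps=1e-4):
        # Proportionate-style variant: each tap's step is scaled by its
        # current magnitude, so large (active) taps adapt quickly while
        # near-zero taps are held near zero; eps keeps small taps from
        # stalling entirely.
        e = d - w @ x
        return w + mu * e * x * (np.abs(w) + eps), e

Scaling the update by the tap magnitude rewards sparse solutions, but it also makes a near-zero initialization adapt very slowly, which is one reason the choice of initialization matters below.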

Mean squared error (MSE) analysis, stability analysis, and convergence analysis of the family of sparse LMS algorithms are provided, and it is shown that if the system being identified is sparse, the new algorithms often converge faster for a given total asymptotic MSE, although the choice of initialization is important. Simulations are provided to confirm the analysis. In addition, Bayesian priors matching the statistics of a database of real channels are given, and algorithms are derived that exploit these priors. Simulations using measured channels demonstrate a realistic application of these algorithms.
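As a hypothetical illustration of such a simulation, the snippet below identifies a sparse FIR system with both updates from the sketch above. The channel, noise level, step size, and initialization are arbitrary choices for demonstration, not the measured channels or settings from the thesis.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 64, 5000
    h = np.zeros(N)
    h[[3, 20, 41]] = [1.0, -0.5, 0.3]       # sparse "true" system
    w_lms = np.full(N, 0.1)                 # nonzero init: the magnitude-
    w_sp = np.full(N, 0.1)                  # scaled update stalls near w = 0
    for _ in range(T):
        x = rng.standard_normal(N)          # white input vector
        d = h @ x + 0.01 * rng.standard_normal()  # noisy desired output
        w_lms, _ = lms_update(w_lms, x, d, mu=0.005)
        w_sp, _ = sparse_lms_update(w_sp, x, d, mu=0.005)
    print("LMS    coefficient error:", np.sum((w_lms - h) ** 2))
    print("sparse coefficient error:", np.sum((w_sp - h) ** 2))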