J. M. Walsh, P. A. Regalia, and C. R. Johnson, Jr.
In this paper we show that the iterative decoding algorithm can be viewed as descending a nonlinear least squares cost function. When at least one component code has a likelihood function that can be written as the product of its bitwise marginals, the iterative decoding algorithm performs exact steepest descent on this cost function. Furthermore, when the iterative decoder converges to infinite log-likelihood ratios, we show that its trajectories must locally descend the cost function. Conditions are then given under which the iterative decoder may be regarded as globally descending this cost function. Together with its positive definiteness, this suggests that the proposed cost function is a suitable Lyapunov function for the iterative decoder.
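As a schematic sketch of the descent interpretation (generic notation of our own; the paper's specific cost function is not reproduced here): for a nonlinear least squares cost $J(\lambda) = \tfrac{1}{2}\,\|r(\lambda)\|^2$ built from some residual map $r$, a steepest descent iteration with step size $\mu > 0$ takes the form
\[
\lambda^{(k+1)} \;=\; \lambda^{(k)} - \mu\,\nabla J\!\left(\lambda^{(k)}\right)
\;=\; \lambda^{(k)} - \mu\, J_r\!\left(\lambda^{(k)}\right)^{\mathsf{T}} r\!\left(\lambda^{(k)}\right),
\]
where $J_r$ denotes the Jacobian of $r$. Since $J(\lambda) \ge 0$ for all $\lambda$ and $J$ decreases along such trajectories, $J$ exhibits the two properties required of a Lyapunov function candidate for the iteration.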