W. Chung, J. Balakrishnan, W.A. Sethares, and C.R. Johnson, Jr.
This paper investigates the application of ``dispersion minimization'' to the problem of timing phase recovery and analyzes the resulting cost surface. The minimization can be readily accomplished with a gradient-style algorithm, whose behavior is analyzed in a variety of situations. Although the adaptation is blind (self-learning), expressions describing the error surface over which the algorithm evolves show that the surface is unimodal for received signals that are band limited to the Nyquist frequency, for both baud and $T/2$ sampling, even in the presence of multipath interference. For baud sampling of raised-cosine pulse-shaped signals, the algorithm converges to a unique minimum except in certain extreme cases. Analysis shows that the algorithm performs as desired for raised-cosine pulse shaping in the absence of multipath.
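As a rough illustration of the kind of gradient-style recovery described above, the sketch below applies stochastic gradient descent to a Godard-style dispersion cost $J(\tau)=E[(x^{2}(kT+\tau)-\gamma)^{2}]$ for a BPSK, raised-cosine pulse-shaped signal in the multipath-free case. This is a minimal sketch only: the exact cost form, the dispersion constant $\gamma$, and all parameter values (\texttt{mu}, \texttt{beta}, \texttt{gamma}, etc.) are illustrative assumptions, not values taken from the paper.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the paper)
N = 4000        # number of symbols
L = 16          # fine-grid samples per symbol period T
beta = 0.5      # raised-cosine roll-off
span = 8        # pulse support in symbol periods, each side

def raised_cosine(t, beta):
    # Raised-cosine pulse with unit symbol period; handles the
    # removable singularity at t = +/- 1/(2*beta).
    denom = 1.0 - (2.0 * beta * t) ** 2
    sing = np.isclose(denom, 0.0)
    p = np.sinc(t) * np.cos(np.pi * beta * t) / np.where(sing, 1.0, denom)
    return np.where(sing, (np.pi / 4.0) * np.sinc(1.0 / (2.0 * beta)), p)

# BPSK symbols shaped by a raised-cosine pulse on a fine grid,
# so the receiver can evaluate x(kT + tau) by interpolation.
symbols = rng.choice([-1.0, 1.0], size=N)
up = np.zeros(N * L)
up[::L] = symbols
t = np.arange(-span * L, span * L + 1) / L
x = np.convolve(up, raised_cosine(t, beta))
delay = span * L    # group delay of the pulse, in fine-grid samples

def sample(sig, idx):
    # Linear interpolation of sig at a fractional index.
    i = int(np.floor(idx))
    frac = idx - i
    return (1.0 - frac) * sig[i] + frac * sig[i + 1]

# Stochastic gradient descent on J(tau) = E[(x^2(kT+tau) - gamma)^2]/4;
# the instantaneous gradient is (x^2 - gamma) * x * dx/dtau.
mu = 0.01       # step size (illustrative)
gamma = 1.0     # dispersion constant E[s^4]/E[s^2] for +/-1 BPSK
tau = 0.4       # initial timing-phase estimate, in symbol periods
for k in range(span, N - span):
    idx = delay + k * L + tau * L
    xk = sample(x, idx)
    # Central-difference estimate of the derivative dx/dtau
    # (fine-grid spacing is 1/L symbol periods).
    xdot = (sample(x, idx + 1) - sample(x, idx - 1)) * L / 2.0
    tau -= mu * (xk * xk - gamma) * xk * xdot

# Correct baud sampling corresponds to tau = 0 (mod 1) here, the
# phase at which the raised-cosine pulse contributes no ISI.
print(f"converged timing-phase estimate: tau = {tau:.4f} T")
\end{verbatim}

In this setting the estimate settles near the zero-ISI sampling phase, consistent with the unimodal error surface described above; with multipath or other pulse shapes, the surface and the stationary points would differ.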