ABSTRACT

This chapter is concerned with the problem of detecting and estimating the time delay between signals received at two spatially separated sensors in the presence of noise. This problem has interested many investigators because it has applications in varied fields. As the ratio of signal spectrum to noise spectrum deteriorates, the time delay estimates are increasingly assigned to false peaks. The chapter introduces well-known criteria from linear optimization theory to provide optimal estimates of the time delay between two signals embedded in noise. E. J. Hannan and P. J. Thomson have derived a maximum likelihood estimator of the time delay under general conditions. When a digital implementation of the cross-correlator is used, the usual precautions must be taken in regard to aliasing.
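The basic cross-correlator mentioned above can be sketched as follows. This is a minimal illustration, not the chapter's estimator: it assumes two discrete-time sensor records in which the second is a delayed, noisy copy of the first, and it picks the lag that maximizes their sample cross-correlation (the signal, noise level, and delay below are made-up test values).

```python
import numpy as np

def estimate_delay(x, y):
    """Estimate the delay of y relative to x, in samples, as the lag
    at which the sample cross-correlation of the two records peaks."""
    corr = np.correlate(y, x, mode="full")        # c[k] = sum_n y[n+k] * x[n]
    lags = np.arange(-len(x) + 1, len(y))         # lag axis for 'full' mode
    return lags[np.argmax(corr)]

# Synthetic example: a common random source observed at two sensors,
# the second delayed by 25 samples; independent noise added at each.
rng = np.random.default_rng(0)
true_delay = 25
s = rng.standard_normal(1000)
x = s + 0.1 * rng.standard_normal(1000)
y = np.roll(s, true_delay) + 0.1 * rng.standard_normal(1000)
print(estimate_delay(x, y))
```

At high signal-to-noise ratio the correlation peak sits at the true lag; as the noise spectra grow relative to the signal spectrum, spurious correlation peaks can overtake the true one, which is the false-peak behavior the abstract refers to.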