ABSTRACT

DSP is currently a major market force, consisting of semiconductor, hardware, software, methodology, application, and training sectors. The origins of DSP are open to debate, but a seminal moment surely occurred when Claude Shannon developed an understanding of sampled-data signal processing in the middle of the 20th century. Shannon’s sampling theorem states that if an analog signal, having a highest frequency bounded by $B$ Hz, is sampled at a rate $f_s > 2B$ Hz, then it can be perfectly reconstructed from its sample values. The critical parameter $f_N = f_s/2$ is called the Nyquist frequency and represents a strict upper bound on the highest baseband frequency allowed in the sampled signal (i.e., $B$). Most DSP solutions are over-sampled, operating at a sample frequency far in excess of the minimally required value. If a signal is under-sampled at a rate below the minimum rate of $2B$ Hz, aliasing errors can occur. An aliased signal is a baseband signal whose sample values impersonate those of a signal having frequency components in excess of $B$ Hz. Another early enabler of the DSP revolution was the Cooley-Tukey fast Fourier transform (FFT) algorithm. The FFT made many signal-processing tasks practical for the first time, in many instances using only software. Another defining DSP moment occurred when the first DSP microprocessors (DSP µPs) appeared in the marketplace beginning in the late 1970s. These devices provided an affordable and tangible means of developing hardware and embedded solutions with minimal risk and effort. Regardless of its origins, today’s DSP objects and systems have become part of a pervasive technology, appearing in a myriad of applications and supported by a rich and deep technological infrastructure. DSP is now a discipline unto itself, with its own professional societies, academic programs, trained practitioners, and industrial infrastructure.
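To make the sampling and aliasing statements above concrete, here is a minimal Python sketch; the 10 kHz sample rate and 7 kHz tone are illustrative assumptions, not values from the text. It verifies that a tone above the Nyquist frequency yields exactly the same sample values as a lower-frequency baseband tone, which is the impersonation described above.

    import numpy as np

    fs = 10_000.0          # sample rate in Hz (illustrative assumption)
    f_nyq = fs / 2         # Nyquist frequency: 5 kHz baseband bound
    f_true = 7_000.0       # tone above f_nyq, so the signal is under-sampled
    f_alias = fs - f_true  # the 3 kHz baseband tone the samples impersonate

    n = np.arange(32)      # sample indices
    x_true = np.cos(2 * np.pi * f_true * n / fs)
    x_alias = np.cos(2 * np.pi * f_alias * n / fs)

    # The two sample sequences are numerically identical: once sampled
    # below 2B Hz, the 7 kHz tone is indistinguishable from 3 kHz.
    assert np.allclose(x_true, x_alias)

Sampling this tone at any rate above $2B = 14$ kHz would remove the ambiguity and permit exact reconstruction.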