ABSTRACT

This chapter outlines the performance metrics commonly used by engineers to specify analog-to-digital converter (ADC) requirements. An ADC converts an analog signal into discrete digital numbers. In a sensor application, the ADC interfaces a front-end processor to an IF (intermediate frequency) circuit, converting the IF signal for digital processing. ADC characteristics (e.g., dynamic range and sampling rate) often determine the application's design and performance. A conceptual ADC introduced in the chapter has three bits of "resolution," presented as binary counter outputs Q2, Q1, and Q0. In practice, a sample-and-hold (SAH) circuit prevents the ADC input, a time-varying analog voltage x(t), from changing during the conversion time. The offset error of an ADC, analogous to the offset error of an amplifier, is defined as a deviation of the ADC output code transition points that is present across all output codes; this error shifts, or translates, the ADC's actual transfer function away from the ideal transfer function.
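
The behavior described above can be sketched in a few lines. The function names, reference voltage, and offset value below are illustrative assumptions, not from the chapter; the sketch models an ideal uniform 3-bit quantizer and shows how a constant offset error translates every code transition by the same voltage.

```python
def ideal_adc(x, n_bits=3, v_ref=1.0):
    """Quantize an analog voltage x in [0, v_ref) to an n-bit output code.

    With n_bits = 3 there are 2**3 = 8 codes (Q2 Q1 Q0 = 000 ... 111).
    """
    levels = 2 ** n_bits
    code = int(x / v_ref * levels)        # uniform quantization
    return max(0, min(levels - 1, code))  # clamp to the valid code range

def adc_with_offset(x, offset_v, n_bits=3, v_ref=1.0):
    """Model offset error: every code transition point shifts by offset_v,
    translating the whole transfer function away from the ideal one."""
    return ideal_adc(x - offset_v, n_bits, v_ref)
```

For example, an input of half the reference voltage maps to the mid-scale code of an ideal 3-bit converter, while a positive offset of one LSB (v_ref/8 here) lowers every output code by one, illustrating that the error is uniform across all codes rather than code-dependent.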