ABSTRACT

This chapter presents a statistical approach, based on the concept of output deviations, for selecting the most effective test patterns for small-delay defect detection. The literature indicates that, for academic benchmark circuits, the output-deviation measure for test patterns correlates strongly with the sensitization of long paths under process variation. Results on these benchmarks show significant reductions in CPU run time and pattern count, as well as higher test quality, compared with commercial timing-aware automatic test pattern generation tools. The chapter examines the applicability and effectiveness of this approach for realistic industrial circuits, and presents the adaptation of the output-deviation metric to such circuits. It first reviews the concept of output deviations and then describes how the deviation-based pattern selection method of earlier work was extended to industrial circuits. Owing to the lack of layout information, the chapter focuses only on the impact of process variations on gate delay.