ABSTRACT

This chapter provides an overview of application-specific integrated circuit (ASIC) technology. An integrated circuit (IC) is a tiny semiconductor chip on which a complex network of electronic components (e.g., transistors) and their interconnections are fabricated with a set of pattern-defining masks. The concept of integrating multiple electronic devices and their interconnections entirely on a small semiconductor chip was proposed shortly after the transistor was invented in the late 1940s. DRAM (dynamic random access memory) devices have historically been the driver of technology scaling. The smallest half-pitch of contacted metal lines allowed in a DRAM fabrication process defines the technology node, which is treated as a single, simple indicator of the industry's overall progress in IC feature scaling. Silicon, germanium, and carbon, all in column 14 of the periodic table, are elemental semiconductors. The relative ease of devising an economically manufacturable technology has led to the almost universal adoption of silicon as the primary semiconductor for the electronics industry.