ABSTRACT

Although neural networks are intrinsically parallel, traditional computers cannot fully exploit this property because their hardware is serial. With analog circuit design techniques, a large number of parallel functional units can be accommodated in a small silicon area while the precision required for neural operations is still met. A synapse circuit that performs four-quadrant multiplication has been designed; dynamically refreshed storage of the weight value makes it programmable. An input-neuron circuit that acts as a fast buffer and an output-neuron circuit with adjustable gain have also been designed. A prototype system built from these components has been successfully trained with the Generalized Delta Rule.
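For reference, the Generalized Delta Rule referred to above is the standard backpropagation weight update; a common textbook statement (not specific to this work, with learning rate $\eta$, activation $f$, and net input $\mathrm{net}_j$) is

\[
\Delta w_{ji} = \eta\,\delta_j\,o_i, \qquad
\delta_j =
\begin{cases}
(t_j - o_j)\,f'(\mathrm{net}_j) & \text{for an output neuron,}\\[4pt]
f'(\mathrm{net}_j)\sum_k \delta_k w_{kj} & \text{for a hidden neuron,}
\end{cases}
\]

where $o_i$ is the output of neuron $i$, $t_j$ the target value, and $w_{ji}$ the synaptic weight from neuron $i$ to neuron $j$.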