ABSTRACT

Linguistic theory in the generative tradition is based on a small number of simple but important observations about human languages and how they are acquired. First, the structure of language is extremely complex—so complex that it is often argued that it would be impossible to learn without prior knowledge as to its general character (Chomsky, 1965). Second, children learn languages rapidly and seemingly effortlessly. Although children are clearly limited with respect to other sorts of cognitive tasks, every normal child raised under normal circumstances learns the basic syntax of language within a few years of birth. Third, the world’s languages exhibit structural commonalities—so-called linguistic universals. Together, these observations have led many researchers to the conclusion that language involves domain-specific forms of knowledge that are largely innate. In the generative approach, the faculty of mind dedicated to language is called linguistic competence. A generative grammar is a formal description of this faculty, in the form of a system that generates the set of possible sentences of a given language, and thereby bestows on its possessor the ability to distinguish between grammatical and ungrammatical utterances. Grammars developed within this tradition (which we will call the standard approach) typically consist of primitives, operations, and principles intended to describe the knowledge of an idealized speaker-hearer in a homogeneous speech community. In this approach, cognitive representations are hierarchically structured sets of symbols, and cognitive processes are operations on them.