ABSTRACT

Our work is concerned with the gestures that occur when people speak, and with the possible role of cerebral laterality in coordinating gestures with speech. Gesture is not language, but it is part of a single system with speech and language. Unlike sign language, gesture has little syntax and no standards of well-formedness, and it is not socially regulated; these are properties of the accompanying speech. Indeed, as noted elsewhere in this volume by Singleton, Goldin-Meadow, and McNeill, when gestures must bear the full burden of communication, qualitative changes take place that move the gestures in the direction of language (see also Bloom, 1979; Dufour, 1992). Gestures with speech are nonredundant, co-expressive displays of meaning, and they constitute symbols of a type altogether different from speech. Thus, to consider gestures together with speech is to consider two types of symbol that occupy the same moment of expression. This kind of binocular vision leads to new insights into the nature of the language system itself, insights that may bear equally on spoken and signed languages.