ABSTRACT

Classically, evolutionary perspectives on the emergence of language suggest that language appeared in the context of a sudden and fortuitous genetic mutation at the very origin of Homo sapiens sapiens (Bickerton, 1995; Crow, 2002), including the view that modern humans are equipped with an innate universal grammar (Chomsky, 1975). However, recent findings have provided compelling evidence that language and action share many common features, giving rise to the idea that language more probably evolved over a long timescale, in relation to a growing motor repertoire and to social interactions (Arbib, 2005; Corballis, 2009; Hewes, 1973; Pinker and Bloom, 1990; see also Corballis, this volume). According to this perspective, the evolution from a quadrupedal to a bipedal posture is thought to have provided more flexibility for upper-limb gestures and to have favoured the emergence of gestural communication (Corballis, 2002). The production of sounds accompanying gestures may then have led to the emergence of a vocal proto-language, which evolved over the last 2.5 million years into the language we use today (Semaw et al., 1997). Thus, the production and comprehension of language is viewed as fundamentally a motor process that appeared on the basis of gestural language (Jeannerod, 2006), eventually leaving arm and hand movements more available for object manipulation and tool conception (Corballis, 2009; see also Corballis, this volume). As a consequence, spontaneous gestures often remain associated with oral communication, particularly in children at the prelinguistic stage when expressing thoughts for which they do not yet have words (Goldin-Meadow and McNeill, 1999; see also Gentilucci and Campione, this volume). 
In agreement with this progressive improvement of speech on the basis of a gestural repertoire, lateralisation of language within the brain usually involves the left hemisphere, which is also the hemisphere that controls actions of the preferred right hand for 90 per cent of the population (Corballis, 2003; Knecht et al., 2000; Skoyles, 2000). Furthermore, since language is fundamentally a gestural system, it persists in this form in signed languages (Corballis, 2009), and the right hand is generally preferred for gesturing during speech (Kimura, 1973). Finally, speech perception, whether by listening to speech or by observing speech movements, activates motor structures involved in speech production (Fadiga et al., 2002; Kerzel and Bekkering, 2000). Several authors have thus proposed that activation of brain motor areas during language perception contributes to language comprehension (Andres et al., 2008; Fischer and Zwaan, 2008; Pulvermüller, 2005). To date, although many questions remain open, a great deal of evidence supports the idea that language must be viewed as an embodied system (Barsalou, 2008), rather than as a system based essentially on amodal abstract symbols (Corballis, 2009; Fischer and Zwaan, 2008; see also Corballis, this volume). Theories of embodiment suggest a large overlap between the processes involved in perception and action and those involved in language processing (Barsalou, 2008). Hereafter, we will review some of the arguments supporting this view. First, we will show that the verbal description of objects is influenced by how these objects are coded in terms of action. Then, we will provide evidence that semantic information in language can compete with object-related spatial information in specifying the goal of intentional action as well as behavioural response parameters. Finally, we will show that action-related language can influence how we perceive human actions.