ABSTRACT
The human perceptual system has evolved to integrate information from each of the senses (i.e., vision, audition, somatosensation, olfaction, and gustation). Yet, in everyday life, hearing and seeing tend to be treated as separate, independent entities. When we exercise our visual faculty, as when searching for a friend at a crowded party, we think of this task as purely visual. Similarly, having found the friend, when we strain our ears to hear his or her words in the din of the party, we think we are merely exercising our auditory abilities. Most of us tend to think of the senses as independent. After all, many of us know someone who has lost one or the other sense: a friend or relative who is blind, or someone who is deaf. But, among those who have both senses, little thought is given to the influence of one sense on the other. As discussed in this chapter, however, vision, audition, and touch (tactile abilities) can have remarkable influences on each other. The senses are interdependent. Hearing a sound helps us direct visual attention to the spatial location of the sound (Perrott, Cisneros, McKinley, & D'Angelo, 1996). A visual stimulus can enhance our ability to interpret auditory information (e.g., understanding speech, judging distance). In fact, the auditory cortex can be activated by silent lipreading, when absolutely no sound is present (see Calvert, Bullmore, Brammer, & Campbell, 1997). This chapter examines a number of such cross-modal interactions (primarily between vision and hearing), as well as issues concerning language processing by eye and ear.