ABSTRACT

The sign languages of Deaf people convey a full range of meanings. They go beyond the gestural systems that speaking individuals use to communicate without speech, since they are full-fledged languages in their own right. A critical neurosemiotic question thus arises: What are the brain mechanisms that extract meaning from such gestural body actions? How do these actions come to signify? In this chapter I outline how discrete neural pathways can parse observed bodily actions during sign-language processing, affording a key neurosemiotic distinction between “things” and “acts.” This functional separation of cortical pathways suggests a biological basis for how meaning-bearing events become embedded in languages, including those that are signed.