ABSTRACT

Word Grammar (WG) agrees with other theories in the family of cognitive linguistics that we use the same mental apparatus for language as for other kinds of knowledge, but some of its assumptions about this knowledge are distinctive and lead to distinctive linguistic analyses. Language is a single integrated network of atomic nodes, and so is the structure of a sentence: a rich dependency structure rather than a tree. The underlying logic is default inheritance applied to taxonomies that include relational concepts as well as entities, so the language network contains an open-ended taxonomy of relations; and these taxonomies extend upwards into general knowledge as well as downwards into the tokens and ‘sub-tokens’ of performance. These sub-tokens interact with dependency structure to create a new token of the head word for each dependent, yielding a new compromise between dependency structure and phrase structure. The cognitive assumptions also lead to insightful analyses of logical semantics, lexical semantics, learning, processing, and sociolinguistic structures.