ABSTRACT

While the assessment of semantic phenomena forms a very important step in developing a semantic theory, theorizing itself only starts when one tries to explain the phenomena. For example, the senses of uncle and aunt are related in the same way as those of sister and brother, and Violetta is an aunt of Christopher logically entails Christopher is a nephew of Violetta (provided Christopher refers to a male person and Violetta to a female one). But how can this be derived from the meanings of the relevant terms? (See the sketch below.) Likewise, verbs carry selectional restrictions for their arguments (6.7). How can this be accounted for in a principled way? There are many different semantic theories. Each one defines the notion of meaning differently and uses different means for the representation of meanings, some formal, some informal. Almost all of them, however, share a basic strategy of explaining the semantic data, a strategy that ties in with our intuitive notion of meaning: they assume that the meanings of most words are complex, composed of more general components. The meaning of a lexical item is analysed by identifying its components and the way in which they are combined. In a sense, the process of semantic analysis is the converse of the process of composition by which we determine the meaning of a complex expression on the basis of its components and its structure. Therefore, analysis into meaning components is called decomposition. A decompositional theory allows the pursuit of the following objectives:

M Meaning. Providing models for word meanings. What kind of entities are lexical meanings? What is their structure? How are meanings to be represented?
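
As a rough illustration of what such a derivation could look like (it is not the chapter's own formalism), the aunt/nephew entailment above can be made to fall out of decompositions into more general components. The meaning postulates below are deliberately simplified: aunts and uncles by marriage are ignored, and SIBLING is taken to be symmetric.

```latex
% Hypothetical, simplified meaning postulates (illustration only):
\begin{align*}
\mathit{aunt}(x,y)   &\;\leftrightarrow\; \mathrm{FEMALE}(x) \wedge \exists z\,\bigl(\mathrm{SIBLING}(x,z) \wedge \mathrm{PARENT}(z,y)\bigr)\\
\mathit{nephew}(y,x) &\;\leftrightarrow\; \mathrm{MALE}(y) \wedge \exists z\,\bigl(\mathrm{SIBLING}(x,z) \wedge \mathrm{PARENT}(z,y)\bigr)\\
% The two right-hand sides differ only in the gender component, so:
\mathit{aunt}(\mathrm{Violetta},\mathrm{Christopher}) \wedge \mathrm{MALE}(\mathrm{Christopher})
  &\;\models\; \mathit{nephew}(\mathrm{Christopher},\mathrm{Violetta})
\end{align*}
```

On such a view the entailment does not have to be stipulated for each pair of words: it follows from the shared components plus the extra premise that Christopher is male, exactly as the proviso in the example requires.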