ABSTRACT

The nature of computational linguistics (CL) has changed radically and repeatedly through the last three decades. From the ATN-based implementations of transformational grammar in the 1960s, through the explicitly linguistics-free paradigm of Conceptual Dependencies, to the influence and applications of 1980s-era unification-based frameworks, CL has alternated between defining itself in terms of and in opposition to mainstream theoretical linguistics. Since the late 1980s, it seems that a growing group of CL practitioners has once more turned away from formal theory. In response to the demands imposed by the analysis of large corpora of linguistic data, statistical techniques have been adopted in CL which emphasize shallow, robust accounts of linguistic phenomena at the expense of the detail and formal complexity of current theory. Nevertheless, we argue in this chapter that the two disciplines, as currently conceived, are mutually relevant. While it is indisputable that the granularity of current linguistic theory is lost in a shift toward shallow analysis, the basic insights of formal linguistic theory are invaluable in informing the investigations of computational linguists; and while corpus-based techniques seem rather far removed from the concerns of current theory, modern statistical techniques in CL provide very valuable insights about language and language processing, insights which can inform the practice of mainstream linguistics.