ABSTRACT

Deterministic parsing promises to (almost) never backtrack. Neural network technology promises generalization, competition, and learning capabilities. The marriage of these two ideas is being investigated in an experimental natural language parsing system that combines some of the best features of each. The result is a deterministic parser that learns, generalizes, and supports competition among structures and lexical interpretations.

The performance of the parser is being evaluated on predicted as well as unpredicted sentence forms. Several mildly ungrammatical sentences have been successfully processed into structures judged reasonable when compared to their grammatical counterparts. Lexical ambiguities pose problems for traditional parsers, or at least force additional backtracking; with neural networks, such ambiguities can be resolved through the wider syntactic context. These results demonstrate the potential of this approach to natural language parsing.