ABSTRACT

This article investigates simple mental arithmetic from a computational perspective and proposes an associative connectionist model that integrates semantic and symbolic representations of numbers. To simulate simple addition, we trained neural networks on addition facts encoded both semantically and symbolically. Addition problems were then solved by presenting only the symbolic representations of the operands and retrieving the sum. The networks exhibited the benchmark problem-size and tie effects and accounted for a large proportion of the variance in human addition reaction times (RTs). Examining the networks during retrieval, we found that they relied exclusively on the semantic “computational core”. We conclude that simple mental arithmetic is a semantic process, and that verbal/Arabic numbers mainly serve as an interface.
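
To make the training-and-retrieval scheme concrete, the following is a minimal sketch, not the authors' model: it assumes a Gaussian number-line code for the semantic representation, one-hot codes for the symbols, a one-hidden-layer network trained by gradient descent, and training trials presented both with and without the semantic operand code so that symbolic-only probes can be answered at test. All of these choices are illustrative assumptions, not details taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
N_MAX = 9                      # operands 0..9
N_SUM = 2 * N_MAX              # sums 0..18
UNITS = 30                     # length of the assumed semantic number-line code

def semantic(n, length=UNITS, width=1.0):
    """Gaussian 'bump' centred on the magnitude of n (assumed semantic code)."""
    pos = n / N_SUM * (length - 1)
    return np.exp(-0.5 * ((np.arange(length) - pos) / width) ** 2)

def symbolic(n):
    """One-hot place code standing in for the Arabic/verbal symbol."""
    v = np.zeros(N_MAX + 1)
    v[n] = 1.0
    return v

def encode(a, b, with_semantics=True):
    """Input vector: [semantic(a), symbolic(a), semantic(b), symbolic(b)]."""
    sa = semantic(a) if with_semantics else np.zeros(UNITS)
    sb = semantic(b) if with_semantics else np.zeros(UNITS)
    return np.concatenate([sa, symbolic(a), sb, symbolic(b)])

# Training set: every addition fact, presented once with and once without the
# semantic operand code; the target is always the semantic code of the sum.
facts = [(a, b) for a in range(N_MAX + 1) for b in range(N_MAX + 1)]
X = np.array([encode(a, b, s) for a, b in facts for s in (True, False)])
Y = np.array([semantic(a + b) for a, b in facts for _ in (True, False)])

# One-hidden-layer associative network trained by full-batch gradient descent.
H, lr = 60, 0.1
W1 = rng.normal(0, 0.1, (X.shape[1], H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, UNITS));      b2 = np.zeros(UNITS)
for _ in range(10_000):
    h = np.tanh(X @ W1 + b1)              # hidden layer
    err = (h @ W2 + b2) - Y               # linear output, squared-error residual
    dW2, db2 = h.T @ err, err.sum(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    dW1, db1 = X.T @ dh, dh.sum(0)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g / len(X)

def retrieve(a, b):
    """Probe with the symbolic operands only and decode the nearest sum code."""
    out = np.tanh(encode(a, b, with_semantics=False) @ W1 + b1) @ W2 + b2
    prototypes = np.array([semantic(s) for s in range(N_SUM + 1)])
    return int(np.argmin(((prototypes - out) ** 2).sum(axis=1)))

accuracy = np.mean([retrieve(a, b) == a + b for a, b in facts])
print(f"symbolic-only retrieval accuracy: {accuracy:.2%}")
```

In this toy setup, retrieval from purely symbolic inputs succeeds only because the network has learned to map the symbols onto the semantic sum code, which is the kind of semantic mediation the abstract refers to; the specific codes, architecture, and training regime here are placeholders.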