ABSTRACT

Children appear to be sensitive to a variety of partially informative "cues" during language acquisition, but little attention has been paid to how these cues may be integrated to aid learning. Borrowing the notion of learning with "hints" from the engineering literature, we employ neural networks to explore the idea that such cues may serve as hints for one another. A first set of simulations shows that when two equally complex but related functions are learned simultaneously rather than individually, they can help bootstrap one another (as hints), resulting in faster and more uniform learning. In a second set of simulations we apply the same principle to the problem of word segmentation, integrating two types of information hypothesized to be relevant to this task. Integrating the cues in a single network leads to a sharing of computational resources that permits the cues to serve as hints for each other: our simulation results show that each task facilitates the learning (i.e., bootstrapping) of the other, even when neither cue is sufficient on its own.
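The core mechanism described above — two related tasks trained through a shared hidden layer, so that gradients from each task shape representations useful to the other — can be sketched in a few lines. This is only an illustration of the general multi-task ("hints") idea, not the architecture or data used in the paper's simulations; the network size, learning rate, and the choice of XOR and OR as the two related functions are our own assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's model): one hidden layer shared by
# two output units, trained jointly on two related binary functions.
# Because both tasks backpropagate error into the same hidden layer, each
# task's error signal helps shape representations useful to the other.

rng = np.random.default_rng(0)

# Two related target functions over 2-bit inputs (assumed for illustration):
# column 0 = XOR, column 1 = OR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0, 0], [1, 1], [1, 1], [0, 1]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Shared hidden layer (8 units) feeding two task-specific output units.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 2)); b2 = np.zeros(2)

def mse():
    H = sigmoid(X @ W1 + b1)
    P = sigmoid(H @ W2 + b2)
    return float(np.mean((P - Y) ** 2))

loss_before = mse()
lr = 1.0
for _ in range(5000):
    H = sigmoid(X @ W1 + b1)
    P = sigmoid(H @ W2 + b2)
    dP = (P - Y) * P * (1 - P)        # output deltas (MSE + sigmoid)
    dH = (dP @ W2.T) * H * (1 - H)    # hidden deltas: error from BOTH
                                      # tasks mixes in the shared layer
    W2 -= lr * H.T @ dP / len(X); b2 -= lr * dP.mean(axis=0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)

loss_after = mse()
print(loss_before, loss_after)
```

Training the two functions side by side in one network, rather than in two separate networks, is the "sharing of computational resources" referred to above; the joint error signal in the shared layer is what lets one cue act as a hint for the other.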