ABSTRACT

This chapter describes a new technique in which BCH error-correcting codes are employed as a distributed output representation for multiclass learning. The chapter begins with a study in which the decision-tree algorithm ID3 is applied to the NETtalk task using three different output representations: the direct multiclass approach, the one-per-class approach, and the distributed output-code approach. In these experiments, classification ties were broken in favor of the phoneme/stress pair that appeared more frequently in the training data. The technique is also evaluated on an isolated-letter speech-recognition task, in which the "name" of a single letter is spoken by an unknown speaker and must be assigned to one of 26 classes corresponding to the letters of the alphabet. The results show that the error-correcting output representations improve the performance of ID3 on the NETtalk task and of backpropagation on the isolated-letter task. Together, these experiments demonstrate that error-correcting output codes provide an excellent method for applying binary learning algorithms to multiclass learning problems.
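The core idea can be sketched as follows: each class is assigned a binary codeword, one binary learner is trained per bit, and a test example is assigned to the class whose codeword is nearest in Hamming distance to the predicted bit vector. The 7-bit codewords below are a toy illustration, not the chapter's actual BCH codes, and the bit vector is supplied directly rather than predicted by trained learners.

```python
# Illustrative sketch of error-correcting output code (ECOC) decoding.
# The codewords below are a toy example chosen so that every pair of
# codewords differs in at least 4 bit positions, which allows any
# single-bit prediction error to be corrected.
CODEWORDS = {
    "A": (0, 0, 0, 0, 0, 0, 0),
    "B": (1, 1, 1, 0, 0, 0, 1),
    "C": (0, 0, 1, 1, 1, 0, 1),
    "D": (1, 1, 0, 1, 1, 0, 0),
}

def hamming(u, v):
    """Number of bit positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def decode(bits):
    """Assign the class whose codeword is nearest in Hamming distance.

    In the full scheme, each bit of `bits` would be the output of a
    separately trained binary learner (e.g., ID3 or backpropagation);
    here the bit vector is given directly for illustration.
    """
    return min(CODEWORDS, key=lambda c: hamming(CODEWORDS[c], bits))

# A one-bit error in B's codeword is still decoded as B, because the
# corrupted vector remains closer to B's codeword than to any other.
print(decode((1, 1, 1, 0, 0, 1, 1)))  # prints "B"
```

Because the minimum pairwise Hamming distance between codewords here is 4, the decoder can recover from one incorrect bit prediction; the chapter's BCH codes are chosen to guarantee such separation for larger numbers of classes.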