ABSTRACT

This chapter introduces RapidMiner's Neural Net operator for supervised learning and RapidMiner's Self-Organizing Map (SOM) operator for unsupervised clustering. The workflow-based nature of RapidMiner allows a great deal of flexibility in experimenting with these operators. RapidMiner's Neural Net operator uses backpropagation learning to build supervised models for estimation, classification, and prediction. Several of RapidMiner's attribute preprocessing operators are of particular interest, including operators that eliminate correlated and irrelevant attributes. To design the network architecture, one can modify default settings for the number of hidden layers, the number of hidden-layer nodes, and the number of training epochs. Several additional parameters can be adjusted to help build accurate neural net models. The chapter uses a simple five-step approach for supervised neural net learning: identify the goal; prepare the data to be mined; define the network architecture; train the network; and read and interpret the summary results.
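Since RapidMiner processes are built graphically rather than in code, the sketch below is only an illustrative analogue of the five-step approach using Python and scikit-learn; the dataset, the single 10-node hidden layer, the epoch limit, and the omission of correlated-attribute removal are all assumptions made for brevity, not details taken from the chapter.

```python
# Illustrative analogue of the chapter's five-step supervised neural net approach.
# RapidMiner itself is workflow/GUI-based; this dataset and these parameter
# choices are assumptions used only to mirror the steps described above.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import classification_report

# Step 1: identify the goal -- here, a binary classification task.
X, y = load_breast_cancer(return_X_y=True)

# Step 2: prepare the data to be mined -- split and normalize attributes;
# removal of correlated and irrelevant attributes (as RapidMiner's
# preprocessing operators provide) is omitted for brevity.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Step 3: define the network architecture -- one hidden layer of 10 nodes.
# Step 4: train the network with backpropagation for up to a fixed number of epochs.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=42),
)
model.fit(X_train, y_train)

# Step 5: read and interpret the summary results.
print(classification_report(y_test, model.predict(X_test)))
```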