ABSTRACT

Nonlinear models are regression models in which the regression function is nonlinear in the parameters. Their benefits over linear models are discussed. Nonlinear models can be parameterized in many different ways, and the role of parameterization is discussed and demonstrated through examples. With a nonlinear model, the user must provide starting values for the parameters, and convergence problems often prevent blind application of the models. Furthermore, the residual errors may be heteroscedastic and dependent. Practical solutions to these issues are presented and illustrated. Often the parameters can be explained by second-order predictors. An extension to nonlinear mixed-effects models, obtained by introducing random effects into some parameters, is presented. Estimation of the fixed-effects model using a modified Gauss-Newton method, and of the mixed-effects model using the Lindstrom-Bates algorithm, is presented. Both methods are based on linearization of the model through a Taylor approximation. Three alternative methods for predicting the y-variable are discussed. It is suggested that the variation of the nonlinear curves caused by random parameters can be illustrated using principal components of the random parameters. A total of 14 examples illustrate the concepts and their use with a real-life data set.
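As a minimal illustration of the points above about starting values and iterative fitting, the sketch below fits a hypothetical two-parameter saturating growth curve with SciPy's `curve_fit`, which uses a Levenberg-Marquardt-type least-squares iteration closely related to the Gauss-Newton method described here. The model form, parameter values, and data are invented for the example and are not taken from the text.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model: y = a * (1 - exp(-b * x)),
# nonlinear in the parameters a and b.
def growth(x, a, b):
    return a * (1.0 - np.exp(-b * x))

# Simulated data (invented for illustration).
rng = np.random.default_rng(1)
x = np.linspace(0.5, 10.0, 40)
y = growth(x, 30.0, 0.4) + rng.normal(0.0, 0.5, size=x.size)

# Starting values are required; poor choices can cause
# the iteration to fail to converge.
p0 = [20.0, 0.1]
params, cov = curve_fit(growth, x, y, p0=p0)
print(params)  # estimates of (a, b)
```

The fitted estimates should lie near the simulating values (a = 30, b = 0.4); restarting from very different `p0` values is a common way to check that convergence is not to a spurious solution.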