ABSTRACT

This chapter discusses several approaches to curve fitting: linear regression, nonlinear regression, and kernel smoothing. Linear regression is a widely used curve-fitting technique, and confidence intervals can be generated for the resulting fits. As with the earlier linear regression problem, a nonlinear regression problem can also be treated as an optimization problem in which an objective function is minimized. Kernel smoothing methods are useful in cases where polynomials or other parametric functions are not appropriate. A kernel smoothing method estimates the fit at a specific point by weighting nearby data points more strongly than data points farther away. Such a fit is said to be "data driven" in that it is not tied to any particular functional form. The performance of the local polynomial method depends on the bandwidth and the order of the polynomial, so these parameters must be varied to obtain a reasonable fit.
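As a minimal illustration of the kernel smoothing idea described above, the sketch below implements a simple Nadaraya-Watson style estimator with a Gaussian kernel; the function name, bandwidth value, and sample data are illustrative choices, not taken from the chapter.

```python
import math

def kernel_smooth(x_data, y_data, x0, bandwidth):
    """Estimate the fit at point x0 as a weighted average of y_data,
    where nearby x values receive larger Gaussian weights than
    points farther away (a Nadaraya-Watson style smoother)."""
    weights = [math.exp(-0.5 * ((x0 - x) / bandwidth) ** 2) for x in x_data]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, y_data)) / total

# Noisy samples of y = x^2 on [0, 2] (alternating-sign noise for illustration)
xs = [i * 0.1 for i in range(21)]
ys = [x * x + 0.05 * ((-1) ** i) for i, x in enumerate(xs)]

# Estimated fit near x = 1.0, where the true curve value is 1.0
fit = kernel_smooth(xs, ys, 1.0, 0.3)
```

Shrinking the bandwidth makes the fit track the data more closely (lower bias, higher variance), while widening it produces a smoother curve; this is the bandwidth trade-off the chapter refers to.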