ABSTRACT

Resampling and simulation methods for regression models draw on several of the techniques covered in previous chapters. In Chapter 8 we saw an application of the leave-one-out jackknife for model selection in regression. That illustration of cross-validation is a simple example of a powerful method widely applied in statistical learning. Linear models are fundamental in statistics, so this chapter discusses several examples of simulation for regression models. Jackknife-after-bootstrap is another useful technique that we have already applied implicitly in Chapter 8, through the computational methods underlying the BCa bootstrap confidence intervals. In this chapter we discuss some details and applications of the jackknife-after-bootstrap, such as empirical influence values. Many references are available on these topics. For resampling and linear models, Faraway [94, 95], Fox [100], and Davison and Hinkley [68] are good resources that feature implementation in R. Also see Chapter 3 in James et al. [157] and Chapter 6 in Venables and Ripley [293]. Other textbooks on applied regression analysis include Fox [100], Kutner et al. [173], Mendenhall and Sincich [202], Montgomery et al. [211], and Weisberg [310]. See James et al. [157] for an accessible introduction to variable selection, validation, cross-validation, and applications in data mining.
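
As a small illustration of the jackknife-after-bootstrap idea mentioned above (not the chapter's own example), the following R sketch assumes the boot package and its city data set: a ratio-of-means statistic is bootstrapped, its empirical influence values are estimated from the bootstrap output, and the jackknife-after-bootstrap diagnostic plot is drawn.

    library(boot)                                        # provides boot, empinf, jack.after.boot
    ratio <- function(d, i) mean(d$x[i]) / mean(d$u[i])  # ratio-of-means statistic on resampled rows
    b <- boot(city, statistic = ratio, R = 999)          # ordinary nonparametric bootstrap
    empinf(b)                                            # empirical influence values from the replicates
    jack.after.boot(b)                                   # jackknife-after-bootstrap diagnostic plot

Here empinf() estimates the influence of each observation on the statistic, and jack.after.boot() shows how the bootstrap distribution changes when each observation is left out in turn; both ideas are developed in detail later in the chapter.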