ABSTRACT

This chapter introduces linear regression. It begins by distinguishing simple from multiple regression and defining key terminology (e.g., predictor and outcome variables). The utility of multiple regression is then discussed, focusing on the ability to examine the unique predictive power of one variable while controlling for the effects of other predictor variables. The “In Depth” section of the chapter begins with the assumptions of regression, including some methods for addressing violations of those assumptions. Next, the formula for calculating a simple linear regression is presented, and the concept of the ordinary least-squares regression line is explained. The regression coefficient, the intercept, and the regression equation for calculating predicted values of Y are then explained and illustrated with a worked example, followed by the assumptions about the residuals in regression. The remainder of the chapter is devoted to multiple regression. After a description of why multiple regression is so powerful, examples of its use in published research are presented, followed by a detailed explanation of Statistical Package for the Social Sciences (SPSS) output for a multiple regression analysis. The multiple R coefficient, R², beta (standardized) coefficients, and unstandardized regression coefficients are identified and interpreted. The chapter concludes with another multiple regression example using SPSS output.
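The simple-regression quantities the abstract mentions (the ordinary least-squares regression coefficient, the intercept, and predicted values of Y) can be sketched in a few lines of code. This is a minimal illustration using made-up data, not an example from the chapter:

```python
# Ordinary least-squares (OLS) simple linear regression, computed from
# the standard formulas: b = sum((X - mean_x)(Y - mean_y)) / sum((X - mean_x)^2)
# and a = mean_y - b * mean_x, giving the regression equation Y-hat = a + bX.

def ols_simple(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Regression coefficient (slope of the least-squares line)
    b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
        / sum((xi - mean_x) ** 2 for xi in x)
    # Intercept: predicted Y when X = 0
    a = mean_y - b * mean_x
    return a, b

# Hypothetical predictor and outcome scores (illustrative only)
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

a, b = ols_simple(x, y)
predicted = [a + b * xi for xi in x]   # predicted values of Y
residuals = [yi - yhat for yi, yhat in zip(y, predicted)]
```

The residuals (observed Y minus predicted Y) are the quantities whose assumptions, such as normality and constant variance, the chapter discusses.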