ABSTRACT

This chapter algebraically formulates the least squares fitting of n observations to a straight-line equation, where the solution slope and intercept parameters are determined so that the sum of the squared deviations of the observations from their straight-line predictions is a minimum. As measures of the fit, [1] the solution variances are propagated for uniform and variable observation errors, [2] confidence intervals or error bars are developed for the slope, intercept, and line predictions, [3] the correlation coefficient between the observations and predictions is determined, along with its statistical significance given the number of observations involved, and [4] the ANOVA table is constructed to test the regression's statistical significance. Classical linear regression assumes constant error in the observations only; for observations with variable uncertainties, weighted linear regression obtains the line weighted by the inverse of the data variances, as sketched below. Alternate approaches for observations with variable uncertainties in both the dependent and independent variables include [1] reversing the variables in the regression to map out the possible solution space, [2] the least squares cubic line, which minimizes deviations perpendicular to the line, and [3] the reduced major axis line, which minimizes the areas of the right triangles formed between the line and the data point apexes.
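
As a minimal sketch of the solution the abstract describes (the symbols x_i, y_i, a, b, and w_i are standard notation assumed here, not taken from the chapter), minimizing the weighted sum of squared deviations over the intercept a and slope b gives the normal-equation solution

\[
S = \sum_{i=1}^{n} w_i \,\bigl(y_i - a - b\,x_i\bigr)^2 ,
\qquad w_i = \frac{1}{\sigma_i^2},
\]
\[
b = \frac{\sum w_i \sum w_i x_i y_i \;-\; \sum w_i x_i \sum w_i y_i}
         {\sum w_i \sum w_i x_i^2 \;-\; \bigl(\sum w_i x_i\bigr)^2},
\qquad
a = \frac{\sum w_i y_i \;-\; b \sum w_i x_i}{\sum w_i},
\]

where setting every weight w_i = 1 recovers the classical (uniform-error) regression line.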