
Appendix A Mathematics Behind the Classical Linear Regression Model

A.1 Computing Regression Parameters in Simple Linear Regression

To compute the estimates $\hat{\beta}_0$ and $\hat{\beta}_1$, we have to minimise the sum of the squared residuals, that is,

$$\sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2 .$$

The reader hopefully remembers from high school mathematics that a minimisation task in several variables can be solved by taking the derivative with respect to each variable and setting the resulting expressions equal to 0. Hence, we have to take the derivatives of the above sum of squares with respect to $\beta_0$ and $\beta_1$. Using

$$\frac{\partial}{\partial \beta_0} (y_i - \beta_0 - \beta_1 x_i)^2 = -2\,(y_i - \beta_0 - \beta_1 x_i)
\quad\text{and}\quad
\frac{\partial}{\partial \beta_1} (y_i - \beta_0 - \beta_1 x_i)^2 = -2\,x_i\,(y_i - \beta_0 - \beta_1 x_i),$$
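These two partial derivatives can be sanity-checked numerically. The following pure-Python sketch compares a central finite difference against the closed-form expressions for a single squared residual at an arbitrary sample point; all numeric values are illustrative, not part of the derivation.

```python
# Illustrative sample point (arbitrary values).
y, x = 5.0, 2.0
b0, b1 = 1.0, 1.5

def sq_residual(b0, b1):
    # A single term (y_i - beta0 - beta1*x_i)^2 of the sum of squares.
    return (y - b0 - b1 * x) ** 2

# Central finite differences approximating the partial derivatives.
h = 1e-6
num_d_b0 = (sq_residual(b0 + h, b1) - sq_residual(b0 - h, b1)) / (2 * h)
num_d_b1 = (sq_residual(b0, b1 + h) - sq_residual(b0, b1 - h)) / (2 * h)

# Closed-form expressions from the text.
exact_d_b0 = -2 * (y - b0 - b1 * x)       # -2(y_i - beta0 - beta1*x_i)
exact_d_b1 = -2 * x * (y - b0 - b1 * x)   # -2*x_i*(y_i - beta0 - beta1*x_i)
```

For a quadratic function the central difference is exact up to floating-point rounding, so the numerical and closed-form values agree to high precision.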

we obtain that $\hat{\beta}_0$ and $\hat{\beta}_1$ are the solutions of the equations
$$\sum_{i=1}^{n} -2\,(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0
\quad\text{and}\quad
\sum_{i=1}^{n} -2\,x_i\,(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0 .$$

We can rewrite these equations as
$$n\hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i
\quad\text{and}\quad
\hat{\beta}_0 \sum_{i=1}^{n} x_i + \hat{\beta}_1 \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i .$$

Dividing both equations by $n$ and using the abbreviations
$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \quad
\bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i, \quad
\overline{yx} = \frac{1}{n}\sum_{i=1}^{n} x_i y_i, \quad\text{and}\quad
\overline{x^2} = \frac{1}{n}\sum_{i=1}^{n} x_i^2,$$

we obtain
$$\hat{\beta}_0 + \hat{\beta}_1 \bar{x} = \bar{y}
\quad\text{and}\quad
\hat{\beta}_0 \bar{x} + \hat{\beta}_1 \overline{x^2} = \overline{yx} .$$
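Solving this pair of linear equations for the two unknowns gives $\hat{\beta}_1 = (\overline{yx} - \bar{x}\bar{y})/(\overline{x^2} - \bar{x}^2)$ and $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}$. A minimal pure-Python sketch of this closed-form computation, on a small illustrative data set (the numbers are invented for demonstration):

```python
# Illustrative data, roughly following y = 2x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
x_bar = sum(xs) / n                              # mean of x
y_bar = sum(ys) / n                              # mean of y
yx_bar = sum(x * y for x, y in zip(xs, ys)) / n  # mean of x*y
x2_bar = sum(x * x for x in xs) / n              # mean of x^2

# Solving beta0 + beta1*x_bar = y_bar and
# beta0*x_bar + beta1*x2_bar = yx_bar for the two unknowns:
beta1 = (yx_bar - x_bar * y_bar) / (x2_bar - x_bar ** 2)
beta0 = y_bar - beta1 * x_bar
```

The denominator $\overline{x^2} - \bar{x}^2$ is the (biased) sample variance of $x$, so the slope estimate is defined whenever the $x_i$ are not all identical.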