Chapter

# Least-squares and Gauss–Markov theory

## ABSTRACT

THEOREM 2.1.1 Let $X$ be any $n \times k$ matrix. Then the sum of squares $e'e = (y - Xb)'(y - Xb)$ is minimized with respect to $b$ if and only if $b$ is a solution of the "normal equations"

$$X'Xb = X'y. \tag{n}$$

Proof: Let $\hat{b}$ be a solution of (n), and write

$$e = y - Xb = y - X\hat{b} + X(\hat{b} - b).$$

Then

$$e'e = (y - X\hat{b})'(y - X\hat{b}) + (\hat{b} - b)'X'X(\hat{b} - b) + (y - X\hat{b})'X(\hat{b} - b) + (\hat{b} - b)'X'(y - X\hat{b}).$$

The last two terms on the right vanish, since $X'(y - X\hat{b}) = 0$ from (n), and the first term is a constant. Therefore, $e'e$ is minimized with respect to $b$ when the second term is so minimized. Writing this term as $u'u$, where $u = X(\hat{b} - b)$, we see that it vanishes for $b = \hat{b}$ and, being a sum of squares, it vanishes if and only if $u = 0$,

that is, if and only if $Xb = X\hat{b}$, which holds if and only if $b$ is itself a solution of (n).
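The theorem lends itself to a quick numerical check. The sketch below (not from the text; it uses NumPy with arbitrary simulated data) solves the normal equations $X'Xb = X'y$ directly and verifies both properties the proof establishes: the residual is orthogonal to the columns of $X$, and perturbing $\hat{b}$ can only increase the sum of squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: n = 50 observations, k = 3 regressors.
n, k = 50, 3
X = rng.normal(size=(n, k))
y = rng.normal(size=n)

# Solve the normal equations X'X b = X'y for b-hat.
b_hat = np.linalg.solve(X.T @ X, X.T @ y)

def sse(b):
    """Sum of squares e'e = (y - Xb)'(y - Xb)."""
    e = y - X @ b
    return e @ e

# The residual at b-hat is orthogonal to X: X'(y - X b-hat) = 0.
assert np.allclose(X.T @ (y - X @ b_hat), 0.0)

# Any other b gives a sum of squares at least as large,
# since e'e = sse(b_hat) + u'u with u = X(b_hat - b).
for _ in range(100):
    b_other = b_hat + rng.normal(scale=0.1, size=k)
    assert sse(b_other) >= sse(b_hat)
```

Solving the normal equations directly, as above, mirrors the theorem; in practice a QR- or SVD-based routine such as `numpy.linalg.lstsq` is numerically preferable when $X'X$ is ill-conditioned.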