ABSTRACT
In this chapter we develop algorithms for computing the matrices L and L^{-1} that arise in the Cholesky factorization of Var(y), where y is the vector of responses from a state-space model. There are several reasons for considering this particular problem. Our approach to the signal-plus-noise prediction problem revolves around the innovation vectors that are computed by applying the Gram-Schmidt orthogonalization method to the response vectors. In Section 1.2.3 we defined the innovations recursively by starting with ε(1) = y(1) and then successively computing
ε(t) = y(t) − ∑_{j=1}^{t−1} L(t, j) ε(j),    (3.1)
for t = 2, ..., n, with L(t, j) being the matrix in the jth block column of the tth block row of the lower triangular matrix L in the Cholesky decomposition
Var(y) = L R L^T.    (3.2)
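As a small numerical sketch (not from the text), a decomposition of the form (3.2), with L unit lower triangular and R diagonal, can be obtained in the scalar-response case from an ordinary Cholesky factor by pulling out its diagonal. The AR(1)-style covariance matrix below is an arbitrary illustrative choice:

```python
import numpy as np

# Hypothetical example covariance for Var(y): AR(1)-type, chosen for illustration.
rho, n = 0.6, 5
V = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

C = np.linalg.cholesky(V)   # V = C C^T, with C lower triangular
d = np.diag(C)              # positive diagonal of C
L = C / d                   # unit lower triangular factor (divide each column j by d[j])
R = np.diag(d ** 2)         # diagonal factor of innovation variances

# Check the decomposition Var(y) = L R L^T of (3.2).
assert np.allclose(L @ R @ L.T, V)
assert np.allclose(np.diag(L), 1.0)
```

In the general state-space setting the entries L(t, j) are matrix blocks and R is block diagonal, but the scalar case already shows how L and R relate to the standard Cholesky factor.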
Thus, computation of the innovations is intimately linked to the computation of L. Equation (3.1) arises from the forward-substitution step for solving the lower triangular linear system Lε = y; that is,

ε = L^{-1} y.
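To see that the recursion (3.1) and forward substitution produce the same vector, here is a brief numerical check with made-up data (in the scalar case each block L(t, j) is just the (t, j) entry of L):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Hypothetical unit lower triangular L and response vector y (illustrative only).
L = np.tril(rng.standard_normal((n, n)), k=-1) + np.eye(n)
y = rng.standard_normal(n)

# Innovations via the recursion (3.1): start with eps(1) = y(1), then subtract
# the L-weighted combination of the earlier innovations from each y(t).
eps = np.empty(n)
eps[0] = y[0]
for t in range(1, n):
    eps[t] = y[t] - L[t, :t] @ eps[:t]

# Forward substitution for L eps = y gives the same vector: eps = L^{-1} y.
assert np.allclose(eps, np.linalg.solve(L, y))
```

The recursion is exactly forward substitution carried out one row at a time, which is why computing the innovations costs no more than one triangular solve.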