ABSTRACT

This chapter discusses how to compute the Gaussian likelihood for a linear state-space (SS) model and shows how the presence of unit roots and exogenous inputs critically affects its evaluation. The basic idea is to write the likelihood function in a specific form known as the "prediction error decomposition." An alternative to the diffuse likelihood is a Gaussian likelihood conditional on the minimum subsample required to eliminate the effect of the diffuse states. The chapter presents the minimally conditioned likelihood, a procedure that draws many ideas from the diffuse likelihood but avoids its scaling problem. It then examines the effect of exogenous inputs on the calculation of the likelihood and concludes with a discussion of models with deterministic inputs.
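As a minimal illustration of the prediction error decomposition, the sketch below runs a Kalman filter over a linear SS model and accumulates the Gaussian log-density of each one-step-ahead prediction error (innovation). This is a generic sketch, not the chapter's own algorithm or notation: the system matrices `T`, `Z`, `H`, `Q`, the local-level usage example, and the large `P0` (a crude stand-in for a diffuse initial state) are all illustrative assumptions.

```python
import numpy as np

def kalman_loglik(y, T, Z, H, Q, a0, P0):
    """Gaussian log-likelihood of a linear SS model via the
    prediction error decomposition (illustrative sketch).

    State:       alpha[t+1] = T alpha[t] + state noise, Var = Q
    Observation: y[t]       = Z alpha[t] + obs noise,   Var = H
    """
    a, P = a0, P0
    loglik = 0.0
    for t in range(len(y)):
        # One-step-ahead prediction error (innovation) and its variance
        v = y[t] - Z @ a
        F = Z @ P @ Z.T + H
        # Accumulate the Gaussian log-density of the innovation
        loglik += -0.5 * (len(v) * np.log(2 * np.pi)
                          + np.log(np.linalg.det(F))
                          + v @ np.linalg.solve(F, v))
        # Kalman gain, state update, and one-step-ahead prediction
        K = T @ P @ Z.T @ np.linalg.inv(F)
        a = T @ a + K @ v
        P = T @ P @ T.T - K @ Z @ P @ T.T + Q
    return loglik

# Hypothetical local-level (random walk plus noise) example; the large P0
# roughly mimics a diffuse prior on the initial state.
rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(50)) + rng.standard_normal(50)
ll = kalman_loglik(y.reshape(-1, 1),
                   T=np.eye(1), Z=np.eye(1),
                   H=np.array([[1.0]]), Q=np.array([[0.5]]),
                   a0=np.zeros(1), P0=np.array([[1e4]]))
```

Approximating the diffuse prior with a large but finite `P0`, as done here, is exactly the kind of device that the diffuse and minimally conditioned likelihoods discussed in the chapter are designed to replace with an exact treatment.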