ABSTRACT

A knowledge of vector and matrix algebra is essential for statistical inference in linear models. It provides a useful mathematical tool for studying the structure and geometric interpretation of linear models. This chapter presents not only the derivative of a matrix but also some useful results in simple matrix form, together with some advanced results that may aid the understanding of linear model theory and its applications. The concept of eigenvalues is important in matrix theory and has a direct bearing on linear statistical inference. In linear statistical inference, matrix derivatives are needed, for example, to obtain the maximum likelihood estimators of the parameters of interest and the information matrix. Since a matrix can be rearranged into a vector, the technique for differentiating a multivariate function can be applied directly.
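
As a minimal illustration of this vectorization idea (a sketch, not drawn from the chapter itself), consider a scalar function $f$ of a matrix $X = (x_{ij})$. Its derivative may be defined elementwise,
\[
\frac{\partial f}{\partial X} = \left( \frac{\partial f}{\partial x_{ij}} \right),
\]
and if $\operatorname{vec}(X)$ denotes the vector obtained by stacking the columns of $X$, this is just the ordinary multivariate gradient of $f$ with respect to $\operatorname{vec}(X)$, rearranged back into matrix form. For instance, taking $f(X) = \operatorname{tr}(A^{\top} X) = \operatorname{vec}(A)^{\top} \operatorname{vec}(X)$ gives
\[
\frac{\partial f}{\partial X} = A,
\]
since $\partial f / \partial x_{ij} = a_{ij}$ for each entry.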