ABSTRACT

Reliability coefficients for composites of variables are traditionally derived from an assumed decomposition of the component variables into unobserved additive true and error parts, under strong assumptions about the dimensionality of the latent variables and the possible homogeneity of their variances. A reliability coefficient can be defined for this classic linear latent variable model under the minimal assumptions that true and error scores are independent, that error scores are mutually independent, and that their covariance matrices are Gramian. The greatest lower bound to this minimal-assumption reliability coefficient is introduced, and its history is traced. The relation of the greatest lower bound to other reliability coefficients is established. Computational theory and practice are reviewed, and some new results are developed. Sampling characteristics of the coefficient, especially its asymptotic variance and bias, are discussed, and some new results are described. The concept of the greatest statistical lower bound to reliability is introduced. Some examples are provided to illustrate the approach.