ABSTRACT

This chapter introduces alternative ways to assess agreement among raters or multiple sources of information. It describes the intraclass correlation coefficient (ICC) and its different models, and illustrates how the ICC relates to Pearson's correlation coefficient and to an internal consistency measure, Cronbach's alpha. ICCs can be interpreted as the extent of agreement (consistency or absolute agreement) among raters when the rated objects are randomly selected. The generalizability coefficient and Cronbach's alpha are special cases of the two-way random-effects average-score ICC under the consistency definition of agreement. Correlation structures can be compared using methods of structural equation modeling. In studies of human development and developmental psychopathology, the need for multiple behavioral measures obtained from multiple sources has been well documented. For example, a child's disruptive behavior can be reported by several informants, including the child, mother, father, and teachers, all using the same questionnaire with a common metric and variance.