ABSTRACT

Information was first introduced in a mathematically precise formulation by R. A. Fisher as part of his theory of statistical estimation. A second important concept in information theory that is of a statistical nature is the Kullback–Leibler distance, introduced in 1951 by Solomon Kullback and Richard Leibler. Their motivation for this concept arose partly from "the statistical problem of discrimination," as they put it. When item response theory was initially developed, A. Birnbaum introduced the concept of information to quantify the power of ability-level classification and the precision of ability-level estimation. The Fisher information function can readily be used to assess the accuracy and power of an item response theory-based test, as long as maximum likelihood estimation is used for scoring. Owing to its connection with maximum likelihood estimation and the associated asymptotic theory, the Fisher information has been widely used to define item and test information.