## Information metrics and statistical divergences

In Chapter 1 we introduced the expected Fisher information of a parametric family, which is the matrix of functions

$$g_{ij}(\theta) = E_p\!\left(\frac{\partial l}{\partial \theta^i}\,\frac{\partial l}{\partial \theta^j}\right),$$

where $l$ denotes the log-likelihood. The earliest ideas about a relationship between statistics and differential geometry stem from the interpretation of the Fisher information as a Riemannian metric by Rao (1945). A Riemannian metric on a manifold is an inner product given on each tangent space. The Fisher information is not the only Riemannian metric that can be defined on a statistical manifold: Barndorff-Nielsen (1986a) introduced the observed information, which also acts as a Riemannian metric, together with a whole theory of observed geometry that includes a general construction, called yokes, for producing such Riemannian metrics. The theory of yokes can also be used to define the expected Fisher information.
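As a concrete illustration (not part of the original text), the expectation above can be estimated by Monte Carlo for a one-parameter family. The sketch below does this for the Bernoulli($p$) family, whose Fisher information has the closed form $1/(p(1-p))$; the function name, sample size, and seed are illustrative choices, not anything fixed by the text.

```python
import numpy as np

def fisher_information_bernoulli(p, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the expected Fisher information
    E_p[(dl/dp)^2] for the Bernoulli(p) family, where
    l(x; p) = x log p + (1 - x) log(1 - p)."""
    rng = np.random.default_rng(seed)
    x = rng.binomial(1, p, size=n_samples)
    # Score function: derivative of the log-likelihood with respect to p.
    score = x / p - (1 - x) / (1 - p)
    # Fisher information is the expected square of the score.
    return np.mean(score ** 2)

p = 0.3
est = fisher_information_bernoulli(p)
exact = 1.0 / (p * (1 - p))  # analytic Fisher information for Bernoulli(p)
```

For a multi-parameter family the same recipe estimates the full matrix $g_{ij}$ by averaging the outer products of score vectors, which is exactly the quantity Rao (1945) reinterpreted as a Riemannian metric on the parameter space.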