ABSTRACT

In Chapter 1 we introduced the expected Fisher information of a parametric family, which is the matrix of functions $E_p\!\left(\frac{\partial l}{\partial \theta_i}\,\frac{\partial l}{\partial \theta_j}\right)$. The earliest ideas about a relationship between statistics and differential geometry stem from the interpretation of the Fisher information as a Riemannian metric by Rao (1945). A Riemannian metric on a manifold is an inner product given on each tangent space. This is not the only Riemannian metric that can be defined on a statistical manifold. Barndorff-Nielsen (1986a) introduced the observed information, which also acts as a Riemannian metric, together with a whole theory of observed geometry, which includes a general theory for producing such Riemannian metrics from objects called yokes. This theory of yokes can also be used to define the expected Fisher information.
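
As an illustration (not from the text), the sketch below estimates this matrix for the normal family $N(\mu,\sigma^2)$, parametrized by $\theta=(\mu,\sigma)$, by drawing samples from $p(\cdot\,;\theta)$ and averaging the outer product of the score vector $\partial l/\partial\theta$ obtained by automatic differentiation with JAX. The function names and sample size are hypothetical choices for this sketch; the estimate should approximate the known closed form $\mathrm{diag}(1/\sigma^2,\,2/\sigma^2)$.

    # Monte Carlo estimate of the expected Fisher information
    # E_p( dl/dtheta_i * dl/dtheta_j ) for the normal family N(mu, sigma^2).
    import jax
    import jax.numpy as jnp

    def log_density(theta, x):
        """Log-likelihood l(theta; x) of one observation from N(mu, sigma^2)."""
        mu, sigma = theta
        return -0.5 * jnp.log(2 * jnp.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

    # Score vector: gradient of l with respect to theta = (mu, sigma).
    score = jax.grad(log_density, argnums=0)

    def fisher_information(theta, key, n_samples=100_000):
        """Estimate E_p[score score^T] by sampling x ~ N(mu, sigma^2)."""
        mu, sigma = theta
        x = mu + sigma * jax.random.normal(key, (n_samples,))
        scores = jax.vmap(lambda xi: score(theta, xi))(x)   # shape (n_samples, 2)
        return scores.T @ scores / n_samples                # 2 x 2 matrix

    theta = jnp.array([0.0, 2.0])                           # mu = 0, sigma = 2
    key = jax.random.PRNGKey(0)
    print(fisher_information(theta, key))
    # Approximately [[1/sigma^2, 0], [0, 2/sigma^2]] = [[0.25, 0], [0, 0.5]]

Viewed as a Riemannian metric, this matrix supplies the inner product on the tangent space at the point $\theta$ of the statistical manifold.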