ABSTRACT

The central issue in statistics is that we have observations from some probability law which is unknown. Inference is the art of saying something about this unknown probability law. Consider the problem where we observe Y; let f*_Y be the unknown true density of Y. A model is a family of distributions, generally characterized by a family of densities {f^θ_Y}, θ ∈ Θ. The family of densities is indexed by a parameter θ which takes its values in a set Θ. A model is parametric if Θ is included in R^p for some p. The model is well specified if f*_Y ∈ {f^θ_Y}: that is to say, if there exists θ* ∈ Θ such that f*_Y = f^{θ*}_Y (θ* is the true parameter value). In many statistical texts, the distinction is not made clear between the true value θ* and the other possible values of θ which index the distributions in the model. To simplify notation, one often says that one wishes to estimate θ in the model {f^θ_Y}, θ ∈ Θ, when one should say that one wishes to estimate θ*.
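As a minimal sketch of these definitions (an illustration added here, not part of the original text): take the parametric family {N(θ, 1) : θ ∈ R}, so Θ = R and p = 1. If the true law of Y is N(θ*, 1) for some θ*, the model is well specified, and estimating "θ" really means estimating the particular value θ*. The variable names (theta_star, theta_hat) are illustrative choices.

```python
import numpy as np

# True law: Y ~ N(theta_star, 1). The model family is {N(theta, 1): theta in R},
# which contains the true density, so the model is well specified.
rng = np.random.default_rng(0)
theta_star = 2.0  # the true parameter value (unknown to the statistician)
y = rng.normal(loc=theta_star, scale=1.0, size=10_000)  # observations of Y

# For the N(theta, 1) family, the maximum-likelihood estimate of theta
# is the sample mean of the observations.
theta_hat = y.mean()

print(theta_hat)  # an estimate of theta_star, close to but not equal to 2.0
```

The point of the example is the distinction drawn above: theta ranges over all of Θ and merely indexes densities in the family, while theta_star is the single value that generated the data and is the actual target of estimation.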