Chapter 2
Simple Models: Definitions of Error and Parameter Estimates

Although we will sometimes want to consider a specified or hypothesized value of β0 in order to ask an explicit question about the data, it is much more common to consider the equation:

Yi = β0 + εi

where β0 is a true parameter that is estimated from the data. Continuing with the medical example, suppose that the body temperatures were all from people who had taken a certain drug. We might suspect that, except for error, they all have the same body temperature, but it is not the usual body temperature of 37°C. We use β0 to represent whatever the body temperature might be for those who have taken the drug. It is important to realize that β0 is unknowable; we can only estimate it from the data. In terms of these true values, εi is the amount by which Yi differs from β0 if we were ever to know β0 exactly. We use b0 to indicate the estimate of β0 that we derive from the data. Then the predicted value for the ith observation is:

Ŷi = b0


The model written in terms of the estimates is then:

Yi = b0 + ei

where ei is the amount by which our prediction misses the actual observation. Thus, ei is the estimate of εi. The goal of tailoring the model to provide the best fit to the data is equivalent to making the errors:

ei = Yi − b0

as small as possible. We have only one parameter, so this means that we want to find the estimate b0 of that one parameter β0 that minimizes the errors. However, we are really interested not in each individual ei but in some aggregation of all the ei values, and there are many different ways to perform this aggregation. In this chapter, we consider some of the different ways of aggregating the separate ei into a summary measure of the error. Then we show how each choice of a summary measure of the error leads to a different method of calculating b0 to estimate β0 so as to provide the best fit of the model to the data. Finally, we consider expressions that describe the “typical” error.
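The point that different aggregations of the ei lead to different estimates b0 can be sketched numerically. The snippet below uses made-up body-temperature data (not from the text) and compares two common aggregates, the sum of absolute errors and the sum of squared errors, by grid search over candidate values of b0:

```python
# Sketch with hypothetical data: residuals e_i = Y_i - b0 for the
# single-parameter model, aggregated two different ways.
temps = [37.2, 37.8, 37.5, 38.1, 37.4, 37.6, 39.0]  # made-up °C values

def residuals(y, b0):
    """e_i = Y_i - b0 for each observation."""
    return [yi - b0 for yi in y]

def sae(y, b0):
    """Sum of absolute errors, one possible summary measure."""
    return sum(abs(e) for e in residuals(y, b0))

def sse(y, b0):
    """Sum of squared errors, another possible summary measure."""
    return sum(e * e for e in residuals(y, b0))

# Search a grid of candidate estimates from 35.00 to 40.00 in steps of 0.01
# for the b0 that makes each aggregate error as small as possible.
candidates = [35 + 0.01 * k for k in range(501)]
b0_sae = min(candidates, key=lambda b: sae(temps, b))
b0_sse = min(candidates, key=lambda b: sse(temps, b))

print(b0_sae)  # minimizer of the sum of absolute errors
print(b0_sse)  # minimizer of the sum of squared errors
```

For this sample the two criteria pick different estimates: the sum of absolute errors is smallest near the sample median (37.6) and the sum of squared errors near the sample mean (37.8), illustrating that the choice of summary measure of error determines the method of estimating β0.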