ABSTRACT

In many applications the likelihood function involves several parameters, only a few of which are of interest to the investigator. Two difficulties arise in dealing with likelihood functions that depend on a large number of incidental parameters in addition to the effects of interest. First, from a purely mathematical point of view, there is no guarantee of consistency or optimality in the limit as the number of parameters increases in proportion to the data accumulated. The second difficulty is the purely numerical one of maximizing a function of many variables and of inverting the matrix of second derivatives, but this is a subsidiary consideration in view of the first. One way of eliminating unwanted nuisance parameters is to work with the marginal likelihood for a suitably chosen subset of the complete data vector. The differences between the conditional log likelihood and the unconditional log likelihood are also highlighted.
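As a concrete illustration of the first difficulty and of the marginal-likelihood remedy, the sketch below uses the classical Neyman-Scott paired-observations problem; this specific model is an assumed example for illustration, not one taken from the abstract. Each pair shares its own incidental mean, so the number of parameters grows with the sample size; the full maximum likelihood estimator of the common variance converges to half the true value, while a marginal likelihood based on the within-pair differences (a subset of the data whose distribution is free of the incidental means) remains consistent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed Neyman-Scott setup: pairs x_{i1}, x_{i2} ~ N(mu_i, sigma^2),
# with one incidental mean mu_i per pair (n nuisance parameters).
n, sigma = 5000, 2.0
mu = rng.normal(0.0, 5.0, size=n)                  # incidental parameters
x = mu[:, None] + rng.normal(0.0, sigma, size=(n, 2))

# Unconditional (full) MLE of sigma^2: profile out each mu_i via the
# pair mean.  As n -> infinity this converges to sigma^2 / 2, not sigma^2.
sigma2_mle = np.mean((x - x.mean(axis=1, keepdims=True)) ** 2)

# Marginal likelihood: the differences d_i = x_{i1} - x_{i2} ~ N(0, 2 sigma^2)
# form a subset of the complete data vector free of the mu_i; maximizing
# their likelihood gives a consistent estimator of sigma^2.
d = x[:, 0] - x[:, 1]
sigma2_marginal = np.mean(d ** 2) / 2.0

print(f"true sigma^2      : {sigma**2:.3f}")
print(f"full MLE          : {sigma2_mle:.3f}  (biased toward sigma^2 / 2)")
print(f"marginal estimate : {sigma2_marginal:.3f}")
```

Running the sketch shows the full MLE stabilizing near 2.0 rather than the true value 4.0, while the marginal estimate based on the differences recovers 4.0; the numerical burden of maximizing over the n incidental means is also avoided entirely.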