ABSTRACT

Before the 1970s, regression methods such as Poisson regression, logistic regression, and probit regression were developed separately: each model required its own estimation algorithm, obtained by maximizing the likelihood specific to that model. A general solution to inference for fixed unknowns was proposed by Fisher, who developed likelihood theory, expressing the probability of observing the data as a function of the parameter value. In the notation used here, the parameters and the data are separated by a semicolon. The generalized linear model (GLM) extends the linear model to the exponential family of distributions, and deviance residuals provide the best normalizing transformation within the GLM class of models. Birnbaum proved that the classical likelihood function contains all the information in the observed data about the fixed parameter, provided that the assumed stochastic model is correct.
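As a sketch of the semicolon notation and of the exponential-family form underlying the GLM (the symbols $\theta$, $\phi$, $b$, and $c$ below follow a common convention and are not fixed by the text above): for observed data $y$ and a fixed parameter $\theta$, the likelihood is the model density read as a function of the parameter,

$$
L(\theta; y) = f(y; \theta),
$$

and the GLM assumes that $f$ belongs to an exponential family, typically written as

$$
f(y; \theta, \phi) = \exp\!\left\{ \frac{y\theta - b(\theta)}{\phi} + c(y, \phi) \right\},
$$

where $\theta$ is the canonical parameter, $\phi$ a dispersion parameter, and $b(\cdot)$, $c(\cdot,\cdot)$ known functions; particular choices recover the Poisson, binomial (logistic), and normal models mentioned above.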