Linear models and the analysis of variance are popular because many phenomena are well described by linear relationships with Gaussian errors. When the data do not satisfy these assumptions, however, forcing them into the classical framework can distort the analysis. This chapter presents a class of models, generalized linear models, that is as tractable as classical linear models but does not force the data onto unnatural scales: a reparametrization through a link function induces linearity, and a nonconstant variance is incorporated directly into the analysis. The chapter describes the fitting functions in detail and examines more advanced functions for model selection, diagnostics, and the creation of private families. It focuses on the statistical concepts associated with maximum-likelihood inference as well as on algorithmic details. Besides simplifying the algorithms for fitting these models, this linearization allows many of the tools intended for linear models and designed experiments to be reused. The chapter explores the estimation of generalized linear models by maximum likelihood and the associated iteratively reweighted least-squares (IRLS) algorithm.
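To make the connection between maximum likelihood and iteratively reweighted least squares concrete, the sketch below fits a logistic regression (binomial family, logit link) by IRLS. It is a minimal illustration, not the chapter's implementation: the function name `irls_logistic`, the toy data, and the convergence tolerance are all assumptions made for the example. At each iteration a working response and working weights are formed from the current linear predictor, and a weighted least-squares problem is solved; at convergence the fixed point is the maximum-likelihood estimate.

```python
import numpy as np

def irls_logistic(X, y, n_iter=25, tol=1e-10):
    """Fit a logistic regression by iteratively reweighted least squares.

    X : (n, p) design matrix (include a column of ones for an intercept).
    y : (n,) vector of 0/1 responses.
    Returns the estimated coefficient vector beta.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta                    # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))   # inverse logit link: fitted probabilities
        w = mu * (1.0 - mu)               # working weights = binomial variance function
        z = eta + (y - mu) / w            # working (adjusted) response
        # Weighted least-squares step: solve (X'WX) beta = X'Wz
        beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Small synthetic example (hypothetical data, chosen so the MLE exists)
X = np.column_stack([np.ones(8), np.array([-3.0, -2, -1, -0.5, 0.5, 1, 2, 3])])
y = np.array([0.0, 0, 0, 1, 0, 1, 1, 1])
beta = irls_logistic(X, y)
# At the MLE the score X'(y - mu) vanishes, which serves as a convergence check
mu = 1.0 / (1.0 + np.exp(-(X @ beta)))
print(np.max(np.abs(X.T @ (y - mu))))
```

Because the logit is the canonical link for the binomial family, this IRLS iteration coincides with Newton's method on the log-likelihood, which is why a few iterations typically suffice.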