ABSTRACT

The easiest way to understand regularized regression is to see how and why it is applied to ordinary least squares (OLS). Rather than allowing one coefficient to become wildly positive and a correlated one wildly negative, the ridge penalty pushes the coefficients of correlated features toward each other; this makes it somewhat more effective than the lasso penalty at handling groups of correlated features systematically. Variable importance for regularized models can be interpreted much as in linear regression, and, as with linear and logistic regression, the relationship between each feature and the response is assumed to be monotonic and linear. Regularized regression offers substantial benefits over traditional GLMs when applied to large data sets with many features.
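
For reference, the ridge estimator can be written as the OLS loss plus an L2 penalty on the coefficients (the standard textbook formulation, not a result specific to this chapter):

\hat{\beta}^{\text{ridge}} = \arg\min_{\beta} \; \sum_{i=1}^{n} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij} \beta_j \Big)^2 \; + \; \lambda \sum_{j=1}^{p} \beta_j^2

where \lambda \ge 0 controls the amount of shrinkage: \lambda = 0 recovers OLS, while larger values pull the coefficients toward zero and, for correlated features, toward each other.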
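
A minimal sketch of the correlated-features behavior, assuming Python with NumPy and scikit-learn (the simulated data, seed, and alpha value are illustrative assumptions, not taken from the chapter):

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # nearly a copy of x1, so highly correlated
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.5, size=n)  # true effect is split evenly across the pair

# Unpenalized OLS is free to assign large offsetting coefficients to the pair
ols = LinearRegression().fit(X, y)
print("OLS coefficients:  ", ols.coef_)

# The ridge (L2) penalty pulls the two coefficients toward each other, near 1 and 1
ridge = Ridge(alpha=1.0).fit(X, y)
print("Ridge coefficients:", ridge.coef_)

With near-duplicate features, the OLS coefficients can swing in opposite directions while their sum stays sensible; the L2 penalty makes that offsetting solution expensive, so ridge settles on roughly equal coefficients.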