ABSTRACT

This chapter provides a selective overview of nonconvex penalized quantile regression in high dimensions. Quantile regression is a widely used alternative to classical least-squares regression. Its most prominent feature is the ability to accommodate heterogeneity, which can arise from heteroskedastic variances or from other sources beyond the commonly used location–scale models: quantile regression allows the covariates to influence the location, dispersion, and other aspects of the conditional distribution. Computationally, quantile regression can be formulated as a convex optimization problem whose objective is a sum of asymmetrically weighted absolute values of residuals. Quantile regression enjoys several other appealing properties; in particular, it is naturally robust to outliers in the response space. The chapter reviews nonconvex penalized linear quantile regression in ultra-high dimension and discusses why semiparametric quantile regression is important for high-dimensional data analysis. It summarizes the large-sample properties of the oracle estimator and of the penalized quantile regression estimator.
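As a minimal illustration of the asymmetrically weighted absolute-value objective (a sketch for intuition, not code from the chapter): the check, or pinball, loss is ρ_τ(u) = u(τ − 1{u < 0}), and minimizing its empirical average over a constant recovers the τ-th sample quantile. The grid search below is an assumed toy approach chosen for clarity; in practice the problem is solved by linear programming.

```python
import numpy as np

def check_loss(u, tau):
    """Quantile (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0).astype(float))

rng = np.random.default_rng(0)
y = rng.normal(size=1000)
tau = 0.75

# Minimize the average check loss over a grid of candidate constants;
# the minimizer approximates the tau-th sample quantile.
grid = np.linspace(-3, 3, 601)
losses = [check_loss(y - q, tau).mean() for q in grid]
q_hat = grid[np.argmin(losses)]

print(q_hat, np.quantile(y, tau))  # the two values should be close
```

Replacing the single constant with a linear predictor x'β gives the linear quantile regression objective reviewed in the chapter; the loss remains convex and piecewise linear in β.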