ABSTRACT
Low-rank structures play an important role in signal processing
and machine learning [1-6], with various applications ranging
from digital filter design and medical imaging to dimensionality
reduction and sensor network localization. In these applications,
high-dimensional data can be approximately modeled as lying in
a low-dimensional subspace or manifold. Under this assumption, a
variety of data processing tasks (e.g., noisy data filtering, missing
data interpolation, principal component learning, etc.) can be
successfully accomplished. Furthermore, low-rank properties also
yield significant reductions in computation and storage, which is
extremely important in big data scenarios and has led to a plethora of
recent progress in low-rank modeling techniques and computationally
efficient numerical algorithms.
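The noisy-data-filtering task mentioned above can be sketched with a truncated SVD, which by the Eckart-Young theorem gives the best rank-r approximation in the Frobenius norm; the matrix dimensions, rank, and noise level below are illustrative assumptions, not values from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a rank-r matrix observed under additive Gaussian
# noise (dimensions, rank, and noise level are illustrative).
m, n, r = 100, 80, 3
X_clean = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
X_noisy = X_clean + 0.1 * rng.standard_normal((m, n))

# Truncated SVD: keep only the top-r singular triplets. This is the
# best rank-r approximation of X_noisy in the Frobenius norm.
U, s, Vt = np.linalg.svd(X_noisy, full_matrices=False)
X_denoised = U[:, :r] * s[:r] @ Vt[:r, :]

# Denoising succeeds when the low-rank assumption holds: the truncated
# reconstruction is closer to the clean data than the noisy observation.
err_noisy = np.linalg.norm(X_noisy - X_clean) / np.linalg.norm(X_clean)
err_denoised = np.linalg.norm(X_denoised - X_clean) / np.linalg.norm(X_clean)
print(f"relative error before: {err_noisy:.4f}, after: {err_denoised:.4f}")
```

The same truncation also illustrates the storage savings noted above: the rank-r factors require (m + n + 1) * r numbers instead of m * n.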