ABSTRACT

Learning and exploiting low-rank structure from corrupted observations is widely used in machine learning, data mining, and computer vision. This chapter gives a brief overview of two kinds of low-rank recovery methods: convex nuclear-norm-based methods and nonconvex matrix factorization methods. In general, convex nuclear-norm-based methods are guaranteed to reach the global minimum but incur cubic computational complexity, while nonconvex matrix factorization methods are more computationally efficient (quadratic complexity) but suffer from weak convergence guarantees. Motivated by these two kinds of low-rank recovery methods, this chapter introduces a computationally efficient low-rank recovery method called Robust Orthonormal Subspace Learning (ROSL). Unlike convex methods that use the nuclear norm, ROSL utilizes a novel rank measure on the low-rank matrix that imposes group sparsity on its coefficients under an orthonormal subspace. This new rank measure is proven to be lower bounded by the nuclear norm (i.e., it shares the same global minimum), and ROSL is experimentally shown to reach this global minimum with high probability. In addition, this chapter describes a faster version (ROSL+), empowered by random sampling, which further reduces the complexity of ROSL from quadratic to linear.
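For concreteness, the kind of formulation this rank measure leads to can be sketched as follows; the symbols (observation $X$, orthonormal basis $D$, coefficient matrix $\alpha$, sparse error $E$, weight $\lambda$, subspace dimension $k$) follow common usage for this problem and are illustrative rather than necessarily the chapter's exact notation:

% Illustrative sketch of a ROSL-style objective (assumed notation):
% recover a low-rank matrix D*alpha under an orthonormal subspace D,
% separating out a sparse error matrix E.
\[
\min_{D,\,\alpha,\,E} \ \sum_{i=1}^{k} \lVert \alpha_i \rVert_2 \;+\; \lambda \lVert E \rVert_1
\quad \text{s.t.} \quad X = D\alpha + E, \quad D^{\top} D = I_k ,
\]

where $\alpha_i$ denotes the $i$-th row of $\alpha$. Driving entire rows of $\alpha$ to zero shrinks the effective rank of $D\alpha$, so the group-sparsity term plays the role that the nuclear norm plays in convex formulations, while avoiding the singular value decompositions that make nuclear-norm methods cubic in cost.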