ABSTRACT

The presence of noise in data is unavoidable owing to sensor imperfection, estimation inaccuracy, and statistical or communication errors. For instance, signals may be contaminated by some form of random noise, and signal measurements are subject to quantization error. Thus considerable effort has been devoted in sparse data recovery to ensuring that a recovery method is stable, in the sense that the recovery error stays under control when the measurements are slightly inaccurate and when the data is not exactly sparse [37, 95, 97, 108]. The stability of many recovery algorithms, including ℓ1-minimization, has been extensively studied under various assumptions such as the RIP, the NSP, and mutual coherence. In this chapter, we introduce a unified approach to establishing a stability theory for linear ℓ1-minimization problems, including the linear Dantzig Selector (DS), under a mild assumption called the weak range space property of a transposed sensing (or design) matrix. This matrix property is shown to be a necessary and sufficient condition for the standard ℓ1-minimization method to be stable in the sense of Definition 5.1.1 in this chapter. The classic Hoffman's Lemma concerning the error bound of linear systems is employed as a major tool in this chapter's study of the stability of linear ℓ1-minimization. Combined with the sparsity assumption, this tool provides recovery error bounds measured by the so-called Robinson's constant for linear sparse optimization methods. A byproduct of this study is a unified stability result for linear ℓ1-minimization. This result holds under various matrix properties that are widely utilized in traditional compressed sensing theory, including the RSP of order k of A^T, which was introduced to characterize the uniform recovery of k-sparse vectors via ℓ1-minimization (see Chapter 3).
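
For context, the problems referred to above are commonly formulated as follows in the compressed sensing literature; the notation here (sensing matrix A, noisy measurements y, noise level ε) is assumed, and the precise formulations used in the chapter may differ. Both constraints are polyhedral, which is why these are called linear ℓ1-minimization problems:

```latex
% Linear l1-minimization (a common formulation; notation assumed):
%   recover a sparse x from noisy measurements y \approx Ax.
\min_{x} \|x\|_1 \quad \text{subject to} \quad \|y - Ax\|_\infty \le \varepsilon

% Linear Dantzig Selector (standard form from the literature):
\min_{x} \|x\|_1 \quad \text{subject to} \quad \|A^T (y - Ax)\|_\infty \le \varepsilon
```

Because the ℓ∞-norm constraints reduce to finitely many linear inequalities, each problem can be recast as a linear program, to which tools for linear systems such as Hoffman's Lemma apply.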