Nullspace property


In compressed sensing, the nullspace property gives necessary and sufficient conditions on the reconstruction of sparse signals using the techniques of $\ell_1$-relaxation. The term "nullspace property" originates from Cohen, Dahmen, and DeVore.[1] The nullspace property is often difficult to check in practice, and the restricted isometry property is a more modern condition in the field of compressed sensing.

The technique of $\ell_1$-relaxation

The non-convex $\ell_0$-minimization problem,

$$\min_x \|x\|_0 \quad \text{subject to} \quad Ax = b,$$

is a standard problem in compressed sensing. However, $\ell_0$-minimization is known to be NP-hard in general.[2] As such, the technique of $\ell_1$-relaxation is sometimes employed to circumvent the difficulties of signal reconstruction using the $\ell_0$-norm. In $\ell_1$-relaxation, the $\ell_1$ problem,

$$\min_x \|x\|_1 \quad \text{subject to} \quad Ax = b,$$

is solved in place of the $\ell_0$ problem. Note that this relaxation is convex and hence amenable to the standard techniques of linear programming, a computationally desirable feature. Naturally we wish to know when $\ell_1$-relaxation will give the same answer as the $\ell_0$ problem. The nullspace property is one way to guarantee agreement.
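Because the $\ell_1$ problem above is a linear program, it can be solved with off-the-shelf LP solvers. A minimal sketch (not part of the original article): the standard reformulation splits $x = u - v$ with $u, v \ge 0$, so that $\|x\|_1 = \mathbf{1}^\top(u + v)$. The function name `l1_minimize` and the choice of SciPy's `linprog` are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def l1_minimize(A, b):
    """Solve min ||x||_1 subject to Ax = b as a linear program.

    Split x = u - v with u, v >= 0; then ||x||_1 = 1^T (u + v) and the
    equality constraint becomes A u - A v = b.
    """
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: sum of entries of u and v
    A_eq = np.hstack([A, -A])          # encodes A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    if not res.success:
        raise ValueError("LP failed: " + res.message)
    u, v = res.x[:n], res.x[n:]
    return u - v

# Illustrative use: try to recover a 1-sparse vector from 3 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))
x_true = np.zeros(6)
x_true[2] = 1.5
b = A @ x_true
x_hat = l1_minimize(A, b)
```

Since $x_{\mathrm{true}}$ is itself feasible, the returned minimizer has $\ell_1$ norm at most $\|x_{\mathrm{true}}\|_1$; whether it actually coincides with $x_{\mathrm{true}}$ is exactly what the nullspace property below governs.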

Definition

An $m \times n$ complex matrix $A$ has the nullspace property of order $s$ if, for all index sets $S$ with $s = |S| \le n$, we have $\|\eta_S\|_1 < \|\eta_{S^C}\|_1$ for all $\eta \in \ker A \setminus \{0\}$, where $\eta_S$ denotes the vector $\eta$ restricted to the index set $S$ (its entries outside $S$ set to zero) and $S^C$ denotes the complement of $S$.
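Checking this definition directly is difficult, since $\ker A$ is infinite and there are $\binom{n}{s}$ index sets; one simplification is that for a fixed $\eta$ the worst index set $S$ consists of the $s$ largest-magnitude entries of $\eta$. The sketch below (a real-valued illustration; the helper names `violates_nsp` and `sample_nsp` are invented here) uses random sampling of the kernel, which can disprove the property but never certify it.

```python
import numpy as np
from scipy.linalg import null_space

def violates_nsp(eta, s):
    """True if eta violates the order-s nullspace condition.

    It suffices to test the worst index set S: the s largest-magnitude
    entries of eta.
    """
    mags = np.sort(np.abs(eta))[::-1]
    return mags[:s].sum() >= mags[s:].sum()

def sample_nsp(A, s, n_samples=1000, seed=0):
    """Probe the nullspace property of order s by sampling ker(A).

    Returns False if a violating kernel vector is found (property disproved);
    True if every sample passes, which is evidence only, not a proof.
    """
    N = null_space(A)                  # orthonormal basis of ker(A)
    if N.shape[1] == 0:
        return True                    # trivial kernel: condition holds vacuously
    rng = np.random.default_rng(seed)
    for _ in range(n_samples):
        eta = N @ rng.standard_normal(N.shape[1])
        if violates_nsp(eta, s):
            return False
    return True

# An obvious failure: e_1 lies in ker(A_bad), and for S = {1} we get
# ||eta_S||_1 > 0 while ||eta_{S^C}||_1 = 0, violating the strict inequality.
A_bad = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
nsp_bad = sample_nsp(A_bad, s=1)
nsp_trivial = sample_nsp(np.eye(3), s=1)   # invertible matrix: kernel is {0}
```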

Recovery Condition

The following theorem gives a necessary and sufficient condition for the recoverability of a given $s$-sparse vector in $\mathbb{C}^n$. The proof of the theorem is a standard one, and the proof supplied here is summarized from Holger Rauhut.[3]

Theorem: Let $A$ be an $m \times n$ complex matrix. Then every $s$-sparse signal $x \in \mathbb{C}^n$ is the unique solution to the $\ell_1$-relaxation problem with $b = Ax$ if and only if $A$ satisfies the nullspace property of order $s$.

Proof: For the forwards direction, let $\eta \in \ker A \setminus \{0\}$ and let $S$ be any index set with $|S| = s$. Notice that $\eta_S$ and $-\eta_{S^C}$ are distinct vectors with $A(-\eta_{S^C}) = A(\eta_S)$ by the linearity of $A$; since $\eta_S$ is $s$-sparse, uniqueness of the $\ell_1$ minimizer forces $\|\eta_S\|_1 < \|-\eta_{S^C}\|_1 = \|\eta_{S^C}\|_1$, as desired. For the backwards direction, let $x$ be $s$-sparse and let $z$ be another (not necessarily $s$-sparse) vector such that $z \ne x$ and $Az = Ax$. Define the (non-zero) vector $\eta = x - z$ and notice that it lies in the nullspace of $A$. Call $S$ the support of $x$; then the result follows from an elementary application of the triangle inequality:

$$\|x\|_1 \le \|x - z_S\|_1 + \|z_S\|_1 = \|\eta_S\|_1 + \|z_S\|_1 < \|\eta_{S^C}\|_1 + \|z_S\|_1 = \|z_{S^C}\|_1 + \|z_S\|_1 = \|z\|_1,$$

establishing the minimality of $x$. $\square$

References

  1. Cohen, Albert; Dahmen, Wolfgang; DeVore, Ronald (2009-01-01). "Compressed sensing and best k-term approximation". Journal of the American Mathematical Society 22 (1): 211–231. doi:10.1090/S0894-0347-08-00610-3. ISSN 0894-0347.
  2. Natarajan, B. K. (1995-04-01). "Sparse Approximate Solutions to Linear Systems". SIAM J. Comput. 24 (2): 227–234. doi:10.1137/S0097539792240406. ISSN 0097-5397. 
  3. Rauhut, Holger (2011). Compressive Sensing and Structured Random Matrices.