Tightening information-theoretic constraints on noisy compressive sensing
Abstract
Recently, information-theoretic limits of compressive sensing (CS) have been studied by several authors. Sarvotham et al. derived a bound on the number of measurements m using the source-channel separation theorem. Similar bounds were derived by Wainwright and by Aeron et al. using Fano's inequality and the source-channel separation theorem. In these papers, noisy CS is modeled as a communication system, and an upper bound on the system capacity is used to derive the bounds. However, these derivations do not take the memory of the signal source into account. Furthermore, the resulting bounds exhibit only a weak dependence on m when m is large.
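As a schematic of the capacity-based argument in these works (the notation here is illustrative and not taken from the cited papers: n denotes the signal dimension, k the sparsity, C the per-measurement capacity of the equivalent channel, and SNR the measurement signal-to-noise ratio), a Fano-type necessary condition reads
\[
  m \;\ge\; \frac{\log_2 \binom{n}{k}}{C} \;\approx\; \frac{k \log_2 (n/k)}{C},
  \qquad
  C \;\le\; \tfrac{1}{2} \log_2\!\left(1 + \mathrm{SNR}\right),
\]
so once m is large enough that the per-measurement SNR saturates, the right-hand side changes little with m, which is the weak dependence noted above.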
In this paper, we derive tighter bounds on m by studying the information rate (IR) of noisy CS. We derive a lower bound on m based on an upper bound on the IR for wide-sense stationary sparse signals and Gaussian sampling matrices, taking the memory of the sparse signal into account. We also consider the expected value of the IR over the ensemble of Gaussian sampling matrices and derive a sufficient condition on m for signal recovery.
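For concreteness, a minimal sketch of the measurement model and the shape of such bounds (the symbols Phi, w, sigma^2, \bar{I}, and R_x are illustrative assumptions, not the notation used in the paper):
\[
  y = \Phi x + w, \qquad \Phi \in \mathbb{R}^{m \times n} \ \text{with i.i.d. Gaussian entries}, \qquad w \sim \mathcal{N}(0, \sigma^2 I_m),
\]
where x is a realization of the wide-sense stationary sparse process. If \bar{I} upper-bounds the per-measurement IR and R_x denotes the information content of the source (including its memory), a lower bound of the form m \ge R_x / \bar{I} follows; the sufficient condition is obtained by an analogous comparison using the expected IR over the Gaussian matrix ensemble.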