
A key result in compressive sensing states that, with high probability on the random draw of an $m \times N$ Gaussian or Bernoulli matrix $A$, all $s$-sparse vectors $x$ can be reconstructed from $y = Ax$ using a variety of algorithms provided

$m \ge C s \ln(N/s)$,

where $C > 0$ is a universal constant (independent of $s$, $m$, and $N$). This bound is in fact optimal.

In practical terms, how does one interpret or find $C$?

source: http://www.cis.pku.edu.cn/faculty/vision/zlin/A%20Mathematical%20Introduction%20to%20Compressive%20Sensing.pdf


2 Answers


The constant $C$ is usually left unspecified, as it can be very hard to calculate. It is independent of the signal dimension, and results in compressed sensing are generally only concerned with big-O statements, so it falls out when we write something like $m = \mathcal{O}(s \ln{(N/s)})$. The important part of this statement is the strong dependence on the sparsity level $s$, and the fact that the signal dimension $N$ affects the number of measurements only logarithmically.
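A quick numeric illustration of that scaling (my own, taking $C = 1$ purely for scale; the specific values of $s$ and $N$ are arbitrary):

```python
# Minimal sketch: m ~ s * ln(N / s) grows slowly in N but linearly in s.
# C is set to 1 here purely for scale; it is not a claimed value.
from math import log

s = 10
for N in (10**3, 10**4, 10**5, 10**6):
    print(f"N={N:>7d}   s*ln(N/s) = {s * log(N / s):6.1f}")
# A 1000x increase in N raises the bound only by about 2.5x (46 -> 115),
# while doubling s (at fixed N) roughly doubles it, modulo the shrinking
# ln(N/s) factor.
```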

To give an example of where such a constant comes from: let $V_d(r)$ be the volume of the $\ell_2$ ball of radius $r$ in $\mathbb{R}^d$. Two simple cases are $V_2(r) = \pi r^2$ and $V_3(r) = \frac{4}{3} \pi r^3$. Both results are of the form $V_d(r) = C_d\, r^d$ for a constant $C_d$ that depends only on the dimension, and this pattern holds for every finite $d$. In CS arguments it is the $r^d$ factor that we are interested in, as it captures the dependence on the signal dimension; the prefactor gets absorbed into the unspecified constant.
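To make the prefactor concrete, here is a minimal Python sketch (my own illustration, not from any reference) evaluating the closed-form volume $V_d(r) = \frac{\pi^{d/2}}{\Gamma(d/2 + 1)}\, r^d$ and checking it against the two cases above:

```python
# Volume of the l2 ball of radius r in R^d: V_d(r) = pi^(d/2) / Gamma(d/2 + 1) * r^d.
from math import pi, gamma

def ball_volume(d: int, r: float = 1.0) -> float:
    """Volume of the l2 ball of radius r in R^d."""
    return pi ** (d / 2) / gamma(d / 2 + 1) * r ** d

# Sanity checks against the cases quoted above (r = 1):
assert abs(ball_volume(2) - pi) < 1e-12               # pi * r^2
assert abs(ball_volume(3) - 4.0 / 3.0 * pi) < 1e-12   # (4/3) * pi * r^3

# The prefactor C_d changes with d, while r^d carries the dependence
# that CS arguments actually track:
for d in (2, 3, 10, 50):
    print(d, ball_volume(d))
```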

So $C$ will vary depending on the algorithm, since different inequalities may be used in the proofs. I don't know of any publications that specifically calculate $C$.
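On the practical side of the original question, one can at least estimate $C$ empirically for a given algorithm. Below is a hedged sketch (my own; the use of cvxpy for basis pursuit, the trial count, and the tolerance are all my choices, not from any publication) that sweeps $m$ for fixed $N$ and $s$ and reports the implied $C = m / (s \ln(N/s))$ at each $m$; the value where the success rate jumps to one is the empirical constant for this algorithm:

```python
# Empirically locate the smallest m at which l1-minimization (basis pursuit)
# recovers s-sparse x from y = A x with a Gaussian A, then read off C.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

def success_rate(m, N, s, trials=20, tol=1e-4):
    """Fraction of random trials in which basis pursuit recovers x exactly."""
    hits = 0
    for _ in range(trials):
        A = rng.standard_normal((m, N)) / np.sqrt(m)
        x = np.zeros(N)
        x[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)
        y = A @ x
        z = cp.Variable(N)
        cp.Problem(cp.Minimize(cp.norm1(z)), [A @ z == y]).solve()
        if np.linalg.norm(z.value - x) < tol * np.linalg.norm(x):
            hits += 1
    return hits / trials

N, s = 200, 5
for m in range(10, 101, 10):
    rate = success_rate(m, N, s)
    print(f"m={m:3d}  success={rate:.2f}  implied C={m / (s * np.log(N / s)):.2f}")
```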

  • Thanks, this answer is really clear to me. This means that, given a missing-data problem (e.g. matrix completion), the best I can do if asked "do you have enough samples?" is to give a ballpark yes or no, and then actually test it empirically (assuming suitable incoherence). Commented Feb 14, 2014 at 4:52

$C$ depends, among other things, on the reconstruction algorithm.

  • Can you explain your answer in more detail? Commented Feb 10, 2014 at 22:55
  • @Igor Perhaps I could have a brief example? Do you mean, for instance, that using a hard/soft-thresholding algorithm versus orthogonal matching pursuit would result in different $C$ values? What other factors might influence $C$? Can you point me to any references that specifically address what affects $C$? Thanks! Commented Feb 11, 2014 at 3:49
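To illustrate the point in this answer under stated assumptions (my own sketch; OMP via scikit-learn is just one convenient comparison algorithm, and none of the specifics come from the answer): fixing $(m, N, s)$ and swapping the reconstruction algorithm changes the success rate, and hence the threshold $m$ and the implied $C$ at which recovery becomes reliable.

```python
# Success rate of orthogonal matching pursuit (OMP) at a fixed (m, N, s).
# Comparing this against the basis-pursuit sweep in the other answer's sketch
# (and against hard/soft thresholding) traces out algorithm-specific thresholds.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
m, N, s, trials = 40, 200, 5, 50

hits = 0
for _ in range(trials):
    A = rng.standard_normal((m, N)) / np.sqrt(m)
    x = np.zeros(N)
    x[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)
    y = A @ x
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=s, fit_intercept=False).fit(A, y)
    if np.linalg.norm(omp.coef_ - x) < 1e-4 * np.linalg.norm(x):
        hits += 1

print(f"OMP success rate at m={m}: {hits / trials:.2f}")
```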
