
Let $X$ be a random variable such that $|X| \leq A$ almost surely, for some $A > 0$. Let $Z$ be independent of $X$ such that $Z \sim {\cal N}(0, N)$. My question is:

How large can the entropy $h(X+Z)$ be?

A simple upper bound obtained by considering the maximum possible variance of $X+Z$ is $\frac{1}{2} \log (2\pi e (A^2 + N))$. Is there a better bound in terms of $A$ and $N$?
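The bound holds because $\operatorname{Var}(X+Z) = \operatorname{Var}(X) + N \le A^2 + N$ and the Gaussian maximizes differential entropy for a given variance. As a minimal numerical sketch (the function name is my own):

```python
import math

def gaussian_entropy_bound(A, N):
    """Variance-based upper bound on h(X+Z), in nats:
    |X| <= A implies Var(X) <= A^2, so Var(X+Z) <= A^2 + N,
    and a Gaussian maximizes entropy for a fixed variance."""
    return 0.5 * math.log(2 * math.pi * math.e * (A ** 2 + N))

# The bound always exceeds the noise entropy h(Z) = 0.5*log(2*pi*e*N).
print(gaussian_entropy_bound(1.0, 1.0))
```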


1 Answer


Just some hints.

We can view this as an additive Gaussian channel with noise $Z$, bounded input $X$, and output $W = X + Z$. Furthermore, $h(W) = h(Z) + h(X) - h(X|W) = h(Z) + I(X;W)$.

For the first equality, see here. Because $h(Z)$ is fixed, maximizing $h(W)$ is equivalent to finding the pdf of $X$ that maximizes the mutual information $I(X;W)$, i.e., to finding the capacity of this channel. This is not trivial; see, e.g., the introduction of this paper.

To get some bound, I'd try the distribution that achieves capacity in some cases: two Dirac deltas at $X = \pm A$ with equal weight. In that case, the pdf of $W$ is a mixture of two Gaussians, and its entropy is studied here (no simple closed-form result, but you might extract a useful bound).
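To make this concrete, here is a rough numerical sketch (my own, not from the linked paper): the pdf of $W$ under the two-delta input is an equal-weight mixture of ${\cal N}(A, N)$ and ${\cal N}(-A, N)$, and its entropy can be computed by quadrature and compared with the variance bound.

```python
import math

def mixture_entropy(A, N, steps=200001):
    """Differential entropy (nats) of W = X + Z where X = +/-A with
    probability 1/2 each and Z ~ N(0, N), i.e. W has an equal-weight
    two-Gaussian mixture pdf. Trapezoidal integration of -f log f
    over +/-10 standard deviations past the means; a numerical
    sketch, not a closed form."""
    s = math.sqrt(N)
    lo, hi = -A - 10 * s, A + 10 * s
    h = (hi - lo) / (steps - 1)

    def f(w):
        g = lambda m: math.exp(-(w - m) ** 2 / (2 * N)) / math.sqrt(2 * math.pi * N)
        return 0.5 * (g(A) + g(-A))

    total = 0.0
    for i in range(steps):
        fw = f(lo + i * h)
        term = -fw * math.log(fw) if fw > 0 else 0.0
        weight = 0.5 if i in (0, steps - 1) else 1.0
        total += weight * term * h
    return total

A, N = 1.0, 1.0
hW = mixture_entropy(A, N)
bound = 0.5 * math.log(2 * math.pi * math.e * (A ** 2 + N))
# hW lies strictly below the variance bound, since the mixture's
# variance is exactly A^2 + N but the mixture is non-Gaussian.
print(hW, bound)
```

Since the two-delta input attains the maximal variance $A^2$, the gap between the printed values shows how loose the variance bound is for this candidate input.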

  • Thanks for your response. The paper by Raginsky refers to an old paper by Smith, which gives an algorithm to numerically compute the maximum $I(X;W)$. An analysis of this algorithm could give an upper bound, but it doesn't look easy! Commented Aug 8, 2014 at 17:53
