
Let $Z^t = (Y_1,\ldots,Y_t)$ be a sequence of random variables, each taking values in $Y$. The random variables are not necessarily i.i.d., but we know the joint distributions, i.e., for every $z = (z_1,\ldots,z_t)$ we know $P^{Z^t}(z)$.

The min-entropy of a random variable $X$ is defined as $-\log_2(\max_x P(x))$, where the maximum is taken over the value set of $X$.

Finally, we define a sequence of values $H^\infty(t) = -\log_2(\max_z P^{Z^t}(z))$ for $Z^t$ defined as above.

How can we show that $H^\infty(t)$ is monotonically increasing in $t$? That is, for $t' \geq t$ it is the case that $H^\infty(t') \geq H^\infty(t)$.

What I am trying, without success, is to show that $\max_{z^t} P^{Z^t}(z^t) \geq \max_{z^{t'}} P^{Z^{t'}}(z^{t'})$ where $t \leq t'$.
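As a numerical sanity check (not a proof), one can compute $H^\infty(t)$ exhaustively for a small non-i.i.d. example. The sketch below uses a hypothetical two-state Markov chain (states, start distribution, and transition matrix are arbitrary choices for illustration) and verifies that the resulting sequence $H^\infty(1), H^\infty(2), \ldots$ is nondecreasing:

```python
import itertools
import math

# Hypothetical example: a two-state Markov chain, so the Y_i are not i.i.d.
# The start distribution and transition matrix are arbitrary.
start = [0.6, 0.4]
trans = [[0.7, 0.3], [0.2, 0.8]]

def joint_prob(z):
    """P^{Z^t}(z) for a tuple z = (z_1, ..., z_t) under the chain above."""
    p = start[z[0]]
    for a, b in zip(z, z[1:]):
        p *= trans[a][b]
    return p

def min_entropy(t):
    """H_inf(t) = -log2 of the largest joint probability over all t-tuples."""
    return -math.log2(max(joint_prob(z)
                          for z in itertools.product([0, 1], repeat=t)))

hs = [min_entropy(t) for t in range(1, 7)]
print(hs)
assert all(a <= b for a, b in zip(hs, hs[1:]))  # nondecreasing in t
```

The assertion passes because appending one more coordinate multiplies each joint probability by a factor at most $1$, so the maximum can only shrink.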

  • If $Y$ is not IID, then your equation for $H^\infty$ doesn't apply. It's not that easy, and a common mistake. It will overestimate the entropy in some weird proportion to the autocorrelation. Consider: if there's a close but diminishing relationship over many $n$ in $Y_n$, what is $y$ exactly? This equation only applies to IID variables. Commented Apr 3, 2019 at 20:58
  • Sorry, the notation is a bit wonky. In this case, $Y$ is the range of the random variables; the tuple $(Y_1,\ldots,Y_t)$ is a random variable with range $Y^t$ for which we know the joint distribution. Finally, $y$ is an element of $Y^t$, in which case the min-entropy as written should be well defined... I hope :) Commented Apr 3, 2019 at 21:00
  • But I am certain that $y \in Y$. I will edit the notation to make it clearer. Thanks, and sorry for the terrible formulation :) Commented Apr 3, 2019 at 21:17
  • I'm saying that in the specific case of non-IID data as you suggest, $H^\infty(t) \neq -\log_2(\max P(y))$ as long as $y \in Y$. This equation (as well as $H^{sh}$) only applies to IID variables. The real $H^\infty$ will be lower, perhaps much lower depending on the strength of the autocorrelation. It's common to drop the non-IID assumption in these situations. Otherwise you end up in a world of Markov chains and pains. Commented Apr 3, 2019 at 21:33
  • I have modified the question to make it clearer; have a look. In my case i.i.d. is not really a concern: my random variables are just ordinary random variables defined over a 'tuple space', so to say. In that case even Shannon entropy applies (checking on Wikipedia). Am I missing something? Commented Apr 3, 2019 at 21:42

2 Answers


Let's make it slightly simpler by proving the case of two variables with no tuple indexing to clutter it up. Fix two random variables $X$ and $Y$. Is $H_\infty[X] \leq H_\infty[(X, Y)]$?

For each $x$, we have $$\Pr[X = x] = \sum_y \Pr[X = x, Y = y].$$ Note that since probability masses are nonnegative, $\max \Pr[\cdots] \leq \sum \Pr[\cdots]$; the sense of the inequality then gets reversed because $p \mapsto -\log p$ is a decreasing function:

\begin{align} H_\infty[X] &= -\log \max_x \Pr[X = x] \\ &= -\log \max_x \sum_y \Pr[X = x, Y = y] \\ &\leq -\log \max_x \max_y \Pr[X = x, Y = y] \\ &= -\log \max_{x,y} \Pr[X = x, Y = y] \\ &= H_\infty[(X, Y)]. \end{align}

Then to prove that $t \mapsto H_\infty[(Z_1, \dots, Z_t)]$ is increasing, take $X = (Z_1, \dots, Z_t)$ and $Y = Z_{t+1}$.
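The inequality $H_\infty[X] \leq H_\infty[(X, Y)]$ can also be checked numerically on a small example. This sketch (the joint distribution is just randomly generated for illustration) computes both sides by brute force:

```python
import math
import random

random.seed(1)

# A random joint distribution Pr[X = x, Y = y] over a small product space.
nx, ny = 4, 3
w = [[random.random() for _ in range(ny)] for _ in range(nx)]
total = sum(sum(row) for row in w)
joint = [[v / total for v in row] for row in w]

# Marginal Pr[X = x] = sum over y of Pr[X = x, Y = y].
marginal = [sum(row) for row in joint]

h_x = -math.log2(max(marginal))                     # H_inf[X]
h_xy = -math.log2(max(max(row) for row in joint))   # H_inf[(X, Y)]

assert h_x <= h_xy
print(h_x, h_xy)
```

The assertion holds for any joint distribution: each marginal mass is a sum of joint masses, so the largest marginal mass is at least the largest joint mass.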

  • Accepted answer for simplicity, clarity, and didactic value. :) Commented Apr 20, 2019 at 6:58

I think I've found the solution.

Let $z = (z_1,\ldots,z_{t+1})$ be the value that maximizes $P^{Z^{t+1}}[\cdot]$, and call its probability $P_z$. Let $x = (x_1,\ldots,x_t)$ be the value that maximizes $P^{Z^{t}}[\cdot]$, with probability $P_x$. Finally, let $P_y = \sum_{l \in Y} P^{Z^{t+1}}[(z_1,\ldots, z_t,l)]$; this is the probability of $y = (z_1,\ldots,z_t)$, i.e., the first $t$ elements of $z$.

It then follows that $P_z \leq P_y \leq P_x$: the first inequality holds because $P_z$ is one of the (nonnegative) terms in the sum defining $P_y$, and the second because $P_x$ is the maximum over all $t$-tuples, of which $y$ is one. Since $-\log_2$ is decreasing, $H^\infty(t) = -\log_2 P_x \leq -\log_2 P_z = H^\infty(t+1)$, which shows that $H^\infty(t)$ is monotonically increasing.

