Timeline for Monotonicity of min-entropy
Current License: CC BY-SA 4.0
12 events
| when | event | what | by | license | comment |
|---|---|---|---|---|---|
| Apr 20, 2019 at 3:30 | history | edited | Squeamish Ossifrage | CC BY-SA 4.0 | operatorman to the rescue |
| Apr 10, 2019 at 23:21 | vote | accept | Marc Ilunga | | |
| Apr 4, 2019 at 0:39 | answer | added | Squeamish Ossifrage | | timeline score: 3 |
| Apr 4, 2019 at 0:24 | answer | added | Marc Ilunga | | timeline score: 1 |
| Apr 3, 2019 at 21:42 | comment | added | Marc Ilunga | | I have modified the question to make it clearer; have a look. So in my case i.i.d. is not really a concern: my random variables are just normal random variables defined over a 'tuple space', so to speak. In that case even Shannon entropy applies (checking on Wikipedia). Am I missing something? |
| Apr 3, 2019 at 21:33 | history | edited | Marc Ilunga | CC BY-SA 4.0 | added 364 characters in body |
| Apr 3, 2019 at 21:33 | comment | added | Paul Uszak | | I'm saying that in the specific case of non-IID data as you suggest, $H^\infty(t) \neq -\log_2(\max P(y))$ as long as $y \in Y$. This equation (as well as $H^{sh}$) only applies to IID variables. The real $H^\infty$ will be lower, perhaps much lower depending on the strength of the autocorrelation. It's common to drop the non-IID assumption in these situations; otherwise you end up in a world of Markov chains and pains. |
| Apr 3, 2019 at 21:17 | comment | added | Marc Ilunga | | But I am certain that $y \in Y$. I will edit the notation to make it clearer. Thanks, and sorry for the terrible formulation :) |
| Apr 3, 2019 at 21:05 | history | edited | kodlu | CC BY-SA 4.0 | math formatting etc |
| Apr 3, 2019 at 21:00 | comment | added | Marc Ilunga | | Sorry, the notation is a bit wonky... So in this case, $Y$ is the range of the random variables; the tuple $(Y_1, \dots, Y_t)$ would be a random variable with range $Y^t$, for which we know the joint distribution. Finally, $y$ is an element of $Y^t$, in which case the min-entropy as written should be well defined... I hope :) |
| Apr 3, 2019 at 20:58 | comment | added | Paul Uszak | | If $Y$ is not IID, then your equation for $H^\infty$ doesn't apply. It's not that easy, and this is a common mistake. It will overestimate the entropy in some weird proportion to the autocorrelation. Consider: if there's a close but diminishing relationship over many $n$ in $Y_n$, what's $y$ exactly? This equation only applies to IID variables. (See the sketch after the timeline.) |
| Apr 3, 2019 at 20:37 | history | asked | Marc Ilunga | CC BY-SA 4.0 | |
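The disagreement in the comments comes down to how $H^\infty$ is evaluated for a tuple $(Y_1, \dots, Y_t)$: on the known joint distribution over $Y^t$ (Marc Ilunga's reading) or on per-variable marginals under an independence assumption (the mistake Paul Uszak warns about). A minimal Python sketch, using a made-up joint distribution rather than anything from the question, of how the independence assumption overestimates min-entropy when the variables are correlated:

```python
from math import log2
from itertools import product

def min_entropy(dist):
    """Min-entropy H_inf = -log2(max outcome probability) of a distribution {outcome: prob}."""
    return -log2(max(dist.values()))

# Hypothetical joint distribution of (Y1, Y2) on {0,1}^2 with strong positive correlation.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

# Both marginals are uniform on {0, 1}.
marg1 = {0: 0.5, 1: 0.5}
marg2 = {0: 0.5, 1: 0.5}

# Min-entropy of the actual joint distribution over the tuple space Y^2.
h_joint = min_entropy(joint)  # -log2(0.45) ~ 1.15 bits

# Min-entropy obtained if the variables are (wrongly) treated as independent.
assumed_iid = {(a, b): marg1[a] * marg2[b] for a, b in product(marg1, marg2)}
h_iid = min_entropy(assumed_iid)  # -log2(0.25) = 2 bits

print(f"H_inf over the joint distribution: {h_joint:.3f} bits")
print(f"H_inf assuming independence:       {h_iid:.3f} bits (overestimate)")
```

Here $-\log_2 0.45 \approx 1.15$ bits, while the independence assumption reports $2$ bits; evaluating $H^\infty$ directly on the known joint distribution over $Y^t$, as the edited question does, avoids the overestimate.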