
In “Randomness Condensers for Efficiently Samplable, Seed-Dependent Sources” by Dodis, Ristenpart, and Vadhan (PDF), I have seen the statement that:

any tuple of distributions $(X,Z)$ is $\varepsilon$-close to some $(X', Z)$ such that $$H_\infty(X'\mid Z) \geq H_2(X\mid Z) - \log(1/\varepsilon),$$ offering a bound on min-entropy which may be better than $H_\infty(X'\mid Z) \geq H_2(X\mid Z)/2$.

I cannot see why this is the case. Could anyone please shed some light? Does it involve the Leftover Hash Lemma?


1 Answer


The proof (courtesy of M. Skorski) relies on the so-called "smoothing" technique [Cac97,RW,Sko15].

For some $0<\epsilon<1$, let $B$ be the set of heaviest points of $X$ whose mass adds up to $\epsilon$, i.e., $$\sum_{x\in B} p_X(x)=\epsilon \quad\text{and}\quad \min_{x\in B}p_X(x)\geq\max_{x\in\bar{B}}p_X(x).$$ Let $X'$ denote the distribution of $X$ conditioned on $\bar{B}$, i.e., $$ p_{X'}(x):= \begin{cases} 0 & x\in B,\\ \frac{p_{X}(x)}{1-\epsilon} & x\in\bar{B}. \end{cases} $$ Note that the statistical distance between $X$ and $X'$ is $\Delta(X,X')=\epsilon$.$^1$ Let $m:=\max_{x\in\bar{B}}p_X(x)$.

Now consider the collision entropy of $X$: $$H_2(X):=-\log\sum_{x\in X} p_X^2(x)\leq-\log\sum_{x\in B} p_X^2(x)\overset{2}{\leq}-\log (m\epsilon).$$ It follows that $-\log m\geq H_2(X)-\log(1/\epsilon)$, and since $H_\infty(X')=-\log\frac{m}{1-\epsilon}=-\log m-\log\frac{1}{1-\epsilon}$, the claimed bound follows up to a small additive $\log\frac{1}{1-\epsilon}$ term coming from the renormalisation.
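If it helps to see the construction on a concrete example, here is a small Python sketch (mine, not from the paper). It assumes $B$ is taken to be the $k$ heaviest points, so that $\epsilon$ is exactly their total mass; the pmf and $k$ are arbitrary choices for illustration. It checks the derived inequality $-\log m \geq H_2(X)-\log(1/\epsilon)$ numerically and prints $H_\infty(X')$ for comparison.

```python
import math

# Toy numerical sanity check of the argument above (not code from the paper).
# Assumption: B consists of the k heaviest points, so eps is their exact mass.
p_X = [0.30, 0.20, 0.15, 0.10, 0.10, 0.05, 0.05, 0.05]   # made-up pmf of X
assert abs(sum(p_X) - 1.0) < 1e-9

k = 2                                     # B = the k heaviest points
p_sorted = sorted(p_X, reverse=True)
B, B_bar = p_sorted[:k], p_sorted[k:]     # min(B) >= max(B_bar) by sorting
eps = sum(B)                              # mass of B, here eps = 0.5
m = max(B_bar)                            # heaviest point outside B

H2 = -math.log2(sum(p * p for p in p_X))  # collision entropy H_2(X)
Hinf_Xprime = -math.log2(m / (1 - eps))   # min-entropy of X' (X conditioned on B_bar)

# Chain of inequalities from the answer:
#   H_2(X) <= -log(sum_{x in B} p_X(x)^2) <= -log(m * eps),
# hence -log(m) >= H_2(X) - log(1/eps).
assert -math.log2(m) >= H2 - math.log2(1 / eps)

print(f"H_2(X)              = {H2:.3f}")
print(f"H_2(X) - log(1/eps) = {H2 - math.log2(1 / eps):.3f}")
print(f"H_inf(X')           = {Hinf_Xprime:.3f}")
```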

A similar argument can be made for the conditional case, i.e., for a joint distribution $(X,Z)$.

References:

- [Cac97] C. Cachin, "Smooth Entropy and Rényi Entropy," EUROCRYPT '97.
- [RW] R. Renner and S. Wolf, "Smooth Rényi Entropy and Its Applications."
- [Sko15] M. Skorski, "How to Smooth Entropy?," SOFSEM '16.

Footnotes:

1. Proof: \begin{align} \Delta(X,X') &=\frac{1}{2}\cdot\sum_{x\in X}\left|p_X(x)-p_{X'}(x)\right|\\ &=\frac{1}{2}\cdot\left(\sum_{x\in B}p_X(x)+\sum_{x\in \bar{B}}\left|p_X(x)-p_{X'}(x)\right|\right)\\ &=\frac{1}{2}\cdot\left(\epsilon+ \sum_{x\in \bar{B}}p_X(x)\left|1-\frac{1}{1-\epsilon}\right|\right)\\ &=\frac{1}{2}\cdot\left(\epsilon+ (1-\epsilon)\left|1-\frac{1}{1-\epsilon}\right|\right)=\epsilon. \end{align}
2. Proof: $\sum_{x\in B} p_X^2(x)\geq \sum_{x\in B}m\cdot p_X(x)= m\cdot\epsilon$, since $p_X(x)\geq m$ for every $x\in B$.
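And a similarly hedged one-off check of footnote 1 (same made-up pmf and choice of $k$ as in the sketch above), confirming that removing the heaviest $\epsilon$-mass set and renormalising on its complement gives statistical distance exactly $\epsilon$:

```python
# Toy check of footnote 1 (pmf and k are made up): conditioning X on the
# complement of the heaviest eps-mass set B gives Delta(X, X') = eps.
p = sorted([0.30, 0.20, 0.15, 0.10, 0.10, 0.05, 0.05, 0.05], reverse=True)
k = 2                                   # B = the k heaviest points
eps = sum(p[:k])                        # eps = 0.5 for this choice

p_prime = [0.0] * k + [q / (1 - eps) for q in p[k:]]   # pmf of X'
delta = 0.5 * sum(abs(a - b) for a, b in zip(p, p_prime))

assert abs(delta - eps) < 1e-12         # Delta(X, X') = eps, as claimed
print(f"Delta(X, X') = {delta:.3f}, eps = {eps:.3f}")
```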

