The proof (courtesy of M. Skorski) relies on the so-called "smoothing" technique [Cac97,RW,Sko15].

For some $0<\epsilon<1$, let $B$ be the subset of heaviest points in $X$ whose total mass adds up to $\epsilon$ -- i.e.: $$\sum_{x\in B} p_X(x)=\epsilon, \text{ and } \min_{x\in B}p_X(x)\geq\max_{x\in\bar{B}}p_X(x).$$ Let $X'$ denote the conditional distribution of $X$ on $\bar{B}$ -- i.e.: $$ p_{X'}(x):= \begin{cases} 0 & x\in B,\\ \frac{p_{X}(x)}{(1-\epsilon)} & x\in\bar{B}. \end{cases} $$ Note that the statistical distance between $X$ and $X'$ is $\Delta(X,X')=\epsilon$.$^1$ Let $m:=\max_{x\in\bar{B}}p_X(x)$. Now consider the collision entropy of $X$: $$H_2(X):=-\log\sum_{x\in X} p_X^2(x)\leq-\log\sum_{x\in B} p_X^2(x)\leq^2-\log (m\epsilon).$$ It follows that $-\log(m)\geq H_2(X)-\log(1/\epsilon)$, and the result follows since $H_\infty(X')=-\log(m/(1-\epsilon))$.
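As a sanity check, here is a minimal numerical sketch of the construction on a toy distribution: it builds $B$ and $X'$, and verifies $\Delta(X,X')=\epsilon$, the inequality in footnote 2, and the derived bound $-\log(m)\geq H_2(X)-\log(1/\epsilon)$. The particular distribution, the choice of $k$, and the use of base-2 logarithms are illustrative assumptions; $\epsilon$ is taken to be exactly the mass of the top $k$ points so that $\sum_{x\in B}p_X(x)=\epsilon$ holds with equality.

```python
from math import log2

# Toy distribution p_X, already sorted from heaviest to lightest point.
p_X = [0.4, 0.2, 0.15, 0.1, 0.1, 0.05]

k = 2                          # B = the k heaviest points
eps = sum(p_X[:k])             # epsilon = mass of B (here 0.6)

# X': condition X on the complement of B and renormalise.
p_Xp = [0.0] * k + [p / (1 - eps) for p in p_X[k:]]

# Footnote 1: statistical distance Delta(X, X') = epsilon.
delta = 0.5 * sum(abs(p - q) for p, q in zip(p_X, p_Xp))
assert abs(delta - eps) < 1e-12

m = max(p_X[k:])               # m = heaviest point outside B

# Footnote 2: sum_{x in B} p_X(x)^2 >= m * epsilon.
assert sum(p * p for p in p_X[:k]) >= m * eps

# Entropies (in bits) and the bound -log(m) >= H_2(X) - log(1/eps).
H2_X = -log2(sum(p * p for p in p_X))
H_inf_Xp = -log2(m / (1 - eps))
assert -log2(m) >= H2_X - log2(1 / eps)

print(f"eps = {eps:.2f}, Delta(X,X') = {delta:.2f}")
print(f"H_2(X) = {H2_X:.3f}, -log m = {-log2(m):.3f}, H_inf(X') = {H_inf_Xp:.3f}")
```

With these numbers, $H_2(X)\approx 2.03$, $-\log m\approx 2.74$ and $H_\infty(X')\approx 1.42$ bits, consistent with the inequalities above.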

A similar argument can be made for a joint distribution too.

References:

[Cac97]: C. Cachin. Smooth Entropy and Rényi Entropy. EUROCRYPT '97.
[RW]: R. Renner and S. Wolf. Smooth Rényi Entropy and its Applications.
[Sko15]: M. Skorski. How to Smooth Entropy? SOFSEM '16.

Footnotes:

  1. Proof: \begin{align} \Delta(X,X') &=\frac{1}{2}\cdot\sum_{x\in X}\left|p_X(x)-p_{X'}(x)\right|\\ &=\frac{1}{2}\cdot\left(\sum_{x\in B}\left|p_X(x)\right|+\sum_{x\in \bar{B}}\left|p_X(x)-p_{X'}(x)\right|\right)\\ &=\frac{1}{2}\cdot\left(\epsilon+ \sum_{x\in \bar{B}}p_X(x)\left|1-\frac{1}{1-\epsilon}\right|\right)\\ &=\frac{1}{2}\cdot\left(\epsilon+ (1-\epsilon)\left|1-\frac{1}{1-\epsilon}\right|\right)=\epsilon \end{align}
  2. Proof: $\sum_{x\in B} p_X^2(x)\geq \sum_{x\in B}m\cdot p_X(x)= m\cdot\epsilon$, since $p_X(x)\geq m$ for all $x\in B$.
