$\begingroup$

Let $X$ be a measurable space. For each $n\in\mathbb N$, let $P_n$ and $Q_n$ be probability measures on $X$. We say that $(P_n)_{n\in\mathbb N}$ and $(Q_n)_{n\in\mathbb N}$ are statistically indistinguishable iff for every measurable set $E\subseteq X$, the function \begin{equation} n\mapsto |P_n(E) - Q_n(E)| \end{equation} is negligible.

But what if we allow "randomness"? Let's say that $(P_n)_{n\in\mathbb N}$ and $(Q_n)_{n\in\mathbb N}$ are randomly statistically indistinguishable (I just made up this terminology) iff for every measurable space $Y$, every family $(R_n)_{n\in\mathbb N}$ of probability measures on $Y$, and every measurable set $E\subseteq X\times Y$, the function \begin{equation} n\mapsto |(P_n\times R_n)(E) - (Q_n\times R_n)(E)| \end{equation} is negligible.

Random statistical indistinguishability clearly implies statistical indistinguishability. But is the converse true?
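For intuition on the first definition, here is a small Python sanity check on a toy finite space of my own choosing (not from the post): the supremum of $|P(E)-Q(E)|$ over all events $E$, i.e. the statistical distance, equals half the $L^1$ distance between the probability vectors.

```python
from itertools import chain, combinations

# Toy finite sample space; P and Q are explicit probability vectors.
X = [0, 1, 2]
P = {0: 0.5, 1: 0.3, 2: 0.2}
Q = {0: 0.4, 1: 0.4, 2: 0.2}

def events(space):
    """All subsets E of the (finite) space."""
    return chain.from_iterable(combinations(space, r) for r in range(len(space) + 1))

# sup_E |P(E) - Q(E)|, taken over every event E (all sets are measurable here).
sup_gap = max(abs(sum(P[x] for x in E) - sum(Q[x] for x in E)) for E in events(X))

# For finite spaces this supremum equals half the L1 distance.
half_l1 = 0.5 * sum(abs(P[x] - Q[x]) for x in X)

assert abs(sup_gap - half_l1) < 1e-12
print(round(sup_gap, 12))
```

The supremum is attained at the event where $P$ dominates $Q$ (here $E=\{0\}$), which is the standard reason the two formulations of statistical distance agree.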

$\endgroup$
  • $\begingroup$ Possible duplicate of crypto.stackexchange.com/questions/73108/… $\endgroup$ Commented Mar 15, 2022 at 2:55
  • $\begingroup$ From my reading it seems like you are asking that if two probability distributions have marginals that are close, it implies the distributions are close (which is clearly false). Am I misunderstanding something? $\endgroup$ Commented Mar 15, 2022 at 3:08
  • $\begingroup$ Cross-posted with math.se $\endgroup$ Commented Mar 15, 2022 at 22:40

1 Answer

$\begingroup$

A big caveat: I'm not a probabilist, and your question really doesn't involve much cryptography, so it might be better suited for probabilists somewhere (say on math.se or similar).

As mentioned in the comments, this is easily false. Let $P_n$ and $Q_n$ both have the same symmetric distribution on $X$, and let $R_n$ be uniform on $Y = \{-1,1\}$. Define joint distributions $P_n\times R_n$ and $Q_n\times R_n$ as follows (abusing the product notation for what is really a coupling): the marginals on both $X$ and $Y$ are fixed as above, and the joint law decomposes by conditioning on $R_n$,

$$\Pr[(P_n\times R_n)\in E] = \sum_{b\in\{-1,1\}}\Pr[(P_n\times R_n)\in E\mid R_n = b]\Pr[R_n = b].$$

Now, as we are discussing symmetry, write $X = X_1\sqcup X_{-1}$ as a disjoint union, and assume the symmetry swaps these two components, so $P_n(X_1) = P_n(X_{-1}) = 1/2$. We now define the conditional distributions

$$\Pr[(P_n\times R_n)\in E\mid R_n = b] = 2\,P_n(E_b\cap X_b), \qquad E_b := \{x\in X : (x,b)\in E\},$$

where the factor $2$ normalizes for $P_n(X_b) = 1/2$.

This is to say the conditional distribution is defined such that a sample with $R_n = b$ always lands in $X_b$, i.e. the components $P_n, R_n$ are "perfectly correlated". For $Q_n$, do the same, but reverse the roles of $X_1$ and $X_{-1}$, i.e. make $Q_n, R_n$ "perfectly anti-correlated".

It is straightforward to check that the two joint distributions have identical marginals on $X$ and on $Y$, so the marginals are perfectly indistinguishable (and hence statistically indistinguishable as well). It is also straightforward to see that the joint distributions $P_n\times R_n$ and $Q_n\times R_n$ have disjoint supports, so

$$\Delta(P_n, Q_n) = 0 \quad\text{while}\quad \Delta(P_n\times R_n,\, Q_n\times R_n) = 1,$$

and therefore they are not randomly statistically indistinguishable.
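The counterexample can be checked numerically. Here is a minimal Python sketch with the simplest instantiation (my choice, consistent with the construction above): $P_n = Q_n$ uniform on $X = \{-1,1\}$, $X_b = \{b\}$, and $R_n$ uniform on $\{-1,1\}$.

```python
from itertools import product

# Symmetric base distribution on X = {-1, +1}: P_n = Q_n = uniform.
# R_n is uniform on {-1, +1} as well.
xs = [-1, 1]

# Joint for the P-side: R_n perfectly correlated with the sample.
joint_P = {(x, b): (0.5 if x == b else 0.0) for x, b in product(xs, xs)}
# Joint for the Q-side: same marginals, but perfectly anti-correlated.
joint_Q = {(x, b): (0.5 if x == -b else 0.0) for x, b in product(xs, xs)}

# Marginals on X and on Y agree exactly ...
for x in xs:
    assert sum(joint_P[(x, b)] for b in xs) == sum(joint_Q[(x, b)] for b in xs)
for b in xs:
    assert sum(joint_P[(x, b)] for x in xs) == sum(joint_Q[(x, b)] for x in xs)

# ... yet the joints have disjoint supports, so their statistical distance is 1.
tv = 0.5 * sum(abs(joint_P[k] - joint_Q[k]) for k in joint_P)
print(tv)  # 1.0
```

So the event $E = \{(x,b) : x = b\}$ has probability $1$ under one joint and $0$ under the other, a non-negligible (indeed maximal) gap.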

Note that if you assume $P_n, R_n$ are independent (in your language, $E$ ranges over products $E_X\times E_Y$, I think), the converse is easily true. As a proof sketch, by the data-processing inequality we have $\Delta(P_n, Q_n) \geq \Delta(f(P_n), f(Q_n))$ for any randomized $f$, including the $f : X\to X\times Y$ that samples $R_n$ independently and outputs $f(x) = (x, R_n)$. This isn't what you asked, but it's still useful to note.
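A quick numeric illustration of the independent case, on toy distributions of my own choosing: pairing both $P_n$ and $Q_n$ with the same independent $R_n$ leaves the statistical distance exactly unchanged, consistent with the data-processing bound (which only promises it cannot increase).

```python
# Toy discrete distributions (hypothetical, for illustration only).
P = {0: 0.7, 1: 0.3}
Q = {0: 0.5, 1: 0.5}
R = {'a': 0.6, 'b': 0.4}

def tv(d1, d2):
    """Statistical (total variation) distance between discrete distributions."""
    return 0.5 * sum(abs(d1[k] - d2[k]) for k in d1)

# Genuine product measures: R is sampled independently of P (resp. Q).
PR = {(x, y): P[x] * R[y] for x in P for y in R}
QR = {(x, y): Q[x] * R[y] for x in Q for y in R}

# The independent component factors out of the L1 sum, so the distance is preserved.
assert abs(tv(P, Q) - tv(PR, QR)) < 1e-12
print(round(tv(P, Q), 12))
```

This is why the counterexample above had to correlate $R_n$ with the sample: with true product measures, the extra component carries no distinguishing information.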

$\endgroup$
