
Suppose I have three mutually independent random variables $X_1$, $X_2$, $X_3$. Let $Y_1 = X_1-X_3$, and let $Y_2 = X_2-X_3$. Are $Y_1$ and $Y_2$ dependent or independent?

Edit 1: If this is not true in general, what extra assumptions would need to be added to guarantee independence of $Y_1$ and $Y_2$?

Edit 2: Are there generalizations of this concept to a collection of $n$ mutually independent random variables? By this I mean the following: take $X_1, \dots, X_n$ mutually independent random variables $(n\in \mathbb{N}, n>3)$, and let $Y_i = X_i - X_n$, $i\in S = \{1,\dots,n-1\}$. Under which conditions would the $\{Y_i\}_S$ be mutually independent?


1 Answer


They may be dependent. Take $X_1 = X_2 = 0$ (a constant random variable is independent of everything, so $X_1, X_2, X_3$ are still mutually independent) and let $X_3$ be a Bernoulli random variable. Then $Y_1 = Y_2 = -X_3$, which are clearly dependent.
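A quick simulation (a sketch using only Python's standard library; the Bernoulli parameter $1/2$ is an illustrative assumption) shows the dependence: $P(Y_1 = 0, Y_2 = 0)$ comes out near $1/2$, not near the product $P(Y_1 = 0)\,P(Y_2 = 0) \approx 1/4$.

```python
import random

random.seed(1)
N = 100_000

# Counterexample: X1 = X2 = 0 (constants, hence independent of anything),
# X3 ~ Bernoulli(1/2). Then Y1 = Y2 = -X3.
hits_y1 = hits_y2 = hits_both = 0
for _ in range(N):
    x3 = random.randint(0, 1)
    y1 = 0 - x3  # X1 - X3
    y2 = 0 - x3  # X2 - X3
    hits_y1 += (y1 == 0)
    hits_y2 += (y2 == 0)
    hits_both += (y1 == 0 and y2 == 0)

p1, p2, p12 = hits_y1 / N, hits_y2 / N, hits_both / N
# Independence would require p12 ~ p1 * p2 (~0.25); instead p12 ~ 0.5.
print(p1 * p2, p12)
```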

There is a standard notion of "$n$ mutually independent random variables", which can be found in any textbook on probability.

Addition: if $Y_1$ and $Y_2$ are independent, then the characteristic function of $(X_1 - X_3, X_2 - X_3)$ factorizes:

$$Ee^{i \bigl( a(X_1-X_3) + b(X_2-X_3) \bigr) } = Ee^{i a(X_1-X_3)} \cdot Ee^{i b(X_2-X_3)}. \ \ \ (\star)$$

As $X_1, X_2, X_3$ are independent, the left-hand side of $(\star)$ is
$$Ee^{i \bigl( a X_1 + b X_2 -(a+b) X_3 \bigr) }=Ee^{i a X_1}\, Ee^{i b X_2}\, Ee^{i ( -(a+b) X_3)},$$
and the right-hand side of $(\star)$ is
$$Ee^{i a X_1 }\, Ee^{i (-a)X_3}\, Ee^{i b X_2}\, Ee^{i (-b)X_3}.$$
Thus
$$Ee^{i a X_1}\, Ee^{i b X_2}\, Ee^{i ( -(a+b) X_3)} = Ee^{i a X_1 }\, Ee^{i (-a)X_3}\, Ee^{i b X_2}\, Ee^{i (-b)X_3}.$$

Suppose that $Ee^{i a X_1} = 0$ only for $a$ in a set of measure $0$, and likewise that $Ee^{i b X_2} = 0$ only for $b$ in a set of measure $0$. These assumptions hold for all common distributions. Cancelling those factors, we get
$$Ee^{i ( -(a+b) X_3)} = Ee^{i (-a)X_3}\, Ee^{i (-b)X_3}$$
for almost all $a$ and $b$. Characteristic functions are continuous, hence the identity holds for all $a$ and $b$. Substituting $t = -a$, $s = -b$ gives
$$Ee^{i (t+s) X_3} = Ee^{i t X_3}\, Ee^{i s X_3}$$
for all $t$ and $s$. This says that the characteristic function of the vector $(X_3, X_3)$ equals the product of the characteristic functions of its components. Hence $X_3$ and $X_3$ are independent, which forces $X_3 = \text{const}$.
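The obstruction is already visible at the level of second moments: by bilinearity, $\operatorname{Cov}(Y_1, Y_2) = \operatorname{Cov}(X_1 - X_3, X_2 - X_3) = \operatorname{Var}(X_3)$, which is nonzero unless $X_3$ is constant. A minimal numerical sketch (standard-library Python; taking $X_1, X_2, X_3$ standard normal is an illustrative assumption, so $\operatorname{Var}(X_3) = 1$):

```python
import random

random.seed(2)
N = 200_000

# X1, X2, X3 independent standard normals; Cov(Y1, Y2) should equal Var(X3) = 1.
ys = []
for _ in range(N):
    x1, x2, x3 = random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)
    ys.append((x1 - x3, x2 - x3))

m1 = sum(y1 for y1, _ in ys) / N
m2 = sum(y2 for _, y2 in ys) / N
cov = sum((y1 - m1) * (y2 - m2) for y1, y2 in ys) / N
print(cov)  # close to Var(X3) = 1, so Y1 and Y2 cannot be independent
```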

If $X_3 = const$ then $Y_1$ and $Y_2$ are obviously independent.

So the condition you need is $X_3 = \text{const}$ (and, in the general setting of Edit 2, $X_n = \text{const}$).
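The converse direction can be checked exactly by enumeration (a sketch; the choice of Bernoulli $X_1, X_2$ and the constant value $c = 5$ are illustrative assumptions). With $X_3 = c$ constant, the joint law of $(Y_1, Y_2)$ factorizes exactly into the product of the marginals:

```python
from fractions import Fraction
from itertools import product

# X1, X2 ~ Bernoulli(1/2) independent, X3 = c (constant).
c = 5
half = Fraction(1, 2)

# Enumerate the joint distribution of (Y1, Y2) = (X1 - c, X2 - c).
joint = {}
for x1, x2 in product((0, 1), repeat=2):
    y = (x1 - c, x2 - c)
    joint[y] = joint.get(y, Fraction(0)) + half * half

# Marginal distributions of Y1 and Y2.
p_y1, p_y2 = {}, {}
for (y1, y2), p in joint.items():
    p_y1[y1] = p_y1.get(y1, Fraction(0)) + p
    p_y2[y2] = p_y2.get(y2, Fraction(0)) + p

# Exact independence: joint probability equals product of marginals everywhere.
factorizes = all(joint[(y1, y2)] == p_y1[y1] * p_y2[y2] for (y1, y2) in joint)
print(factorizes)  # True
```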

Remark: there exist characteristic functions that vanish on a set of positive measure, e.g. $(1-|t|)I_{|t| < 1}$; this follows from Pólya's theorem on characteristic functions.

  • thank you for your response. please see edits to the question that clarify its intent. Commented Oct 15, 2021 at 20:12
  • @UlisesNunez, I made an addition. Commented Oct 15, 2021 at 22:05
  • Thanks very much. I appreciate the added detail. Commented Oct 16, 2021 at 23:29
  • You begin with $X_1 = X_2 = 0$, but this contradicts the OP's requirement that $X_1, X_2, X_3$ should be mutually independent. Commented May 3 at 21:46
