17 events
when toggle format what by license comment
Feb 27, 2023 at 15:27 comment added Adam Yes, $X_i = \pi_i \circ X$ looks right to me. @guest1
Feb 27, 2023 at 14:55 comment added guest1 So I have read a bit more about multivariate random variables, and in one book I found: $X=(X_1, ..., X_n)$, then $X\circ \pi_i =X_i$. But it should probably be $\pi_i\circ X =X_i$, no? Where $\pi_i$ is the $i$-th coordinate projection.
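The identity $X_i = \pi_i \circ X$ discussed above can be sketched in a few lines; the random vector and outcome below are toy values for illustration, not from the thread:

```python
# Sketch of X_i = pi_i ∘ X: the i-th component of a random vector is the
# composition of the vector with the i-th coordinate projection.
def X(omega):
    # toy random vector Omega -> R^3 (made-up mapping for illustration)
    return (omega, omega ** 2, omega + 1.0)

def pi(i):
    # i-th coordinate projection R^3 -> R (0-based index here)
    return lambda x: x[i]

# the i-th component variable is the composition pi_i ∘ X, not X ∘ pi_i:
X_1 = lambda omega: pi(1)(X(omega))
assert X_1(2.0) == 4.0  # second coordinate of X(2.0) = (2.0, 4.0, 3.0)
```

Note that $X \circ \pi_i$ would not even typecheck here: $\pi_i$ lands in $\mathbb R$, while $X$ expects a point of the sample space.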
Feb 27, 2023 at 9:45 vote accept guest1
Feb 27, 2023 at 9:34 comment added Adam Yes, the $X_i$ are all measurable functions $\mathbb R^k \to \mathbb R$. @guest1
Feb 27, 2023 at 9:31 comment added guest1 Hm, okay, but then at least their codomain, i.e., where they map to, should be $(\mathbb{R},\mathcal{B})$; otherwise it wouldn't be a $k$-dimensional random vector in the end, right?
Feb 27, 2023 at 9:29 comment added Adam Independence does depend on the measure. Events $A$ and $B$ are independent if $\mathbb P(A \cap B) = \mathbb P(A) \mathbb P(B)$. Whether this is true depends on $\mathbb P$. Your random variables $X_i$ are all defined on the same space; as the author tells you, it's $\mathbb R^k$. Unless otherwise specified, all random variables are assumed to be defined on a single probability space. @guest1
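Adam's point that independence depends on $\mathbb P$ can be checked concretely: take the two coordinate projections on $\{0,1\}^2$ and vary only the measure. The two measures below are made-up numbers for illustration:

```python
import itertools

# Two {0,1}-valued variables on the same space {0,1}^2 (the coordinate
# projections). Whether they are independent depends only on the measure P
# placed on that space; the variables themselves never change.
product_P = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
skewed_P  = {(0, 0): 0.40, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.40}

def is_independent(P):
    # check P(X1=a, X2=b) == P(X1=a) * P(X2=b) for all a, b
    marg1 = {a: sum(p for (x, _), p in P.items() if x == a) for a in (0, 1)}
    marg2 = {b: sum(p for (_, y), p in P.items() if y == b) for b in (0, 1)}
    return all(abs(P[a, b] - marg1[a] * marg2[b]) < 1e-12
               for a, b in itertools.product((0, 1), repeat=2))

print(is_independent(product_P))  # True: the product measure
print(is_independent(skewed_P))   # False: same variables, different measure
```

Under `skewed_P` both marginals are still uniform, yet $\mathbb P(0,0) = 0.40 \neq 0.25$, so independence fails.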
Feb 27, 2023 at 9:26 comment added guest1 Continued: so he does indeed seem to assume that each $X_i$ is defined on a "1D" space $(\mathbb{R}, \mathcal{B})$, no?
Feb 27, 2023 at 9:23 comment added guest1 So what still confuses me is that I am looking at a definition of a random vector in a book about probability theory, and there it starts with: let $X=(X_1, X_2, ..., X_k)$ be a random vector on $(\mathbb{R}^k, \mathcal{B}^k)$. So this tells me that each of the $k$ scalar-valued random variables $X_i$ is defined on only $(\mathbb{R}, \mathcal{B})$, i.e., on a different space, right? But then later he writes the following (with $A\in \mathcal{B}^1, k=2$): $\mathbb{P}_{X_1}(A)=\mathbb{P}(X_1^{-1}(A)) = \mathbb{P}(X_1^{-1}(A)\cap X_2^{-1}(\mathbb{R})) = \mathbb{P}_X(A\times \mathbb{R})$
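The book's identity $\mathbb{P}_{X_1}(A) = \mathbb{P}_X(A\times \mathbb{R})$ says the marginal of $X_1$ is read off the joint law by letting the second coordinate range over everything. For a discrete joint law this is just summing out the second coordinate; the pmf below uses toy numbers:

```python
# Marginalization per P_{X_1}(A) = P_X(A x R), with a discrete joint law
# of X = (X_1, X_2) on a 2x2 grid (made-up probabilities for illustration).
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def marginal_1(A):
    # P_X(A x R): keep the pairs whose first coordinate lies in A,
    # leaving the second coordinate unrestricted (the "x R" part)
    return sum(p for (x1, _), p in joint.items() if x1 in A)

p0 = marginal_1({0})      # P(X_1 = 0) = 0.1 + 0.3 = 0.4
p_all = marginal_1({0, 1})  # A x R is the whole space, so this is 1
```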
Feb 27, 2023 at 9:16 comment added guest1 So I mean independence does not depend on the measure, no? So two random variables are independent if their joint distribution is equal to the product measure of their marginals, right?
Feb 27, 2023 at 9:00 comment added Adam Yes, that's what I mean by projection, and $\mu \otimes \nu$ is the product measure, which can always be defined. In my construction, $X'$ and $Y'$ are independent, but they could be dependent if you choose a different measure on $U \times V$. (If $X$ and $Y$ are real, you could e.g. construct dependent $X'$ and $Y'$ using a copula.) @guest1
Feb 27, 2023 at 8:47 comment added guest1 Thank you! So by projection you mean that the mapping of the random variable stays the same but we change the domain (where we can, however, ignore the inputs that are not needed for the definition of our mapping)? And a question about the new space that you defined in your first comment: is $\mu \otimes \nu$ the product measure? Can we always define it when we create a new space like that? And does that mean that any joint distribution of $X'$ and $Y'$ will be independent?
Feb 27, 2023 at 8:28 comment added Adam You can always go from a variable $X'$ on $U \times V$ to a variable $X$ on $U$ by integrating out $V$; if the original variable $X'$ was a projection to $U$, then this will have the same effect as ignoring $V$. But $X$ is not $X'$. One is defined on $U \times V$, the other is defined on $U$. @guest1
Feb 27, 2023 at 8:16 comment added Adam Given $X$ defined on $(U, \mathcal U, \mu)$, and $Y$ defined on $(V, \mathcal V, \nu)$, you can always define $X'$ and $Y'$ on $(U \times V, \mathcal U \otimes \mathcal V, \mu \otimes \nu)$ by projection. If $X':U\times V \to U$ is a projection, then it is still defined on $U \times V$; you've misunderstood what "defined on" means. @guest1
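Adam's construction can be sketched on finite toy spaces (all names and numbers below are illustrative): define $X'(u,v) = X(u)$ on $U \times V$, and check that under the product measure $\mu \otimes \nu$ the lifted variable $X'$ has the same distribution as $X$, even though it is defined on a different space:

```python
# Lifting X from U to U x V by projection: X'(u, v) = X(u).
# Toy finite spaces and measures, purely for illustration.
U = [0, 1]
V = ["a", "b", "c"]
mu = {0: 0.3, 1: 0.7}
nu = {"a": 0.5, "b": 0.2, "c": 0.3}

X = lambda u: u * 10       # variable defined on U
Xp = lambda u, v: X(u)     # lifted variable on U x V: ignores v entirely

# distribution of X under mu
dist_X = {X(u): mu[u] for u in U}

# distribution of X' under the product measure mu ⊗ nu: the nu-weights
# sum to 1, so summing out V recovers exactly the mu-weights
dist_Xp = {}
for u in U:
    for v in V:
        dist_Xp[Xp(u, v)] = dist_Xp.get(Xp(u, v), 0.0) + mu[u] * nu[v]
# dist_X and dist_Xp assign the same probabilities, but X lives on U
# while X' lives on U x V -- they are different functions.
```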
Feb 27, 2023 at 8:12 comment added guest1 Maybe the opposite question is also interesting: if I have a joint distribution of some random variables $X$ and $Y$ on a common space $(\mathbb{R}^{k+l}, \mathcal{B}^{k+l})$, does it mean that $X$ and $Y$ will always also be defined on that common space $(\mathbb{R}^{k+l}, \mathcal{B}^{k+l})$? But what happens if $X$ is defined as $X=id_{\mathbb{R}^k}$ and $Y=id_{\mathbb{R}^l}$, i.e., they are the $k$- and $l$-dimensional identity maps, respectively? Wouldn't that imply that they are actually defined on $(\mathbb{R}^{k}, \mathcal{B}^{k})$ and $(\mathbb{R}^{l}, \mathcal{B}^{l})$, respectively?
Feb 27, 2023 at 8:06 comment added guest1 Thank you for your answer. I still have some questions: if I have two random variables $X$ and $Y$ defined on two different spaces $(\mathbb{R}^k, \mathcal{B}^k)$ and $(\mathbb{R}^l, \mathcal{B}^l)$, can I always pretend that they were defined on a common space $(\mathbb{R}^{k+l}, \mathcal{B}^{k+l})$? If so, does this hold only for spaces over the reals with the Borel σ-algebra? And does a joint distribution then always exist?
Feb 25, 2023 at 14:48 history edited Adam CC BY-SA 4.0
added 18 characters in body
Feb 25, 2023 at 14:35 history answered Adam CC BY-SA 4.0