Adding a dependent random variable to a standard normal variable without changing its distribution

Consider $N\sim\mathcal N(0,1)$, a standard normal random variable. Can we find a dependent random variable $D$ such that $M=N+D$ still has a standard normal distribution? I would also like the variance of $D$ to be typically smaller than $1$. $D$ would represent a kind of random noise that does not change the distribution when added.

If $D$ is independent of $N$ with mean $0$, then the mean of $M$ is still $0$, but the variance of $D$ is added to that of $N$. This is no longer true when the random variables are correlated. We might specify $D$ through its conditional distribution $f(D \mid N)$.
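
Making the dependence requirement explicit: for jointly distributed variables, the variance of the sum is

$$\operatorname{Var}(M)=\operatorname{Var}(N)+\operatorname{Var}(D)+2\operatorname{Cov}(N,D),$$

so keeping $\operatorname{Var}(M)=1$ with $\operatorname{Var}(D)>0$ would require $\operatorname{Cov}(N,D)=-\operatorname{Var}(D)/2$, i.e. $D$ must be negatively correlated with $N$.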

I have no idea how to solve this problem.