
I'm a beginner at signal processing and I've gotten the following question in an exercise:

"Write down the equation for a Gaussian probability density distribution and relate the different variables to the autocorrelation function of a Gaussian random signal"

The first part of the question seems pretty straightforward, and I write the Gaussian PDF as: $$ f_{x(t)}(\alpha) = \frac{1}{\sqrt{2 \pi}\,\sigma_x} \exp\bigg( - \frac{(\alpha-\mu_x)^2}{2 \sigma_x^2} \bigg) $$

But I don't know how to approach the second part. How do I find the autocorrelation function $R(\tau)$ of a Gaussian random signal?
Do I find it as $$ R_{xx}(\tau) = \mathrm{E}\Big\{x(t)x(t+\tau)\Big\} $$ and then calculate the integral? Or is the autocorrelation function already known? I could not find a formula for the autocorrelation function anywhere except by using ChatGPT, which I don't really trust.
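(In case it clarifies what I mean by "calculate": below is a rough numerical sketch of how I would estimate $R_{xx}(\tau)$ from samples, assuming a zero-mean, ergodic signal so the expectation can be replaced by a time average. The white-noise test data is just something I made up for illustration.)

```python
# Rough sketch: estimate R_xx(k) = E{x[n] x[n+k]} by time-averaging.
# Assumes a zero-mean, ergodic (stationary) discrete-time signal.
import numpy as np

def autocorr(x, max_lag):
    """Biased time-average estimate of R_xx(k) for k = 0..max_lag."""
    n = len(x)
    return np.array([np.mean(x[:n - k] * x[k:]) for k in range(max_lag + 1)])

# Made-up test data: white Gaussian samples with sigma = 2, so
# R_xx(0) should be about 4 and R_xx(k) about 0 for k != 0.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, size=100_000)
print(autocorr(x, 3))
```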

  • That feels like a nonsensical question to me unless you also constrain the signal to be "white". Are you sure the question is quoted correctly? Commented Apr 24, 2024 at 12:50
  • What is your (or your book's) definition of a Gaussian random signal? Also, it would seem that $E[x(t)x(t-\tau)]$ should depend on both $t$ and $\tau$, and not just on $\tau$ alone. I vote to close this question on the grounds that it is unclear what you are asking. Commented Sep 21, 2024 at 21:26
  • The exercise question is nonsensical in the sense that it is incomplete. If $x(t)$ is a Gaussian random process, then the p.d.f. for the random variable $x(t)$ (for some given time $t$) must be as shown above. I think that the mean $\mu_x$ should be zero; otherwise it's just a zero-mean random process with deterministic DC added to it. The OP's understanding of the autocorrelation of a random process is also correct. It's a legit question, and the answer below answers it with a specific example of a Markov process that can have any allowable autocorrelation. Commented Jan 21 at 18:36

1 Answer


You're missing some necessary information to get the autocorrelation:

$$ R_{xx}(\tau) = \operatorname{E}\Big\{ x(t)x(t+\tau) \Big\} $$

You have

$$ f_{x(t)}(\alpha) = \frac{1}{\sqrt{2 \pi} \sigma_x} \ \ e^{- \frac{(\alpha-\mu_x)^2}{2 \sigma_x ^2} } $$

That's not enough. You also need the joint density of $x(t+\tau)$ and $x(t)$, which you can write in terms of a conditional density (taking $\mu_x = 0$):

$$ f_{x(t+\tau)\,x(t)}(\alpha,\beta) = f_{x(t+\tau)\mid x(t)}\big(\alpha \,\big|\, \beta \big) \cdot f_{x(t)}(\beta)$$

One possibility might be:

$$\begin{align} f_{x(t+\tau)\,x(t)}(\alpha,\beta) &= f_{x(t+\tau)\mid x(t)}\big(\alpha \,\big|\, \beta \big) \cdot f_{x(t)}(\beta) \\ \\ &= \frac{1}{\sqrt{2 \pi \left(\sigma_x^2 - \sigma_x^{-2}R_{xx}^2(\tau)\right)}} \exp\left(-\frac12 \, \frac{\left(\alpha-\beta\,\sigma_x^{-2}R_{xx}(\tau)\right)^2}{\sigma_x^2 - \sigma_x^{-2}R_{xx}^2(\tau)}\right) \ \cdot \ \frac{1}{\sqrt{2 \pi}\,\sigma_x} \exp\left(-\frac12 \left(\frac{\beta}{\sigma_x}\right)^2\right) \\ \end{align}$$
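From such a joint density, the autocorrelation is recovered with the usual double integral (this is just the definition of the expectation restated, not a new result):

$$ R_{xx}(\tau) = \operatorname{E}\Big\{ x(t+\tau)\,x(t) \Big\} = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} \alpha \, \beta \; f_{x(t+\tau)\,x(t)}(\alpha,\beta) \; \mathrm{d}\alpha \, \mathrm{d}\beta $$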

In this answer I show how you go from those dependent probabilities to the autocorrelation function.
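As a concrete illustration (not the exact construction in that linked answer): a discrete-time Gauss-Markov, i.e. AR(1), process is the sampled analogue of the Ornstein-Uhlenbeck example discussed in the comments, and its autocorrelation is $R_{xx}(k) = \sigma_x^2 \, \rho^{|k|}$. A minimal simulation sketch, assuming zero mean and my own choice of the parameters $\sigma_x$ and $\rho$:

```python
# Minimal sketch: a discrete-time Gauss-Markov (AR(1)) process, the sampled
# analogue of the Ornstein-Uhlenbeck example.  Theory: R_xx(k) = sigma^2 * rho^|k|.
import numpy as np

rng = np.random.default_rng(0)

sigma = 1.0      # marginal standard deviation sigma_x (my assumption)
rho = 0.9        # one-step correlation, rho = R_xx(1) / sigma_x^2 (my assumption)
N = 200_000      # number of samples

# Innovation variance chosen so that Var{x[n]} stays sigma^2 in steady state.
w = rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2), size=N)
x = np.empty(N)
x[0] = rng.normal(0.0, sigma)
for n in range(1, N):
    x[n] = rho * x[n - 1] + w[n]   # the Markov step: x[n] given x[n-1] is Gaussian

# Time-average estimate of R_xx(k) = E{x[n] x[n+k]} (zero mean, ergodicity assumed).
def acorr(x, k):
    return np.mean(x[:len(x) - k] * x[k:])

for k in range(5):
    print(f"k={k}:  estimate {acorr(x, k):+.3f}   theory {sigma**2 * rho**k:+.3f}")
```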

  • This answer is just as nonsensical as the OP's question. You are basically sticking the value of $R_{xx}(\tau)$ -- which is what is to be determined -- as a parameter in the equation for the joint pdf, and so when you do indeed do the calculation to find $E[X(t)X(t+\tau)]$ from the joint pdf, you will end up with the tautology $R_{xx}(\tau) = R_{xx}(\tau)$. Commented Jan 19 at 23:18
  • I showed, as examples, two different random processes described with a Gaussian p.d.f. and a conditional p.d.f., making each a Markov process that can have a given autocorrelation and power spectrum. These are white noise shoved through an RC lowpass filter (Ornstein-Uhlenbeck) and a brickwall lowpass filter (bandlimited white noise). I constructed one way you can get a process with finite variance (so it can't be white noise, which has infinite variance) that would result in these two particular autocorrelations and power spectra. Then I showed how these sort of become "white" as $\nu\to\infty$. Commented Jan 20 at 5:29
  • @DilipSarwate do you still think the question is nonsensical? It surely seems to me that the OP is asking for a probabilistic model of a Gaussian random process that will have a particular autocorrelation. Like, how do you construct such a model? I'd be interested in reading how you would answer this question. Commented Jan 21 at 17:55
  • Yes, I still think the question is nonsensical. How do the parameters $\mu$ and $\sigma$ of a Gaussian pdf relate to any parameters of a Gaussian process other than by saying something like "If we assume that $X$ is one of the random variables, say $X_t$, of a Gaussian process, then $\mu_X(t) = E[X_t] = \mu$ and $R_X(t,t) = \sigma^2+\mu^2$. We cannot say anything about the value $\mu_X(s)$ for any $s \neq t$, or the value of $R_X(s,s)$ or $R_X(t,s)$, without making various additional assumptions such as wide-sense stationarity, which are nowhere stated in the problem." Commented Jan 21 at 19:29
  • Listen, in my classes (I had more than one) on random processes, we pretty much never had a non-zero mean in our Gaussian random processes. We did processes other than Gaussian, but Gaussian processes were always zero mean because, if we wanted a mean, we could just add DC. So forget about $\mu$. Now, I am not sure, but I believe any ergodic process is WSS. (Maybe it's the other way around.) But let's toss in WSS anyway, either way. Now the OP says what Gaussian is (but $\mu$ should be zero), and they want to find out how to get a Gaussian random process that has a particular autocorrelation. Commented Jan 22 at 0:57
