
I have some doubts about the following exercise.

Let's consider the signal $X(t)= \operatorname{rect}\left(\frac{t}{2A}\right)$, where $A$ is a discrete random variable taking one of the values $\{2,4,8\}$ with equal probability, and $$\operatorname{rect}\left(\frac{t}{2A}\right)= \begin{cases} 1, & \text{if } -A < t < A \\ 0, & \text{otherwise.} \end{cases}$$ Find:

  1. The pmf of $X(t)$;
  2. The energy spectral density (ESD);
  3. The autocorrelation function.

What I did is:

  1. $p(x;t) = \begin{cases} \operatorname{rect}(\frac{t}{4}) , Pr(A=2)= \frac{1}{3} \newline \operatorname{rect}(\frac{t}{8}) , Pr(A=4) = \frac{1}{3} \newline \operatorname{rect}(\frac{t}{16}) ,Pr(A=8)= \frac{1}{3} \end{cases}$

  2. $\mathcal{E}_x(f)= E[X^2(t)] = \frac{1}{3} [\operatorname{rect}(\frac{t}{4}) + \operatorname{rect}(\frac{t}{8}) + \operatorname{rect}(\frac{t}{16})] $

  3. The autocorrelation function of the random process is $$R_X(t, \tau) = E[X(t)X(t- \tau)] = E\left[\operatorname{rect}\left( \frac{t}{2A}\right) \operatorname{rect}\left(\frac{t- \tau}{2A}\right)\right].$$ How do I go on from here?

Edit: If we reason in terms of events, when is $X(t)=1$? We have that:

  1. if $|t| < 2$, $P(X(t)=1)=P(A=2 \cup A=4 \cup A=8) =1$;

  2. if $2 \leq |t| < 4$, $P(X(t)=1)=P(A=4 \cup A=8) = \frac{2}{3}$;

  3. if $4 \leq |t| < 8$, $P(X(t)=1)=P(A=8) = \frac{1}{3}$;

  4. if $|t| \geq 8$, $P(X(t)=1)= 0$ (a quick numerical check of these four cases is sketched below).

So, if I want the pmf of $X(t)$, can I simply pick the pmf of one of the Bernoulli variables? Can I not characterize $X(t)$ as a whole?
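As a sanity check on these four cases (not part of the exercise itself), here is a quick Monte Carlo estimate of $P(X(t)=1)$ at one test point per interval. Python with NumPy is assumed, and the helper names are my own:

```python
import numpy as np

rng = np.random.default_rng(0)

def x_of_t(t, a):
    # rect(t / (2a)) = 1 on the open interval (-a, a), 0 otherwise
    return 1.0 if -a < t < a else 0.0

# one test point per interval: |t| < 2, 2 <= |t| < 4, 4 <= |t| < 8, |t| >= 8
test_points = [1.0, 3.0, 5.0, 9.0]
a_samples = rng.choice([2, 4, 8], size=200_000)   # A uniform on {2, 4, 8}

for t in test_points:
    p_hat = np.mean([x_of_t(t, a) for a in a_samples])
    print(f"t = {t}: P(X(t)=1) ≈ {p_hat:.3f}")
# expected: roughly 1, 0.667, 0.333 and 0
```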

Regarding the autocorrelation function, $R_X(t,s)= E[X(t)X(s)]$: if $t$ and $s$ are part of the same interval (i.e., one of the sets $B, C, D, E$ defined in the answer below), the autocorrelation function is equal to the mean square value of $X(t)$, while if they are part of different intervals the ACF is equal to the product of the means. Am I correct?

  • If you want to write a single formula for the pmf of $X(t)$ that applies for all $t$, go for it. It will be a mess to read and understand, and I refuse to help you in the least in constructing this monstrosity. It would probably help the reader if you defined what the sets $B, C, D, E$ are in your question itself. You are not correct in what you ask in the last sentence of your question. Read my answer very carefully; it specifies exactly what the ACF is, and it is not the product of the means in general. Commented May 11, 2023 at 3:08
  • Are you sure that $X(t)$ can be a Bernoulli random variable? $X(t)$ is a continuous random variable, while a Bernoulli one is discrete. Commented May 11, 2023 at 11:34
  • Yes, I am sure that for each and every choice of real number $t$, $X(t)$ is a discrete random variable; in fact a Bernoulli random variable. It is $t$ that is a continuous variable. Your question calls the whole of DSP into question because you seem to be insisting that all functions of continuous variable $t$ are themselves necessarily continuous functions, thus denying the existence of things like the rect function which you yourself are using, and square waves etc. Commented May 11, 2023 at 13:32
  • If two random variables are independent, isn't their correlation function equal to the product of their means? If $t,s$ are part of different intervals, $X(t)$ and $X(s)$ should be independent. So we also have that the ACF is equal to $\frac{2}{9}$, no? Commented May 16, 2023 at 15:48
  • Please define exactly what you mean by correlation, because with the commonly accepted meaning of the word "correlation" with regard to random variables, it is not true that the correlation of independent random variables equals the product of the means: independent random variables are uncorrelated, meaning that their correlation is $0$ regardless of what their means are. Also, $X(t)$ and $X(s)$ are not independent, even if $t$ and $s$ belong to different intervals. The second part of my answer spent quite some time explaining why. Please read the answer carefully. Commented May 16, 2023 at 18:15

1 Answer


The OP's working is incorrect, and proceeding further from what is included in the OP's question won't help in the least.

A random process is a collection of random variables all having a family name such as $X$ and being identified within the family by subscripts such as $X_t$ or arguments such as $X(t)$. In the OP's (homework?) problem, the (infinitely many) random variables $X(t)$ are all discrete random variables, in fact, Bernoulli random variables with different parameters. Since $A$ is a discrete random variable taking on values $2,4,8$ with equal probability $\frac 13$, $X(t)= \operatorname{rect}\left(\frac{t}{2A}\right)$ has value $1$ for $t \in (-2,2)$, for $t \in (-4,4)$, or for $t\in (-8,8)$ according as $A=2$, $4$, or $8$, each with probability $\frac 13$; and $X(t)$ has value $0$ if $|t| \geq 8.$ Let's look at this information a little more closely.

Define sets $B, C, D, E$ of real numbers as follows: \begin{align} B &= \{t\colon |t| \geq 8\},\\ C &= \{t\colon 4 \leq |t| < 8\},\\ D &= \{t\colon 2 \leq |t| < 4\},\\ E &= \{t\colon |t| < 2\} \end{align} and note that every real number belongs to one and only one of the sets $B,C,D,E$. Corresponding to these sets of real numbers, define real numbers (parameters) $p_B, p_C, p_D, p_E$ as $0,\frac 13,\frac 23, 1$ respectively.

Now, suppose that $t$ is a real number in set $C$. What can we say about $X(t)$? Well, clearly, $X(t) = \operatorname{rect}\left(\frac{t}{2A}\right)$ has value $1$ if and only if $A=8$ which event has probability $\frac 13$ of occurring. We conclude that

For all $t \in C$, that is, for all $t$ such that $4 \leq |t| < 8$, $X(t)$ is a Bernoulli random variable with parameter $p_C = \frac 13$.

Next, suppose that $t$ is a real number in set $D$. What can we say about $X(t)$? Well, clearly, $X(t) = \operatorname{rect}\left(\frac{t}{2A}\right)$ has value $1$ if and only if $A=8$ or $A=4$ which event has probability $\frac 23$ of occurring. We conclude that

For all $t \in D$, that is, for all $t$ such that $2 \leq |t| < 4$, $X(t)$ is a Bernoulli random variable with parameter $p_D = \frac 23$.

If $t \in E$, that is, $|t| < 2$, then $X(t)$ has value $1$ regardless of what value $A$ has; it is a constant! But we fit this into the Procrustean bed of Bernoulli random variables by saying

For all $t \in E$, that is, for all $t$ such that $|t| < 2$, $X(t)$ is a Bernoulli random variable with parameter $p_E = 1$.

Similarly, if $t \in B$, that is, $|t| \geq 8$, then $X(t)$ has value $0$ regardless of what value $A$ has. We fit this also into our Procrustean bed of Bernoulli random variables by saying

For all $t \in B$, that is, for all $t$ such that $|t| \geq 8$, $X(t)$ is a Bernoulli random variable with parameter $p_B = 0$.

Summarizing, $p_X(x;t)$, the pmf of $X(t)$, is the pmf of a Bernoulli random variable with parameter respectively $p_B, p_C, p_D,p_E$ (a.k.a. $0,\frac 13,\frac 23, 1$) according as $t$ is in sets $B, C, D, E$. Note that each of the four possible pmfs for $X(t)$ is a valid pmf: the values of $P(X(t)= 1)$ and $P(X(t)=0)$ sum to $1$ ($0+1=1, \frac 23 + \frac 13 = 1, \frac 13 + \frac 23 = 1, 1+0 = 1$ as the case may be) but I cheerfully admit that $0,\frac 13,\frac 23, 1$ don't add up to $1$. Why should they? Those numbers are just the four possible parameters of the Bernoulli pmf, not the pmf values!
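
If it helps to see the whole pmf packaged as one object, here is a minimal sketch (Python assumed; the function names are hypothetical, not from any library) of the Bernoulli parameter as a function of $t$ and the resulting pmf, including the normalization check just described:

```python
def bernoulli_param(t):
    """P(X(t) = 1), i.e. the Bernoulli parameter on the sets B, C, D, E."""
    if abs(t) >= 8:          # set B
        return 0.0
    elif abs(t) >= 4:        # set C
        return 1.0 / 3.0
    elif abs(t) >= 2:        # set D
        return 2.0 / 3.0
    else:                    # set E
        return 1.0

def pmf(x, t):
    """p_X(x; t) for x in {0, 1}."""
    p = bernoulli_param(t)
    return p if x == 1 else 1.0 - p

for t in (1.0, 3.0, 5.0, 9.0):           # one t from each of E, D, C, B
    print(t, bernoulli_param(t), pmf(0, t) + pmf(1, t))
# each pmf sums to 1; it is the four parameters 0, 1/3, 2/3, 1 that need not
```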

It is worth emphasizing that the random variables $X(t)$ are very dependent, in fact, identical over various intervals. For example, all $X(t)$ for $t \in C$ must take on the same value ($0$ or $1$ as the case may be); it is not possible for, say, $X(5.1)$ to have value $1$ while $X(6)$ has value $0$. Even worse, if the random variables $\{X(t) \colon 4 \leq |t| < 8\}$ all have value $1$, then the random variables $\{X(t)\colon 2 \leq |t| < 4\}$ also must all have value $1$, while if $\{X(t)\colon 4 \leq |t| < 8\}$ all have value $0$, then $\{X(t)\colon 2 \leq |t| < 4\}$ could have value either $0$ or $1$. Thus, any $X(t\colon t \in C)$ and any $X(t\colon t \in D)$ are not independent random variables.
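
To see this at the level of sample paths: a single draw of $A$ fixes the entire waveform at once, so the process has only three possible realizations. A minimal sketch (Python/NumPy assumed, names my own):

```python
import numpy as np

def sample_path(a, t_grid):
    # realization of X(t) = rect(t / (2a)): 1 on (-a, a), 0 elsewhere
    return ((t_grid > -a) & (t_grid < a)).astype(int)

t_grid = np.array([-9.0, -6.0, -3.0, -1.0, 1.0, 3.0, 6.0, 9.0])
for a in (2, 4, 8):            # each realization occurs with probability 1/3
    print(f"A = {a}:", sample_path(a, t_grid))
# in every realization, all t in C (e.g. t = 6 and t = -6) share one value,
# and a 1 on C forces a 1 on D, never the other way around
```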

Now, since all the $X(t\colon t \in E)$ are degenerate random variables that take on value $1$ with probability $1$ (and thus are identical too), it must be admitted that they also are independent random variables, since for $t_1, t_2 \in E$, $$P(X(t_1)=1,X(t_2)=1)= 1 = P(X(t_1)=1)P(X(t_2)=1).$$ Heck, since we haven't excluded the possibility that $t_1=t_2$, $X(t_1)$ is independent of itself too! Similarly, though the $X(t\colon t \in B)$ are degenerate random variables that take on value $0$ with probability $1$ (and thus are identical too), it must be admitted that they also are independent random variables, since for $t_1, t_2 \in B$, $$P(X(t_1)=0,X(t_2)=0)= 1 = P(X(t_1)=0)P(X(t_2)=0).$$ Similar arguments can be made about $X(t\colon t \in E)$ and $X(t\colon t \in B\cup C\cup D)$, and about $X(t\colon t \in B)$ and $X(t\colon t \in C\cup D \cup E)$, but it is not possible to claim (as the OP wishes to do) that in all cases, random variables in different intervals are independent. The counterexample is provided in the previous paragraph: any $X(t\colon t \in C)$ and any $X(t\colon t \in D)$ are not independent random variables.
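
The counterexample is also easy to verify numerically. A small sketch (Python assumed, names my own) that computes the exact probabilities by enumerating the three values of $A$ for $t \in C$ and $s \in D$:

```python
from fractions import Fraction

A_VALUES = (2, 4, 8)          # each taken with probability 1/3
P_A = Fraction(1, 3)

def x_of_t(t, a):
    # rect(t / (2a)): 1 on (-a, a), 0 otherwise
    return 1 if -a < t < a else 0

t, s = 5.0, 3.0               # t in C, s in D

p_t  = sum(P_A for a in A_VALUES if x_of_t(t, a))
p_s  = sum(P_A for a in A_VALUES if x_of_t(s, a))
p_ts = sum(P_A for a in A_VALUES if x_of_t(t, a) and x_of_t(s, a))

print("P(X(t)=1)          =", p_t)     # 1/3
print("P(X(s)=1)          =", p_s)     # 2/3
print("P(X(t)=1, X(s)=1)  =", p_ts)    # 1/3, not (1/3)(2/3) = 2/9
```

The joint probability $\frac 13$ differs from the product $\frac 13 \cdot \frac 23 = \frac 29$, so these two random variables are indeed dependent.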


Turning to the autocorrelation function $R_X(t,s) = \mathbb E[X(t)X(s)]$, we need to evaluate this expectation for all pairs of real numbers $(t,s)$ where $t$ and $s$ each belong to $B$ or $C$ or $D$ or $E$. There are $16$ possible cases to consider. Let's denote by $G$ the set ($B$ or $C$ or $D$ or $E$) to which $t$ belongs, and by $H$ the set ($B$ or $C$ or $D$ or $E$) to which $s$ belongs.

  • If either $G$ or $H$ is $B$ (possibly both are $B$), then at least one of $X(t)$ and $X(s)$ is $0$ and so $R_X(t,s) = \mathbb E[X(t)X(s)] = \mathbb E[0] = 0$. Note that this is saying that $R_X(t,s)$ has value $0$ whenever at least one of $t$ and $s$ has magnitude $8$ or more, that is, the point $(t,s)$ lies on or outside the boundary of the $16\times 16$ square centered at the origin (and sides parallel to the axes). Thus, the nonzero values of $R_X(t,s)$ occur only in the interior of this $16\times 16$ square. Note also that one way of specifying the value of $R_X(t,s)$ in this region is $R_X(t,s) = \min(p_G, p_H)$ since at least one of $G$ and $H$ is $B$ and so at least one of $p_G, p_H$ is $0$.
  • If both $G$ and $H$ are $E$, then both $X(t)$ and $X(s)$ are $1$ and so $$R_X(t,s) = \mathbb E[1] = 1 = \min(p_G, p_H),$$ Thus, $R_X(t,s) = 1$ if $(t,s)$ lies in the interior of the $4\times 4$ square centered to the origin.
  • Next, consider the hollow-square region consisting of all points $(t,s)$ lying in the interior of the $8\times 8$ square but on or outside the boundary of the $4\times 4$ square. Here, at least one of $G$ and $H$ is $D$ (the other can be $D$ or $E$). If the other is $E$, then one of $X(t), X(s)$ is identically $1$ and so $\mathbb E[X(t)X(s)]= p_D$. If both $G$ and $H$ are $D$, then $X(t)X(s)$ has value $1$ with probability $p_D$. Thus, in this hollow-square region also, we have that $$\mathbb E[X(t)X(s)]= p_D = \frac 23 = \min(p_G, p_H).$$
  • Finally, consider the hollow-square region consisting of all points $(t,s)$ lying in the interior of the $16\times 16$ square but on or outside the boundary of the $8\times 8$ square. Here, at least one of $G$ and $H$ is $C$ (the other can be $C$ or $D$ or $E$). Using the same type of arguments as above, we get that $$\mathbb E[X(t)X(s)]= p_C = \frac 13 = \min(p_G, p_H).$$

In summary, $R_X(t,s)$ is nonzero only in the interior of the $16\times 16$ square with sides parallel to the axes and center the origin.
In the $4\times 4$ central square, $R_X(t,s)$ has constant value $1$.
In the hollow-square region consisting of the $8\times 8$ square less the $4\times 4$ square, $R_X(t,s)$ has constant value $\frac 23$.
In the hollow-square region consisting of the $16\times 16$ square less the $8\times 8$ square, $R_X(t,s)$ has constant value $\frac 13$.
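
In other words, over the whole $(t,s)$ plane the ACF can be written compactly as $R_X(t,s) = \min\bigl(p(t), p(s)\bigr)$, where $p(\cdot)$ is the Bernoulli parameter from the pmf discussion. Here is a short sketch (Python assumed; function names my own) that cross-checks this closed form against a direct enumeration over $A$:

```python
from fractions import Fraction

def p(t):
    # Bernoulli parameter P(X(t) = 1) on the sets B, C, D, E
    if abs(t) >= 8:
        return Fraction(0)
    elif abs(t) >= 4:
        return Fraction(1, 3)
    elif abs(t) >= 2:
        return Fraction(2, 3)
    else:
        return Fraction(1)

def acf_closed_form(t, s):
    return min(p(t), p(s))

def acf_by_enumeration(t, s):
    # E[X(t) X(s)] = sum over a of P(A = a) * x(t; a) * x(s; a)
    x = lambda u, a: 1 if -a < u < a else 0
    return sum(Fraction(1, 3) * x(t, a) * x(s, a) for a in (2, 4, 8))

for t, s in [(1, 1), (1, 3), (3, 5), (5, 7), (1, 9), (5, 9)]:
    assert acf_closed_form(t, s) == acf_by_enumeration(t, s)
    print(f"R_X({t},{s}) = {acf_closed_form(t, s)}")
```

Both routes agree: constant values $1$, $\frac 23$, $\frac 13$ in the three nested regions, and $0$ outside the $16\times 16$ square.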

  • Hello, thank you for your answer. I understand the first half of your answer but not the second half. The pmf of a discrete random variable assigns probability masses that sum to $1$, right? Commented May 9, 2023 at 10:21
  • Sorry about the bad notation. I have revised the notation and the answer thoroughly and hope that it makes more sense now. I will complete the autocorrelation stuff later today. Commented May 9, 2023 at 17:56
