The OP's working is incorrect, and proceeding further from what is included in the OP's question won't help in the least.
A random process is a collection of random variables all having a family name such as $X$ and being identified within the family by subscripts such as $X_t$ or arguments such as $X(t)$. In the OP's (homework?) problem, the (infinitely many) random variables $X(t)$ all are discrete random variables, in fact, Bernoulli random variables with different parameters. Since $A$ is a discrete random variable taking on values $2,4,8$ with equal probability $\frac 13$, $X(t)= \operatorname{rect}\left(\frac{t}{2A}\right)$ has value $1$ for $t \in (-2,2)$, for $t \in (-4,4)$, or for $t\in (-8,8)$ according as $A = 2$, $4$, or $8$, each case occurring with probability $\frac 13$; and in every case $X(t)$ has value $0$ if $|t| \geq 8.$ Let's look at this information a little more closely.
Define sets $B, C, D, E$ of real numbers as follows: \begin{align} B &= \{t\colon |t| \geq 8\},\\ C &= \{t\colon 4 \leq |t| < 8\},\\ D &= \{t\colon 2 \leq |t| < 4\},\\ E &= \{t\colon |t| < 2\}, \end{align} and note that every real number belongs to one and only one of the sets $B,C,D,E$. Corresponding to these sets of real numbers, define real numbers (parameters) $p_B, p_C, p_D, p_E$ as $0,\frac 13,\frac 23, 1$ respectively.
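If it helps to see this bookkeeping concretely, here is a minimal Python sketch (not part of the problem statement; the helper names `rect`, `X`, and `bernoulli_parameter` are my own) that evaluates a sample function of the process for a given value of $A$ and computes $P(X(t)=1)$ by enumerating the three equally likely values of $A$:

```python
A_VALUES = (2, 4, 8)          # the three equally likely values of A

def rect(x):
    """Unit rect pulse: 1 for |x| < 1/2, 0 for |x| > 1/2 (the value at |x| = 1/2 is immaterial here)."""
    return 1 if abs(x) < 0.5 else 0

def X(t, a):
    """One sample function of the process: X(t) = rect(t / (2A)) for the realization A = a."""
    return rect(t / (2 * a))  # equals 1 exactly when |t| < a

def bernoulli_parameter(t):
    """P(X(t) = 1): the fraction of the equally likely values of A for which |t| < A."""
    return sum(X(t, a) for a in A_VALUES) / len(A_VALUES)

# bernoulli_parameter(1.0) -> 1.0   (t in E)
# bernoulli_parameter(3.0) -> 2/3   (t in D)
# bernoulli_parameter(5.0) -> 1/3   (t in C)
# bernoulli_parameter(9.0) -> 0.0   (t in B)
```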
Now, suppose that $t$ is a real number in set $C$. What can we say about $X(t)$? Well, clearly, $X(t) = \operatorname{rect}\left(\frac{t}{2A}\right)$ has value $1$ if and only if $A=8$, an event which has probability $\frac 13$ of occurring. We conclude that
For all $t \in C$, that is, for all $t$ such that $4 \leq |t| < 8$, $X(t)$ is a Bernoulli random variable with parameter $p_C = \frac 13$.
Next, suppose that $t$ is a real number in set $D$. What can we say about $X(t)$? Well, clearly, $X(t) = \operatorname{rect}\left(\frac{t}{2A}\right)$ has value $1$ if and only if $A=8$ or $A=4$, an event which has probability $\frac 23$ of occurring. We conclude that
For all $t \in D$, that is, for all $t$ such that $2 \leq |t| < 4$, $X(t)$ is a Bernoulli random variable with parameter $p_D = \frac 23$.
If $t \in E$, that is, $|t| < 2$, then $X(t)$ has value $1$ regardless of what value $A$ has; it is a constant! But we fit this into the Procrustean bed of Bernoulli random variables by saying
For all $t \in E$, that is, for all $t$ such that $|t| < 2$, $X(t)$ is a Bernoulli random variable with parameter $p_E = 1$.
Similarly, if $t \in B$, that is, $|t| \geq 8$, then $X(t)$ has value $0$ regardless of what value $A$ has. We fit this also into our Procrustean bed of Bernoulli random variables by saying
For all $t \in B$, that is, for all $t$ such that $|t| \geq 8$, $X(t)$ is a Bernoulli random variable with parameter $p_B = 0$.
Summarizing, $p_X(x;t)$, the pmf of $X(t)$, is the pmf of a Bernoulli random variable with parameter respectively $p_B, p_C, p_D,p_E$ (a.k.a. $0,\frac 13,\frac 23, 1$) according as $t$ is in sets $B, C, D, E$. Note that each of the four possible pmfs for $X(t)$ is a valid pmf: the values of $P(X(t)= 1)$ and $P(X(t)=0)$ sum to $1$ ($0+1=1, \frac 23 + \frac 13 = 1, \frac 13 + \frac 23 = 1, 1+0 = 1$ as the case may be) but I cheerfully admit that $0,\frac 13,\frac 23, 1$ don't add up to $1$. Why should they? Those numbers are just the four possible parameters of the Bernoulli pmf, not the pmf values!
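For readers who like to double-check such claims numerically, here is a short Monte Carlo sketch (again my own Python, not anything from the problem; `estimated_p` is an illustrative name) that draws $A$ uniformly from $\{2,4,8\}$ and estimates $P(X(t)=1)$ at one representative $t$ from each of the sets $E, D, C, B$:

```python
import random

A_VALUES = (2, 4, 8)                              # the three equally likely values of A
X = lambda t, a: 1 if abs(t) < a else 0           # rect(t/(2a)) simplifies to this indicator

def estimated_p(t, trials=200_000, seed=1):
    """Monte Carlo estimate of P(X(t) = 1), drawing A uniformly from {2, 4, 8}."""
    rng = random.Random(seed)
    return sum(X(t, rng.choice(A_VALUES)) for _ in range(trials)) / trials

for t in (0.5, 3.0, 5.0, 10.0):                   # representatives of E, D, C, B
    print(f"t = {t:>4}:  estimated P(X(t)=1) = {estimated_p(t):.3f}")
# expected values (the Bernoulli parameters): 1, 2/3, 1/3, 0
```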
It is worth emphasizing that the random variables $X(t)$ are very dependent, in fact, identical over various intervals. For example, all the $X(t)$ for $t \in C$ must take on the same value ($0$ or $1$ as the case may be); it is not possible for, say, $X(5.1)$ to have value $1$ while $X(6)$ has value $0$. Even worse, if the random variables $\{X(t) \colon 4 \leq |t| < 8\}$ all have value $1$, then the random variables $\{X(t)\colon 2 \leq |t| < 4\}$ also must all have value $1$, while if $\{X(t)\colon 4 \leq |t| < 8\}$ all have value $0$, then $\{X(t)\colon 2 \leq |t| < 4\}$ could all have value $0$ or all have value $1$. Thus, no $X(t)$ with $t \in C$ is independent of any $X(s)$ with $s \in D$.
Now, since all the $X(t)$ with $t \in E$ are degenerate random variables that take on value $1$ with probability $1$ (and thus are identical too), it must be admitted that they also are independent random variables, since for $t_1, t_2 \in E$, $$P(X(t_1)=1,X(t_2)=1)= 1 = P(X(t_1)=1)P(X(t_2)=1).$$ Heck, since we haven't excluded the possibility that $t_1=t_2$, $X(t_1)$ is independent of itself too! Similarly, though the $X(t)$ with $t \in B$ are degenerate random variables that take on value $0$ with probability $1$ (and thus are identical too), it must be admitted that they also are independent random variables, since for $t_1, t_2 \in B$, $$P(X(t_1)=0,X(t_2)=0)= 1 = P(X(t_1)=0)P(X(t_2)=0).$$ Similar arguments can be made about the $X(t)$ with $t \in E$ versus those with $t \in B\cup C\cup D$, and about the $X(t)$ with $t \in B$ versus those with $t \in C\cup D \cup E$, but it is not possible to claim (as the OP wishes to do) that in all cases, random variables in different intervals are independent. The counterexample is provided in the previous paragraph: no $X(t)$ with $t \in C$ is independent of any $X(s)$ with $s \in D$.
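The dependence asserted above is easy to check by brute force: the sketch below (my own Python; `prob` is just an illustrative helper) enumerates the three values of $A$ exactly and compares the joint probability $P(X(5.1)=1, X(3)=1)$ with the product of the marginal probabilities.

```python
A_VALUES = (2, 4, 8)                               # each value has probability 1/3
X = lambda t, a: 1 if abs(t) < a else 0            # X(t) = rect(t/(2A)) evaluated at A = a

def prob(event):
    """Exact probability of an event that depends only on A, by averaging over the three values of A."""
    return sum(event(a) for a in A_VALUES) / 3

t, s = 5.1, 3.0                                    # t in C, s in D
p_joint = prob(lambda a: X(t, a) == 1 and X(s, a) == 1)   # only A = 8 works, so 1/3
p_t = prob(lambda a: X(t, a) == 1)                        # 1/3
p_s = prob(lambda a: X(s, a) == 1)                        # 2/3
print(p_joint, p_t * p_s)   # 0.333... versus 0.222..., so X(5.1) and X(3.0) are not independent
```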
Turning to the autocorrelation function $R_X(t,s) = \mathbb E[X(t)X(s)]$, we need to evaluate this expectation for all pairs of real numbers $(t,s)$ where $t$ and $s$ each belong to $B$ or $C$ or $D$ or $E$. There are $16$ possible cases to consider. Let's denote by $G$ the set ($B$ or $C$ or $D$ or $E$) to which $t$ belongs, and by $H$ the set ($B$ or $C$ or $D$ or $E$) to which $s$ belongs.
- If either $G$ or $H$ is $B$ (possibly both are $B$), then at least one of $X(t)$ and $X(s)$ is $0$ and so $R_X(t,s) = \mathbb E[X(t)X(s)] = \mathbb E[0] = 0$. Note that this is saying that $R_X(t,s)$ has value $0$ whenever at least one of $t$ and $s$ has magnitude $8$ or more, that is, the point $(t,s)$ lies on or outside the boundary of the $16\times 16$ square centered at the origin (with sides parallel to the axes). Thus, the nonzero values of $R_X(t,s)$ occur only in the interior of this $16\times 16$ square. Note also that one way of specifying the value of $R_X(t,s)$ in this region is $R_X(t,s) = \min(p_G, p_H)$ since at least one of $G$ and $H$ is $B$ and so at least one of $p_G, p_H$ is $0$.
- If both $G$ and $H$ are $E$, then both $X(t)$ and $X(s)$ are $1$ and so $$R_X(t,s) = \mathbb E[1] = 1 = \min(p_G, p_H).$$ Thus, $R_X(t,s) = 1$ if $(t,s)$ lies in the interior of the $4\times 4$ square centered at the origin.
- Next, consider the hollow-square region consisting of all points $(t,s)$ lying in the interior of the $8\times 8$ square but on or outside the boundary of the $4\times 4$ square. Here, at least one of $G$ and $H$ is $D$ (the other can be $D$ or $E$). If the other is $E$, then one of $X(t), X(s)$ is identically $1$ and the other is a Bernoulli random variable with parameter $p_D$, and so $\mathbb E[X(t)X(s)]= p_D$. If both $G$ and $H$ are $D$, then $X(t)X(s)$ has value $1$ with probability $p_D$. Thus, in this hollow-square region also, we have that $$\mathbb E[X(t)X(s)]= p_D = \frac 23 = \min(p_G, p_H).$$
- Finally, consider the hollow-square region consisting of all points $(t,s)$ lying in the interior of the $16\times 16$ square but on or outside the boundary of the $8\times 8$ square. Here, at least one of $G$ and $H$ is $C$ (the other can be $C$ or $D$ or $E$). Using the same type of arguments as above, we get that $$\mathbb E[X(t)X(s)]= p_C = \frac 13 = \min(p_G, p_H).$$
In summary, $R_X(t,s)$ is nonzero only in the interior of the $16\times 16$ square with sides parallel to the axes and center the origin.
In the $4\times 4$ central square, $R_X(t,s)$ has constant value $1$.
In the hollow-square region consisting of the $8\times 8$ square less the $4\times 4$ square, $R_X(t,s)$ has constant value $\frac 23$.
In the hollow-square region consisting of the $16\times 16$ square less the $8\times 8$ square, $R_X(t,s)$ has constant value $\frac 13$.
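As a final sanity check, the compact description $R_X(t,s) = \min(p_G, p_H)$ can be verified by direct enumeration over the three values of $A$; the short Python sketch below (my own, with illustrative names `R` and `p`) does exactly that on a grid of points covering all four regions.

```python
A_VALUES = (2, 4, 8)                               # each value has probability 1/3
X = lambda t, a: 1 if abs(t) < a else 0            # X(t) = rect(t/(2A)) evaluated at A = a

def R(t, s):
    """Exact autocorrelation E[X(t) X(s)], averaging over the three equally likely values of A."""
    return sum(X(t, a) * X(s, a) for a in A_VALUES) / 3

def p(t):
    """The Bernoulli parameter P(X(t) = 1)."""
    return sum(X(t, a) for a in A_VALUES) / 3

# spot-check R(t, s) == min(p(t), p(s)) on a grid of points covering all four regions
grid = [0.0, 1.5, -3.0, 3.9, 5.0, -7.9, 8.0, 9.0]
assert all(R(t, s) == min(p(t), p(s)) for t in grid for s in grid)
print(R(1.0, 1.0), R(3.0, 1.0), R(5.0, -3.0), R(9.0, 0.0))   # 1.0  0.666...  0.333...  0.0
```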