The following Berry-Esseen theorem was obtained by Stein's method:
Theorem (Chaidee and Keammanee, 2008, Theorem 2.1). Let $X_1, X_2, \dots$ be independent, identically distributed random variables with $\mathbb{E}X_i = 0$, $\mathbb{E}X_i^2 = \sigma^2$, and $\mathbb{E}\lvert X_i \rvert^3 = \gamma < \infty$. Let $N$ be a non-negative integer-valued random variable with $\mathbb{E} N^{3/2} < \infty$ such that $N, X_1, X_2, \dots$ are independent. Define $$Y_i = \frac{X_i}{\sqrt{\mathbb{E}N}\, \sigma}$$ and $$W = Y_1 + Y_2 + \dots + Y_N.$$ Then $$\sup_{x \in \mathbb{R}} \lvert \mathbb{P}(W \leq x) - \Phi(x) \rvert \leq 10.32\delta + \frac{0.125}{\sqrt{\mathbb{E}N}} + \frac{1.5\delta\,\mathbb{E}N^{3/2}}{(\mathbb{E}N)^{3/2}} + \frac{\mathbb{E}|N - \mathbb{E}N|}{\mathbb{E}N}, \tag{1}\label{equation:chaidee}$$ where $\delta = \dfrac{\gamma}{\sqrt{\mathbb{E}N}\sigma^3}$.
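As a sanity check on the constants (my own specialization, not from the paper): take $N = n$ deterministic, so $\mathbb{E}N^{3/2} = n^{3/2}$ and $\mathbb{E}|N - \mathbb{E}N| = 0$, and take $X_i$ to be Rademacher signs, so $\sigma = \gamma = 1$ and $\delta = n^{-1/2}$. The bound (\ref{equation:chaidee}) then collapses to the classical Berry-Esseen rate:
$$\sup_{x \in \mathbb{R}} \lvert \mathbb{P}(W \leq x) - \Phi(x) \rvert \leq \frac{10.32}{\sqrt{n}} + \frac{0.125}{\sqrt{n}} + \frac{1.5}{\sqrt{n}} + 0 = \frac{11.945}{\sqrt{n}}.$$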
Here $\Phi(x)$ is the cumulative distribution function of the standard normal distribution (mean 0, variance 1).
I am trying to reconcile it with the following result of Robbins (1948):
Theorem (Robbins, 1948, Corollary 4). Let $\{U_i, i \geq 1\}$ be independent, identically-distributed random variables with $\mathbb{E}U_i = \mu$, $\text{Var}(U_i) = \sigma^2$. Let $\{N_n, n \geq 1\}$ be a sequence of non-negative integer-valued random variables that are independent of $\{U_i, i \geq 1\}$. Assume that $\mathbb{E}(N_n^2) < \infty$ for all $n$ and $$ \frac{N_n - \mathbb{E}(N_n)}{\sqrt{\text{Var}(N_n)}} \ \text{converges in distribution to} \ \mathcal{N}(0,1) $$ as $n \to \infty$. Then $$ \frac{\sum_{i=1}^{N_n} U_i - \mu\mathbb{E}(N_n)}{\sqrt{\sigma^2\mathbb{E}(N_n) + \mu^2 \text{Var}(N_n)}} \ \text{converges in distribution to} \ \mathcal{N}(0,1) \tag{2}\label{equation:robbins} $$ as $n \to \infty$.
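For a concrete instance of Robbins' hypotheses (my own example, chosen for illustration): take $N_n \sim \mathrm{Poisson}(n)$, so that $\mathbb{E}(N_n) = \text{Var}(N_n) = n$, $\mathbb{E}(N_n^2) = n + n^2 < \infty$, and $(N_n - n)/\sqrt{n}$ converges in distribution to $\mathcal{N}(0,1)$ by the ordinary central limit theorem (a Poisson$(n)$ variable is a sum of $n$ i.i.d. Poisson$(1)$ variables). Then (\ref{equation:robbins}) becomes $$ \frac{\sum_{i=1}^{N_n} U_i - \mu n}{\sqrt{(\sigma^2 + \mu^2)\, n}} \ \text{converges in distribution to} \ \mathcal{N}(0,1). $$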
Again, $\mathcal{N}(0,1)$ denotes the normal distribution with mean 0, variance 1.
Note that if $T_N = X_1 + \dots + X_N$, with the centered $X_i$ of the first theorem, then $\mathbb{E}[T_N \mid N] = N\,\mathbb{E}X_1 = 0$, so by the law of total variance $$\begin{align} \text{Var}(T_N) &= \mathbb{E}[\text{Var}(T_N|N)] + \text{Var}(\mathbb{E}[T_N|N]) \\ &= \mathbb{E}[N\sigma^2] + \text{Var}(0) \\ &= \sigma^2\mathbb{E}(N), \end{align}$$ and hence $W = T_N/\sqrt{\text{Var}(T_N)}$.
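A quick Monte Carlo check of this variance identity (my own sketch; the distributions are chosen arbitrarily, and I simulate $T_N$ through the conditional law $T_N \mid N \sim \mathcal{N}(0, N\sigma^2)$, which is valid when the $X_i$ are normal):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, sigma, trials = 20.0, 2.0, 200_000

# N ~ Poisson(lam), independent of the X_i, so E[N] = lam
N = rng.poisson(lam, size=trials)
# Given N = n, T_N = X_1 + ... + X_N ~ Normal(0, n * sigma^2),
# so one draw of T_N is sigma * sqrt(N) * Z with Z standard normal
T = rng.standard_normal(trials) * sigma * np.sqrt(N)

print(np.var(T))       # empirical Var(T_N), approx 80
print(sigma**2 * lam)  # sigma^2 * E[N] = 80.0
```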
Let $X_i = U_i - \mu$, where $\mathbb{E}(U_i) = \mu$. Note $\text{Var}(U_i) = \text{Var}(X_i + \mu) = \text{Var}(X_i) = \sigma^2$. Moreover, if $S_N = U_1 + U_2 + \dots + U_N$, then $\mathbb{E}[S_N \mid N] = \mu N$, so the same law-of-total-variance argument gives $$\begin{align} \mathbb{E}(S_N) &= \mu \mathbb{E}(N) \\ \text{Var}(S_N) &= \mathbb{E}[N\sigma^2] + \text{Var}(\mu N) = \sigma^2\mathbb{E}(N) + \mu^2 \text{Var}(N), \end{align}$$ so equation (\ref{equation:robbins}) says that $$ \frac{S_{N_n} - \mathbb{E}(S_{N_n})}{\sqrt{\text{Var}(S_{N_n})}} \ \text{converges in distribution to} \ \mathcal{N}(0,1). $$
Meanwhile, since $S_N - N\mu = T_N$ and $W = T_N/\sqrt{\text{Var}(T_N)}$, equation (\ref{equation:chaidee}) reads $$ \sup_{x \in \mathbb{R}} \left\lvert \mathbb{P}\left(\frac{S_N - N\mu}{\sqrt{\text{Var}(S_N - N\mu)}} \leq x \right) - \Phi(x) \right\rvert \leq \dots $$
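Note that the two statistics displayed above are not literally the same: (\ref{equation:chaidee}) centers by the random quantity $N\mu$ and normalizes by $\sqrt{\sigma^2\mathbb{E}(N)}$, while (\ref{equation:robbins}) centers by the deterministic $\mu\mathbb{E}(N_n)$ and normalizes by $\sqrt{\sigma^2\mathbb{E}(N_n) + \mu^2\text{Var}(N_n)}$. To see the two side by side numerically (a sketch with arbitrarily chosen distributions, again using the conditional-normal shortcut $S_N \mid N \sim \mathcal{N}(N\mu, N\sigma^2)$; this is not an answer to the question below):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma, lam, trials = 1.0, 1.0, 100.0, 100_000

# N ~ Poisson(lam): E[N] = Var(N) = lam
N = rng.poisson(lam, size=trials)
# Given N = n, S_N = U_1 + ... + U_N ~ Normal(n * mu, n * sigma^2)
S = N * mu + rng.standard_normal(trials) * sigma * np.sqrt(N)

# Chaidee-Keammanee statistic: random centering by N * mu
W = (S - N * mu) / np.sqrt(sigma**2 * lam)
# Robbins statistic: deterministic centering by mu * E[N]
R = (S - mu * lam) / np.sqrt(sigma**2 * lam + mu**2 * lam)

# Kolmogorov distance of each statistic from the standard normal
print(stats.kstest(W, "norm").statistic)
print(stats.kstest(R, "norm").statistic)
```

For this choice both Kolmogorov distances come out small, consistent with both normalizations being asymptotically standard normal, even though the statistics themselves differ.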
However, I cannot see how to apply the additional conditions of Robbins (1948) to close the equivalence between the two statements. What am I missing?