
In a previous post I asked for help clarifying a property of stable convergence in distribution:

Definition

Let $X_n$ be a sequence of random variables defined on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ with value in $\mathbb{R}^N$. We say that the sequence $X_n$ converges stably in distribution with limit $X$, written $X_n\stackrel{\text{st}}{\longrightarrow} X$, if and only if, for any bounded continuous function $f:\mathbb{R}^N\to\mathbb{R}$ and for any $\mathcal{F}$-measurable bounded random variable $W$, it happens that: $$ \lim_{n\rightarrow \infty}\mathbb{E}[f(X_n)\,W]=\mathbb{E}[f(X)\,W]. $$
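The defining limit above can be checked numerically in a toy case. The following sketch is not from the original post: the choice $X_n = U + 1/n$, the test function $\tanh$, and the weight $W = \sin U$ are all illustrative assumptions. It uses the fact that almost-sure convergence to $X$ implies stable convergence to $X$, and watches $\mathbb{E}[f(X_n)W]$ approach $\mathbb{E}[f(X)W]$:

```python
import numpy as np

# Toy illustration of the definition (all concrete choices are assumptions).
# U ~ Uniform(0, 1) plays the role of randomness on (Omega, F, P);
# X_n = U + 1/n converges a.s. (hence stably) to X = U.
rng = np.random.default_rng(0)
U = rng.uniform(0.0, 1.0, size=200_000)

f = np.tanh          # a bounded continuous test function
W = np.sin(U)        # a bounded F-measurable weight, NOT independent of X_n

limit = np.mean(f(U) * W)  # Monte Carlo estimate of E[f(X) W]
errs = [abs(np.mean(f(U + 1.0 / n) * W) - limit) for n in (1, 10, 100, 1000)]
print(errs)  # |E[f(X_n) W] - E[f(X) W]| shrinks as n grows
```

Note that $W$ is deliberately correlated with the $X_n$; plain convergence in distribution alone would not justify the weighted limit, which is exactly what makes stable convergence a stronger notion.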

What I need to prove now is the following:

Assume $$ (Y_n,Z)\stackrel{\text{d}}{\longrightarrow}(Y,Z), $$

for every measurable random variable $Z$; then

$$ (Y_n,Z)\stackrel{\text{st}}{\longrightarrow}(Y,Z) $$ for all measurable random variables $Z$. So I need to prove that, for any bounded continuous function $f$ and any measurable $Z$, it holds that $$ \lim_{n\rightarrow \infty}\mathbb{E}[f(Y_n,Z)\,W]=\mathbb{E}[f(Y,Z)\,W] $$ for all bounded random variables $W$.

I tried, unsuccessfully, with the Portmanteau and Lévy continuity theorems…

=================================================================

In practice I am trying to prove this proposition (the equivalence of conditions (1), (2) and (3)) from the paper by Podolskij and Vetter.

I did this reasoning for (1) ⇒ (3), but I am not so sure of its correctness.

  • You will certainly need some assumption on the integrability of $W$; if $W$ is not integrable, then the expectations are not even well-defined. It seems to me that the paper assumes that $W$ is bounded (in the sense that $\|W\|_{L^{\infty}} < \infty$), and this simplifies the proof a lot. Commented Apr 13, 2018 at 14:23
  • How did you approach the problem with the Portmanteau theorem? One idea is to take $W$ itself as the particular $Z$, but you will probably need some additional assumptions on $W$, as noticed by @saz. Commented Apr 13, 2018 at 14:43
  • Yes, sorry, the $W$ must be bounded. Commented Apr 13, 2018 at 14:46
  • In line with saz's comment, let $X$ be any random variable with finite mean but infinite variance, and define $X_n = X/n$. Then $X_n\rightarrow 0$ in distribution but $E[X_n X] = \infty$ for all $n$. Commented Apr 13, 2018 at 14:46
  • @AlmostSureUser Does $\mathbb{R}^N$ mean $N$-dimensional real space or $\mathbb{R}^{\mathbb{N}}$? Commented Apr 16, 2018 at 7:58

2 Answers


$\def\dto{\xrightarrow{\mathrm{d}}}\def\stto{\xrightarrow{\mathrm{st}}}\def\mto{\xrightarrow{\mathrm{m}}}$$(3) \Rightarrow (2)$: Trivial.

$(2) \Rightarrow (1)$: Let $g \in C_b(\mathbb{R}^N)$ and let $W$ be $\mathscr{F}$-measurable with $|W| \leqslant M$. Take\begin{align*} f: \mathbb{R}^N \times \mathbb{R} &\longrightarrow \mathbb{R},\\ (y, z) &\longmapsto g(y) \cdot \frac{1}{2} \bigl(|z + M| - |z - M|\bigr). \end{align*} The second factor is just $z$ truncated to $[-M, M]$, so $f(y, W) = g(y) W$ a.s. Because $(Y_n, W) \dto (Y, W)$ and $f \in C_b(\mathbb{R}^{N + 1})$,$$ E(g(Y_n) W) = E(f(Y_n, W)) \to E(f(Y, W)) = E(g(Y) W) \quad (n \to \infty). $$ Therefore, $Y_n \stto Y$.
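As a quick sanity check (my own illustration, not part of the answer), the truncation identity used above, $\frac{1}{2}(|z + M| - |z - M|) = \min(\max(z, -M), M)$, can be verified numerically:

```python
import numpy as np

# The factor (1/2)(|z + M| - |z - M|) from the (2) => (1) step equals
# z clipped to [-M, M]; hence f(y, W) = g(y) W whenever |W| <= M.
M = 2.5  # an arbitrary bound, for illustration
z = np.linspace(-10.0, 10.0, 1001)
trunc = 0.5 * (np.abs(z + M) - np.abs(z - M))
print(np.allclose(trunc, np.clip(z, -M, M)))  # True
```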

$(1) \Rightarrow (3)$: Suppose $Z$ and $W$ are $\mathscr{F}$-measurable and $W$ is bounded. First, for any $A \in \mathscr{B}(\mathbb{R}^N)$ and $B \in \mathscr{B}(\mathbb{R})$, there exists $\{g_k\} \subseteq C_b(\mathbb{R}^N)$ such that $g_k \mto I_A$, i.e.$$ m(\{ x \in \mathbb{R}^N \mid g_k(x) \neq I_A(x)\}) \to 0 \quad (k \to \infty). $$ For any $k \geqslant 1$, because $Y_n \stto Y$ and $I_B(Z) W$ is $\mathscr{F}$-measurable and bounded,$$ E(g_k(Y_n) I_B(Z) W) \to E(g_k(Y) I_B(Z) W) \quad (n \to \infty). $$ Since $g_k \mto I_A$ and $I_B(Z) W$ is bounded, it follows that$$ E(I_A(Y_n) I_B(Z) W) \to E(I_A(Y) I_B(Z) W) \quad (n \to \infty). \tag{1} $$
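The approximation $g_k \mto I_A$ can be visualized in a simple one-dimensional case. The sketch below is my own illustration (the choice $A = [0, 1]$ and the piecewise-linear ramps of width $1/k$ are one possible construction of $g_k$, not the answer's specific one); it estimates the Lebesgue measure of $\{g_k \neq I_A\}$ on a grid and watches it shrink like $2/k$:

```python
import numpy as np

# One choice of g_k in C_b(R) with g_k -> I_A in Lebesgue measure, A = [0, 1]:
# g_k is 1 on [0, 1], 0 outside [-1/k, 1 + 1/k], and linear on the two ramps.
def g(x, k, a=0.0, b=1.0):
    return np.clip(np.minimum(k * (x - a) + 1.0, k * (b - x) + 1.0), 0.0, 1.0)

# {g_k != I_A} is (a - 1/k, a) U (b, b + 1/k), of Lebesgue measure 2/k -> 0.
x = np.linspace(-1.0, 2.0, 3_000_001)  # fine grid on an interval of length 3
indicator = ((x >= 0.0) & (x <= 1.0)).astype(float)
for k in (1, 10, 100):
    mismatch = np.mean(g(x, k) != indicator) * 3.0  # grid estimate of m{g_k != I_A}
    print(k, mismatch)  # roughly 2/k
```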

Now, for any $C \in \mathscr{B}(\mathbb{R}^{N + 1})$, there exist $\{A_{k, j}\} \subseteq \mathscr{B}(\mathbb{R}^N)$ and $\{B_{k, j}\} \subseteq \mathscr{B}(\mathbb{R})$ such that $\{h_k\}$ defined by\begin{align*} h_k : \mathbb{R}^N \times \mathbb{R} &\longrightarrow \mathbb{R},\\ (y, z) &\longmapsto \sum_{j = 1}^{s_k} I_{A_{k, j}}(y) I_{B_{k, j}}(z) \end{align*} satisfies $h_k \mto I_C$. For any $k \geqslant 1$, (1) and linearity give$$ E(h_k(Y_n, Z) W) \to E(h_k(Y, Z) W) \quad (n \to \infty). $$ Because $h_k \mto I_C$ and $W$ is bounded,$$ E(I_C(Y_n, Z) W) \to E(I_C(Y, Z) W) \quad (n \to \infty). \tag{2} $$

Finally, for any $f \in C_b(\mathbb{R}^{N + 1})$, there exists a sequence of simple functions $\{f_k\}$ such that $f_k \rightrightarrows f$ (uniform convergence). For any $k \geqslant 1$, (2) and linearity give$$ E(f_k(Y_n, Z) W) \to E(f_k(Y, Z) W) \quad (n \to \infty). $$ Because $f_k \rightrightarrows f$ and $W$ is bounded,$$ E(f(Y_n, Z) W) \to E(f(Y, Z) W) \quad (n \to \infty). $$ Therefore, $(Y_n, Z) \stto (Y, Z)$.
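One concrete way to produce $f_k \rightrightarrows f$ for bounded $f$ is range quantization, $f_k = \lfloor k f \rfloor / k$, which takes finitely many values (since $f$ is bounded) and satisfies $\|f_k - f\|_\infty \leqslant 1/k$. A small numerical check (the particular $f$ below is an illustrative stand-in, not from the answer):

```python
import numpy as np

# Uniform approximation of a bounded function by simple functions:
# f_k = floor(k f)/k takes finitely many values and ||f_k - f||_sup <= 1/k.
f = lambda y, z: np.cos(y) * np.tanh(z)   # a bounded continuous stand-in
fk = lambda y, z, k: np.floor(k * f(y, z)) / k

y, z = np.meshgrid(np.linspace(-4.0, 4.0, 401), np.linspace(-4.0, 4.0, 401))
for k in (1, 10, 100):
    print(k, np.max(np.abs(fk(y, z, k) - f(y, z))))  # each value is <= 1/k
```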

  • The proof of $(1)\Rightarrow (3)$ I had in mind was much simpler: assume $$Y_n\stackrel{\text{st}}{\longrightarrow} Y.$$ Then, by definition, $$ E[g(Y_n)\,W]\to E[g(Y)\,W] $$ for any bounded continuous function $g(y)$ and any bounded random variable $W$. Now consider any bounded continuous function $f(y,z)$ and an arbitrary $\mathcal{F}$-measurable variable $Z$, and note that $$E[f(Y_n,Z)\,W]=E\bigl[E[f(Y_n,c)\,W \mid Z=c]\bigr]\rightarrow E\bigl[E[f(Y,c)\,W \mid Z=c]\bigr]= E[f(Y,Z)\,W].$$ Commented Apr 16, 2018 at 14:29
  • @AlmostSureUser This works only when $Z$ is a continuous or discrete random variable. For general $Z$, the conditional expectation is hard to characterize rigorously. Commented Apr 16, 2018 at 14:41
  • I do not understand the meaning of the double arrow. Commented Apr 23, 2018 at 7:25
  • @AlmostSureUser It means uniform convergence. Commented Apr 23, 2018 at 7:40
  • @AlmostSureUser Suppose $M_k=\|f_k-f\|_\infty$ and $|W|\leqslant M$. Because\begin{align*}|E(f(Y_n, Z) W)-E(f(Y, Z) W)|&\leqslant|E(f(Y_n, Z) W)-E(f_k(Y_n, Z) W)|\\&\quad+|E(f_k(Y_n, Z) W)-E(f_k(Y, Z) W)|\\&\quad+|E(f_k(Y, Z) W)-E(f(Y, Z) W)|\\&\leqslant M_k M+|E(f_k(Y_n, Z) W)-E(f_k(Y, Z) W)|+M_k M,\end{align*}and $M_k \to 0$ as $k \to \infty$, we get$$\varlimsup_{n\to\infty}|E(f(Y_n, Z) W)-E(f(Y, Z) W)|\leqslant 2 M_k M;$$letting $k \to \infty$ gives $E(f(Y_n, Z) W)\to E(f(Y, Z) W)$. Commented Apr 23, 2018 at 8:16

What I suggested in the comment was the following idea: by the Portmanteau theorem, $$ (Y_n,Z)\stackrel{\text{d}}{\longrightarrow}(Y,Z) $$ if and only if $$ \lim_{n\rightarrow \infty}\mathbb{E}[f(Y_n, \, Z)]=\mathbb{E}[f(Y,\,Z)] $$ for any bounded continuous function $f:\mathbb{R}^{N+1}\to\mathbb{R}$.

Then take in particular $Z := W$. You should then get $$ Y_n\stackrel{\text{st}}{\longrightarrow} Y, $$ since, under the boundedness assumption on $W$, you can view $f(Y_n,W)\,W$, for any bounded continuous $f$, as a particular $f_1(Y_n,W)$ with $f_1$ bounded continuous.

Moreover, $$ (Y_n,Z)\stackrel{\text{st}}{\longrightarrow}(Y,Z) $$ should follow from $$ Y_n\stackrel{\text{st}}{\longrightarrow} Y $$ by applying the bounded convergence theorem.

Is it right?

  • Why bounded convergence? It requires pointwise convergence: math.stackexchange.com/questions/235511/… Commented Apr 15, 2018 at 20:58
  • @AlmostSureUser You are right, I was too sloppy: bounded convergence is not immediate. Your argument seems right to me. Anyway, you can also prove it, as in Proposition 2.5 (i) of the Podolskij and Vetter paper, by taking a sequence $V_n$ that converges in probability to $V$, constant and equal to $Z$. Commented Apr 16, 2018 at 15:37
