
Suppose $x_1, \dots, x_n, \dots$ are random variables on $\mathbb R$. A Markov chain is said to be topologically recurrent if for every open set $O$ and every starting point $x$, $\mathbb P_x\left(\sum\limits_{n=1}^{\infty} 1_{O}(x_n)=+\infty\right)=1$, where $1_O(x_n)=1$ if $x_n\in O$ and $0$ otherwise, and $\mathbb P_x$ is the law of the chain started at $x$. Could anyone help me understand this definition? I am not getting any intuition behind it. Is it saying that the Markov chain will visit every open set of $\mathbb R$ with probability $1$ as $n\to \infty$?
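To unpack the notation, here is a minimal sketch (the function name is hypothetical) that computes the truncated sum $\sum_{n=1}^{N} 1_O(x_n)$ for a finite trajectory and an open interval $O = (a, b)$; topological recurrence asserts that this count tends to infinity almost surely as $N \to \infty$:

```python
def indicator_count(trajectory, a, b):
    """Truncated sum sum_{n=1}^N 1_O(x_n) for O = (a, b):
    the number of points of the trajectory that fall in (a, b)."""
    return sum(1 for x in trajectory if a < x < b)

# The trajectory below lands in O = (0, 2) at x = 0.5 and x = 1.9,
# so the truncated sum equals 2.
print(indicator_count([0.5, 3.0, -1.0, 1.9], 0.0, 2.0))  # -> 2
```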


1 Answer


The series $\sum_{n=1}^{+\infty}1_O(x_n)$ is infinite if and only if the set $\{n\mid x_n\in O\}$ is infinite. Therefore, topological recurrence means that, from every starting point, the Markov chain almost surely visits every open set infinitely many times, not merely once.

  • Actually, this is the first time I have encountered the concept, so I cannot give you further references on it. Commented Aug 14, 2019 at 9:54
  • The concept seems appropriate for, e.g., random walks with non-lattice increments in $\mathbf R$. Commented Aug 15, 2019 at 9:13
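As the last comment suggests, a random walk on $\mathbb R$ with non-lattice increments is a natural example. A minimal simulation sketch (the function name and parameters are illustrative, not from the source): count visits of a Gaussian random walk to an open interval $(a, b)$ and observe that the count keeps growing as the trajectory lengthens, consistent with recurrence.

```python
import random

def visits_to_interval(n_steps, a, b, seed=0):
    """Simulate the random walk x_{n+1} = x_n + Z_n with Gaussian
    (non-lattice) increments Z_n, started at 0, and count how many
    of the first n_steps positions land in the open interval (a, b)."""
    rng = random.Random(seed)
    x = 0.0
    visits = 0
    for _ in range(n_steps):
        x += rng.gauss(0.0, 1.0)  # Gaussian increment
        if a < x < b:
            visits += 1
    return visits
```

With a fixed seed, a longer run extends the same trajectory, so the visit count to a fixed interval is non-decreasing in the number of steps; for a recurrent walk it grows without bound (at rate roughly $\sqrt{n}$ in one dimension).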
