Suppose $x_1, \dots, x_n, \dots$ are random variables on $\mathbb R$. A Markov chain is said to be topologically recurrent if for every open set $O$ and every starting point $x$, $\mathbb P_x\left(\sum\limits_{n=1}^{\infty} 1_{O}(x_n)=+\infty\right)=1$, where $1_O(x_n)=1$ if the random variable $x_n\in O$ and $0$ otherwise, and $\mathbb P_x$ denotes the law of the chain started at $x$. Could anyone help me understand this definition? I am not getting any intuition behind it. Is it saying that the Markov chain will visit every open set of $\mathbb R$ with probability $1$ as $n\to \infty$?
1 Answer
The series $\sum_{n=1}^{+\infty}1_O(x_n)$ is infinite if and only if the set $\{n\mid x_n\in O\}$ is infinite. Therefore, topological recurrence means that from every starting point, the Markov chain almost surely visits each (nonempty) open set infinitely many times.
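To make this concrete, here is a small illustrative sketch (my own, not from the answer): it simulates a Gaussian random walk on $\mathbb R$, which has non-lattice increments, and counts the partial sums $\sum_{n\le N} 1_O(x_n)$ for the open set $O=(-0.5,0.5)$. For a topologically recurrent chain, this visit count should keep growing as $N$ increases.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def count_visits(n_steps, lo=-0.5, hi=0.5):
    """Count visits of a Gaussian random walk started at 0
    to the open interval O = (lo, hi) over n_steps steps."""
    x = 0.0
    visits = 0
    for _ in range(n_steps):
        x += random.gauss(0.0, 1.0)  # one step of the chain
        if lo < x < hi:              # the indicator 1_O(x_n)
            visits += 1
    return visits

for n in (10_000, 100_000):
    print(n, count_visits(n))
```

The visit count grows without bound (roughly like $\sqrt N$ for this walk), which is the finite-horizon shadow of the event $\sum_n 1_O(x_n)=+\infty$ having probability $1$.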
- $\begingroup$ Actually it is the first time that I have met this concept, so I cannot give you more references on it. $\endgroup$ – Davide Giraudo, Aug 14, 2019 at 9:54
- $\begingroup$ The concept seems appropriate for e.g. random walks with increments in $\mathbf R$ that are not lattice. $\endgroup$ – Olivier, Aug 15, 2019 at 9:13