# Chapter 6 Period, Ergodic, Communicate (Lecture on 01/21/2021)

We looked at random walks in different dimensions and proved that in dimensions 1 and 2 state 0 is recurrent, while in dimensions 3 and higher state 0 is transient. We have seen that states can be classified into transient states and recurrent states. Recurrent states can be further classified into null recurrent and non-null recurrent states. We saw one example where we determined whether state 0 is null or non-null recurrent for a one-dimensional random walk.

In that kind of problem we care about whether \(f_{00}=1\) or \(f_{00}<1\). However, it is usually hard to find \(f_{00}(n)\) directly, so we tend to compute \(p_{ii}(n)\), which is much easier, and then use Corollary 4.1 to determine whether \(f_{00}=1\) or \(f_{00}<1\). That is not always the case: in the following example, calculating \(f_{00}(n)\) directly is actually easier.

**Example 6.1 (Success runs of Bernoulli trials) **The Markov chain is given by an infinite-dimensional transition probability matrix
\[\begin{equation}
\begin{pmatrix}
p_0 & 1-p_0 & 0 & 0 & 0 & 0 & \cdots\\
p_1 & 0 & 1-p_1 & 0 & 0 & 0 & \cdots\\
p_2 & 0 & 0 & 1-p_2 & 0 & 0 & \cdots\\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\
p_r & 0 & \cdots & 0 & 1-p_r & 0 & \cdots\\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots
\end{pmatrix}
\tag{6.1}
\end{equation}\]
This transition matrix consists of a first column \((p_0,p_1,\cdots,p_r,\cdots)^T\) together with the entries \(1-p_0,1-p_1,\cdots,1-p_r,\cdots\) on the superdiagonal: from state \(r\), the chain returns to state 0 with probability \(p_r\) and moves on to state \(r+1\) with probability \(1-p_r\).
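As a quick sanity check, matrix (6.1) can be built numerically on a truncated state space; the choice of \(p_i\) below is an illustrative assumption, not part of the example.

```python
import numpy as np

def success_run_matrix(p, n):
    """Truncated version of the transition matrix (6.1) for the success-runs chain.

    p : array of length n, where p[i] is the probability of falling back
        to state 0 from state i; otherwise the chain moves on to state i+1.
    """
    P = np.zeros((n, n))
    P[:, 0] = p                    # first column: (p_0, ..., p_{n-1})
    for i in range(n - 1):
        P[i, i + 1] = 1 - p[i]     # superdiagonal: 1 - p_i
    return P

# illustrative choice p_i = 1/(i+2), truncated to 6 states
P = success_run_matrix(np.array([1 / (i + 2) for i in range(6)]), 6)
```

All rows except the last sum to 1; the last row loses the mass \(1-p_{n-1}\) to the truncation.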

**Question**: Provide conditions on \(p_0,\cdots,p_r,\cdots\), such that state 0 becomes a recurrent state.

Let \(f_{00}(n)\) be the probability that the chain first returns to state 0 at time \(n\), given that \(X_0=0\). We then have
\[\begin{equation} \begin{split} f_{00}(1)&=p_0=1-(1-p_0)\\ f_{00}(2)&=(1-p_0)p_1=(1-p_0)[1-(1-p_1)]=(1-p_0)-(1-p_0)(1-p_1)\\ f_{00}(3)&=(1-p_0)(1-p_1)p_2=(1-p_0)(1-p_1)[1-(1-p_2)]\\ &=(1-p_0)(1-p_1)-(1-p_0)(1-p_1)(1-p_2)\\ &\cdots\cdots\\ f_{00}(n)&=\prod_{i=0}^{n-2}(1-p_i)-\prod_{i=0}^{n-1}(1-p_i) \end{split} \tag{6.2} \end{equation}\]
The sum then telescopes:
\[\begin{equation} \sum_{n=1}^{m+1}f_{00}(n)=\sum_{n=1}^{m+1}\Big[\prod_{i=0}^{n-2}(1-p_i)-\prod_{i=0}^{n-1}(1-p_i)\Big]=1-\prod_{i=0}^m(1-p_i) \tag{6.3} \end{equation}\]
and we get
\[\begin{equation} \begin{split} f_{00}&=\lim_{m\to\infty}\sum_{n=1}^{m+1}f_{00}(n)\\ &=\lim_{m\to\infty}\Big[1-\prod_{i=0}^m(1-p_i)\Big]\\ &=1-\lim_{m\to\infty}\prod_{i=0}^m(1-p_i) \end{split} \tag{6.4} \end{equation}\]

Hence \(f_{00}=1\), i.e. state 0 is recurrent, if and only if \(\lim_{m\to\infty}\prod_{i=0}^m(1-p_i)=0\).

**Result:** \(\lim_{m\to\infty}\prod_{i=0}^m(1-p_i)=0\Longleftrightarrow \sum_{i=0}^{\infty}p_i=\infty\). Therefore, state 0 is recurrent if and only if \(\sum_{i=0}^{\infty}p_i=\infty\).
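The criterion can be checked numerically using the telescoped partial sum (6.3). The two choices of \(p_i\) below are illustrative assumptions: \(p_i=1/(i+2)\) gives a divergent series (recurrent), while \(p_i=1/(i+2)^2\) gives a convergent one (transient).

```python
def f00_partial_sum(p_fn, m):
    """Partial sum sum_{n=1}^{m+1} f00(n) = 1 - prod_{i=0}^{m} (1 - p_i), eq. (6.3)."""
    prod = 1.0
    for i in range(m + 1):
        prod *= 1.0 - p_fn(i)
    return 1.0 - prod

# sum p_i diverges  -> partial sums approach f00 = 1 (state 0 recurrent)
recurrent = f00_partial_sum(lambda i: 1 / (i + 2), 100_000)
# sum p_i converges -> partial sums stay below 1 (state 0 transient)
transient = f00_partial_sum(lambda i: 1 / (i + 2) ** 2, 100_000)
```

In the first case the product telescopes to \(1/(m+2)\to 0\); in the second, \(\prod_{k=2}^{m+2}(1-1/k^2)=\tfrac{1}{2}\cdot\tfrac{m+3}{m+2}\to \tfrac12\), so \(f_{00}=1/2<1\).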

**Definition 6.1 (Period of State) **The period \(d(i)\) of a state \(i\) is defined by \(d(i)=\gcd\{n:p_{ii}(n)>0\}\), the greatest common divisor of the times at which a return from \(i\) to \(i\) is possible. We call \(i\) **periodic** if \(d(i)>1\) and **aperiodic** if \(d(i)=1\).

**Example 6.2 (Period of one-dimensional random walk) **For a one-dimensional random walk, state 0 is periodic with period \(d(0)=2\). This is because \(p_{00}(2n+1)=0\) for every \(n\) while \(p_{00}(2n)>0\), so the greatest common divisor of the times at which a return from state 0 to 0 is possible is 2. Indeed, every state of a one-dimensional random walk is periodic with period 2.
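A minimal numerical sketch of this example, truncating the walk to a finite state space large enough that the boundary cannot affect returns to the centre within the horizon (the truncation size is an assumption of the sketch):

```python
import math
import numpy as np

N = 16                                # walk truncated to states -N..N (indices 0..2N)
P = np.zeros((2 * N + 1, 2 * N + 1))
for s in range(2 * N + 1):
    if s > 0:
        P[s, s - 1] = 0.5
    if s < 2 * N:
        P[s, s + 1] = 0.5
# boundary rows are substochastic, but a path returning to the centre within
# n <= N steps never touches the boundary, so p_00(n) matches the infinite walk

center = N                            # index of state 0
return_times = []
Pn = np.eye(2 * N + 1)
for n in range(1, N + 1):
    Pn = Pn @ P
    if Pn[center, center] > 0:        # a return at time n is possible
        return_times.append(n)

d0 = math.gcd(*return_times)          # period of state 0
```

Only even return times appear, and their gcd is 2, matching \(d(0)=2\).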

**Definition 6.2 (Ergodic)**A state is called **ergodic** if it is non-null persistent (positive recurrent) and aperiodic.

We now classify the states of a Markov chain.

**Definition 6.3 (Communicate and Intercommunicate) **We say state \(i\) communicates with state \(j\), written \(i\to j\), if the chain ever visits state \(j\) with positive probability starting from state \(i\). That is, \(i\to j\) if \(p_{ij}(m)>0\) for some \(m\geq 0\).

We say states \(i\) and \(j\) **intercommunicate** if \(i\to j\) and \(j\to i\), in which case we write \(i\leftrightarrow j\).

**Definition 6.4 **A relation such as \(\leftrightarrow\) is an **equivalence relation** if it satisfies:

- **Reflexive**: \(i\leftrightarrow i\).
- **Symmetric**: \(i\leftrightarrow j\ \Longrightarrow\ j\leftrightarrow i\).
- **Transitive**: if \(i\leftrightarrow j\) and \(j\leftrightarrow k\), then \(i\leftrightarrow k\).

**Theorem 6.1**Intercommunication is an equivalence relation.

*Proof. * First, \(i\leftrightarrow i\) is trivial because \(p_{ii}(0)=1\). Symmetry is immediate from the definition: \(i\leftrightarrow j\) implies \(j\leftrightarrow i\). It remains to show transitivity: if \(i\leftrightarrow j\) and \(j\leftrightarrow k\), we need to show \(i\leftrightarrow k\).

Since \(i\to j\), by definition there exists \(m\) such that \(p_{ij}(m)>0\), and since \(j\to k\), there exists \(n\) such that \(p_{jk}(n)>0\). Using the Chapman-Kolmogorov equation (Theorem 3.2), we have
\[\begin{equation}
p_{ik}(m+n)=\sum_{r}p_{ir}(m)p_{rk}(n)\stackrel{\text{take}\,r=j}{\geq} p_{ij}(m)p_{jk}(n)>0
\tag{6.5}
\end{equation}\]

which implies \(i\to k\). Applying the same argument to \(k\to j\) and \(j\to i\) gives \(k\to i\), hence \(i\leftrightarrow k\).
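Intercommunication can be computed directly from the support of the transition matrix via transitive closure; the small three-state chain below is an illustrative assumption.

```python
import numpy as np

def reaches(P):
    """R[i, j] = True iff i -> j, i.e. p_ij(m) > 0 for some m >= 0."""
    n = P.shape[0]
    R = (P > 0) | np.eye(n, dtype=bool)                  # one step, or m = 0
    for _ in range(n):                                   # iterate to transitive closure
        R = R | ((R.astype(int) @ R.astype(int)) > 0)
    return R

# illustrative chain with two communicating classes, {0, 1} and {2}:
# state 2 reaches 0 and 1, but nothing outside {2} reaches 2
P = np.array([[0.5, 0.5, 0.0],
              [1.0, 0.0, 0.0],
              [0.3, 0.3, 0.4]])
R = reaches(P)
inter = R & R.T        # inter[i, j] = True iff i <-> j
```

Because `inter` is built as `R & R.T`, symmetry holds by construction, and reflexivity follows from the \(m=0\) term; transitivity is exactly what the closure loop enforces.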

**Theorem 6.2 **If \(i\leftrightarrow j\), then

- \(i\) and \(j\) have the same period

- \(i\) is transient if and only if \(j\) is transient. (Equivalently, \(i\) is persistent if and only if \(j\) is persistent.)

- \(i\) is (non-)null persistent if and only if \(j\) is (non-)null persistent.

*Proof. *
For part (a), let \(d(i)=\gcd\{m:p_{ii}(m)>0\}\) and similarly \(d(j)=\gcd\{m:p_{jj}(m)>0\}\). Since \(i\leftrightarrow j\), there exist \(h\) and \(l\) such that \(p_{ij}(h)>0\) and \(p_{ji}(l)>0\). By the Chapman-Kolmogorov equation, \(p_{ii}(h+l)\geq p_{ij}(h)p_{ji}(l)>0\). Thus, \(h+l\) is divisible by \(d(i)\).

Now take any \(m\in\{n:p_{jj}(n)>0\}\), so that \(p_{jj}(m)>0\). By Chapman-Kolmogorov, \(p_{ii}(h+m+l)\geq p_{ij}(h)p_{jj}(m)p_{ji}(l)>0\), which implies that \(h+l+m\) is divisible by \(d(i)\). Therefore \((h+l+m)-(h+l)=m\) is also divisible by \(d(i)\). Since \(m\) was an arbitrary time with \(p_{jj}(m)>0\), every such \(m\) is divisible by \(d(i)\), and hence \(\gcd\{m:p_{jj}(m)>0\}=d(j)\) is divisible by \(d(i)\).

We can similarly show that \(d(i)\) is divisible by \(d(j)\). Therefore \(d(i)=d(j)\) and part (a) is proved.
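Part (a) can be illustrated numerically for a small chain whose states all intercommunicate; the deterministic 3-cycle below is an illustrative assumption, and the finite horizon is a parameter of the sketch.

```python
import math
import numpy as np

def period(P, i, horizon=60):
    """gcd of {n <= horizon : p_ii(n) > 0}; equals d(i) once the horizon is long enough."""
    times = []
    Pn = np.eye(P.shape[0])
    for n in range(1, horizon + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:       # a return at time n is possible
            times.append(n)
    return math.gcd(*times)

# deterministic 3-cycle: all three states intercommunicate
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
periods = [period(P, i) for i in range(3)]
```

All three states share the same period, as Theorem 6.2(a) requires.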

For part (b), since \(i\leftrightarrow j\), there exist \(h\) and \(l\) such that \(p_{ij}(h)>0\) and \(p_{ji}(l)>0\). Let \(\alpha=p_{ij}(h)p_{ji}(l)>0\). By the Chapman-Kolmogorov equation, \(p_{ii}(h+l+r)\geq p_{ij}(h)p_{jj}(r)p_{ji}(l)=\alpha p_{jj}(r)\), so \(\sum_{r}p_{ii}(h+l+r)\geq \alpha \sum_{r}p_{jj}(r)\). Thus \(\sum_{r}p_{jj}(r)<\infty\) whenever \(\sum_{r}p_{ii}(h+l+r)<\infty\), and the latter holds if \(i\) is transient. Therefore, \(j\) is transient if \(i\) is transient.

We can reverse the roles of \(i\) and \(j\) to show that \(i\) is transient if \(j\) is transient, and part (b) is proved.

In a real-life scenario where all states intercommunicate, in order to show that all states are ergodic we only need to show that one of them is ergodic.