Chapter 23 Homework 3: Poisson Process, Birth and Death Process: Problems and Tentative Solutions

Exercise 23.1 (Birth process, probability of an even or odd number of occurrences) Let \(X(t)\) be a pure birth continuous time Markov chain. Assume that \[\begin{equation} \begin{split} &P(\text{an event happens in (t,t+h)}|X(t)=\text{odd})=\lambda_1h+o(h)\\ &P(\text{an event happens in (t,t+h)}|X(t)=\text{even})=\lambda_2h+o(h) \end{split} \tag{23.1} \end{equation}\] where \(\frac{o(h)}{h}\to 0\) as \(h\to 0\). Take \(X(0)=0\). Find the probabilities \(P_1(t)=P(X(t)=\text{odd})\) and \(P_2(t)=P(X(t)=\text{even})\).

Hint: Derive the differential equations: \[\begin{equation} \begin{split} &P_1^{\prime}(t)=-\lambda_1P_1(t)+\lambda_2P_2(t)\\ &P_2^{\prime}(t)=\lambda_1P_1(t)-\lambda_2P_2(t)\\ \end{split} \tag{23.2} \end{equation}\]

Proof. We start by considering \(P(X(t+h)=\text{odd})\). By the law of total probability, \[\begin{equation} \begin{split} P(X(t+h)=\text{odd})&=\sum_iP(X(t+h)=\text{odd}|X(t)=i)P(X(t)=i)\\ &=P(X(t+h)=\text{odd}|X(t)=\text{even})P(X(t)=\text{even})\\ &+P(X(t+h)=\text{odd}|X(t)=\text{odd})P(X(t)=\text{odd}) \end{split} \tag{23.3} \end{equation}\] Using (23.1), we have \[\begin{equation} \begin{split} P(X(t+h)=\text{odd})&=(\lambda_2 h+o(h))P(X(t)=\text{even})\\ &+(1-\lambda_1 h+o(h))P(X(t)=\text{odd}) \end{split} \tag{23.4} \end{equation}\] Now denote \(P_1(t)=P(X(t)=\text{odd})\) and \(P_2(t)=P(X(t)=\text{even})\). By (23.4) we have \[\begin{equation} \begin{split} P_1(t+h)&=(\lambda_2 h+o(h))P_2(t)+(1-\lambda_1 h+o(h))P_1(t)\\ &=\lambda_2 hP_2(t)+(1-\lambda_1 h)P_1(t)+o(h) \end{split} \tag{23.5} \end{equation}\] which implies \[\begin{equation} \frac{P_1(t+h)-P_1(t)}{h}=-\lambda_1 P_1(t)+\lambda_2 P_2(t)+\frac{o(h)}{h} \tag{23.6} \end{equation}\]
Taking the limit as \(h\to 0\) on both sides, we get \[\begin{equation} P_1^{\prime}(t)=-\lambda_1P_1(t)+\lambda_2P_2(t) \tag{23.7} \end{equation}\]

In exactly the same way, we obtain the other differential equation \[\begin{equation} P_2^{\prime}(t)=\lambda_1P_1(t)-\lambda_2P_2(t) \tag{23.8} \end{equation}\] The initial conditions are \[\begin{equation} \left\{\begin{aligned} & P_1(0)=0\\ &P_2(0)=1 \end{aligned}\right. \tag{23.9} \end{equation}\]

Equations (23.7) and (23.8) can be combined and written in matrix form as \[\begin{equation} \mathbf{P}^{\prime}(t)=\begin{pmatrix} P^{\prime}_1(t)\\P^{\prime}_2(t)\end{pmatrix}= \begin{pmatrix} -\lambda_1 & \lambda_2\\ \lambda_1 & -\lambda_2 \end{pmatrix} \begin{pmatrix} P_1(t)\\P_2(t)\end{pmatrix}=\boldsymbol{\Lambda}\mathbf{P}(t) \tag{23.10} \end{equation}\]
Suppose \(\mathbf{P}(t)=\boldsymbol{\eta}e^{rt}\); then \(\mathbf{P}^{\prime}(t)=r\boldsymbol{\eta}e^{rt}\). Substituting into (23.10), we have \[\begin{equation} (\boldsymbol{\Lambda}-r \mathbf{I})\boldsymbol{\eta}e^{rt}=\mathbf{0} \tag{23.11} \end{equation}\] Since \(e^{rt}\) is never zero, we can drop it and obtain
\[\begin{equation} (\boldsymbol{\Lambda}-r \mathbf{I})\boldsymbol{\eta}=\mathbf{0} \tag{23.12} \end{equation}\]
The matrix \(\boldsymbol{\Lambda}\) has two distinct eigenvalues \(r_1=0\) and \(r_2=-(\lambda_1+\lambda_2)\), with corresponding eigenvectors \(\begin{pmatrix} \frac{\lambda_2}{\lambda_1}& 1\end{pmatrix}^T\) and \(\begin{pmatrix} -1& 1\end{pmatrix}^T\). Therefore, the general solution of (23.10) is \[\begin{equation} \mathbf{P}(t)=c_1\begin{pmatrix} \frac{\lambda_2}{\lambda_1}\\ 1\end{pmatrix}+c_2e^{-(\lambda_1+\lambda_2)t}\begin{pmatrix} -1\\ 1\end{pmatrix} \tag{23.13} \end{equation}\]
Since the initial condition is \(\mathbf{P}(0)=\begin{pmatrix} 0& 1\end{pmatrix}^T\), we find \(c_1=\frac{\lambda_1}{\lambda_1+\lambda_2}\) and \(c_2=\frac{\lambda_2}{\lambda_1+\lambda_2}\). Thus, we finally get \[\begin{equation} \begin{pmatrix} P_1(t)\\P_2(t)\end{pmatrix}=\begin{pmatrix} \frac{\lambda_2}{\lambda_1+\lambda_2}(1-e^{-(\lambda_1+\lambda_2)t})\\ \frac{\lambda_2}{\lambda_1+\lambda_2}(\frac{\lambda_1}{\lambda_2}+e^{-(\lambda_1+\lambda_2)t})\end{pmatrix} \tag{23.14} \end{equation}\]
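As a quick numerical sanity check (not part of the required solution), the following Python sketch simulates the two-rate pure birth process and compares the empirical value of \(P_1(t)\) with the closed form in (23.14). It assumes numpy is available, and the names `lam1`, `lam2`, `t_end`, `n_sim` as well as their values are arbitrary choices.

```python
import numpy as np

# Monte Carlo check of (23.14): simulate the pure birth process whose event rate
# is lambda2 while X(t) is even and lambda1 while X(t) is odd, starting at X(0)=0.
rng = np.random.default_rng(0)
lam1, lam2, t_end, n_sim = 1.5, 0.7, 2.0, 100_000

odd_at_t = 0
for _ in range(n_sim):
    t, x = 0.0, 0
    while True:
        rate = lam1 if x % 2 == 1 else lam2
        t += rng.exponential(1.0 / rate)   # holding time in the current state
        if t > t_end:
            break
        x += 1
    odd_at_t += x % 2

p1_hat = odd_at_t / n_sim
p1_exact = lam2 / (lam1 + lam2) * (1 - np.exp(-(lam1 + lam2) * t_end))
print(p1_hat, p1_exact)   # the two numbers should agree to 2-3 decimals
```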
Exercise 23.2 (Conditional distribution for Poisson process) If \(\{X(t):t\geq 0\}\) is a Poisson process with rate \(\lambda\), prove that \(X(u)|X(t)=n\sim Bin(n,\frac{u}{t})\) for any \(u<t\).
Proof. From Definition 12.1 and Theorem 12.1, we have that \[\begin{equation} X(u)\sim Pois(\lambda u),\quad X(t)\sim Pois(\lambda t),\quad X(u)\perp \!\!\! \perp X(t)-X(u) \tag{23.15} \end{equation}\] Thus, \(X(t)-X(u)\sim Pois(\lambda (t-u))\). Therefore, we have \[\begin{equation} \begin{split} P(X(u)=j|X(t)=n)&=\frac{P(X(u)=j,X(t)=n)}{P(X(t)=n)}\\ &=\frac{P(X(u)=j)P(X(t)-X(u)=n-j)}{P(X(t)=n)}\\ &=\frac{\frac{(\lambda u)^j}{j!}e^{-\lambda u}\times \frac{(\lambda (t-u))^{n-j}}{(n-j)!}e^{-\lambda (t-u)}}{\frac{(\lambda t)^n}{n!}e^{-\lambda t}}\\ &=\frac{n!}{j!(n-j)!}(\frac{u}{t})^j(1-\frac{u}{t})^{n-j} \end{split} \tag{23.16} \end{equation}\] Thus, \(X(u)|X(t)=n\sim Bin(n,\frac{u}{t})\), as desired.
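A small simulation can be used to check this conditional distribution: generate the independent increments \(X(u)\) and \(X(t)-X(u)\), condition on their sum being \(n\), and compare the empirical frequencies with the \(Bin(n,\frac{u}{t})\) pmf. The sketch below assumes numpy, and the parameter values are arbitrary.

```python
import numpy as np
from math import comb

# Check X(u) | X(t) = n ~ Bin(n, u/t) by conditioning simulated increments.
rng = np.random.default_rng(1)
lam, u, t, n = 2.0, 1.0, 3.0, 5
xu = rng.poisson(lam * u, size=500_000)          # X(u)
inc = rng.poisson(lam * (t - u), size=500_000)   # X(t) - X(u), independent of X(u)
cond = xu[xu + inc == n]                         # realizations of X(u) given X(t) = n
emp = np.bincount(cond, minlength=n + 1) / len(cond)
exact = [comb(n, j) * (u / t) ** j * (1 - u / t) ** (n - j) for j in range(n + 1)]
print(np.round(emp, 3))
print(np.round(exact, 3))
```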
Exercise 23.3 (Distribution of arrival times for Poisson process) Let \(\{X(t):t\geq 0\}\) be a Poisson process with rate \(\lambda\). If \(T_1,\cdots,T_n\) are arrival times for the \(n\) events, show that \(P(T_1\leq t_1,\cdots,T_n\leq t_n|X(t)=n)\) is free of \(\lambda\). In fact, given \(X(t)=n\), \(T_1,\cdots,T_n\) are jointly distributed as the distribution of the order statistics from a sample of \(n\) observations taken from the uniform distribution on \([0,t]\). You do not need to show the second part.
Proof. We only need to consider the case \(0=t_0<t_1\leq t_2\leq \cdots\leq t_n\leq t_{n+1}=t\). Write \(x_i=X(t_i)-X(t_{i-1})\) for the increments; the event \(\{T_1\leq t_1,\cdots,T_n\leq t_n,X(t)=n\}\) occurs exactly when \(x_1+\cdots+x_{n+1}=n\) and \(x_1+\cdots+x_k\geq k\) for \(k=1,\cdots,n\) (the latter is the event \(T_k\leq t_k\)). Let \(\sum^{*}\) denote the sum over this set of nonnegative integer vectors \((x_1,\cdots,x_{n+1})\). Then \[\begin{equation} \begin{split} &P(T_1\leq t_1,\cdots,T_n\leq t_n|X(t)=n)=\frac{P(T_1\leq t_1,\cdots,T_n\leq t_n,X(t)=n)}{P(X(t)=n)}\\ &=\frac{\sum^{*}P(X(t_1)=x_1,X(t_2)-X(t_1)=x_2,\cdots,X(t)-X(t_n)=x_{n+1})}{P(X(t)=n)}\\ &=\frac{\sum^{*}P(X(t_1)=x_1)P(X(t_2)-X(t_1)=x_2)\cdots P(X(t)-X(t_n)=x_{n+1})}{P(X(t)=n)}\\ &=\sum^{*}\frac{\prod_{i=1}^{n+1}\frac{(\lambda (t_i-t_{i-1}))^{x_i}}{x_i!}e^{-\lambda(t_i-t_{i-1})}}{\frac{(\lambda t)^n}{n!}e^{-\lambda t}}\\ &=\sum^{*}\frac{\lambda^{\sum_{i=1}^{n+1}x_i}e^{-\lambda\sum_{i=1}^{n+1}(t_i-t_{i-1})}}{\lambda^n e^{-\lambda t}}\frac{{n \choose {x_1\cdots x_{n+1}}}\prod_{i=1}^{n+1}(t_i-t_{i-1})^{x_i}}{t^n}\\ &=\sum^{*}\frac{{n \choose {x_1\cdots x_{n+1}}}\prod_{i=1}^{n+1}(t_i-t_{i-1})^{x_i}}{t^n} \end{split} \end{equation}\]
which is free of \(\lambda\), as desired.
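As a numerical illustration (assuming numpy; the times, \(n\), and the two rates below are arbitrary), one can estimate the conditional probability by simulation for two different values of \(\lambda\) and observe that the estimates agree up to Monte Carlo error, consistent with the claim.

```python
import numpy as np

# Empirical check that P(T_1<=t_1, T_2<=t_2, T_3<=t_3 | X(t)=3) does not depend
# on lambda: simulate arrival times and condition on X(t)=3.
def cond_prob(lam, t=1.0, ts=(0.3, 0.6, 0.9), n=3, n_sim=200_000, seed=2):
    rng = np.random.default_rng(seed)
    hits = total = 0
    for _ in range(n_sim):
        arrivals = np.cumsum(rng.exponential(1.0 / lam, size=20))  # more than enough arrivals
        arrivals = arrivals[arrivals <= t]
        if len(arrivals) == n:
            total += 1
            hits += all(arrivals[k] <= ts[k] for k in range(n))
    return hits / total

print(cond_prob(2.0), cond_prob(5.0))   # should agree up to Monte Carlo error
```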
Exercise 23.4 (Relationship of two independent Poisson processes) Consider two independent Poisson processes \(X(t)\) and \(Y(t)\) such that \(E(X(t))=\lambda t\) and \(E(Y(t))=\mu t\). Let two successive events of the \(X(t)\) process occur at times \(T\) and \(T^{\prime}>T\), so that \(X(t)=X(T)\) for \(T\leq t<T^{\prime}\) and \(X(T^{\prime})=X(T)+1\). Define \(N=Y(T^{\prime})-Y(T)\) to be the random variable that counts the number of events of the \(Y(t)\) process between \(T\) and \(T^{\prime}\). Show that \[\begin{equation} P(N=m)=\frac{\lambda}{\lambda+\mu}(\frac{\mu}{\lambda+\mu})^m,\quad m=0,1,2,\cdots \tag{23.17} \end{equation}\]
Proof. Consider a Poisson process with rate \(\lambda+\mu\), and independently classify each event as belonging to the first process with probability \(\frac{\lambda}{\lambda+\mu}\) or to the second process with probability \(\frac{\mu}{\lambda+\mu}\). By the thinning (splitting) property, the two resulting processes have the same joint law as \(X(t)\) and \(Y(t)\) in the problem. The probability we are interested in is the probability that exactly \(m\) events of the \(Y(t)\) process occur before the next event of the \(X(t)\) process. Since the classifications are independent, \(N\) counts the number of failures (each with probability \(\frac{\mu}{\lambda+\mu}\)) before the first success (probability \(\frac{\lambda}{\lambda+\mu}\)), so \(N\) is geometric with success probability \(\frac{\lambda}{\lambda+\mu}\). Thus, we have (23.17) as desired.
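An equivalent route to (23.17), which also makes for an easy simulation check: the gap \(T^{\prime}-T\) is \(Exp(\lambda)\) and independent of the \(Y\) process, and given the gap length \(s\) we have \(N\sim Pois(\mu s)\); integrating out the gap yields the same geometric distribution. The Python sketch below (assuming numpy; parameter values arbitrary) uses this representation.

```python
import numpy as np

# Monte Carlo check of (23.17): N | (T'-T = s) ~ Pois(mu*s), with T'-T ~ Exp(lam).
rng = np.random.default_rng(3)
lam, mu, n_sim = 1.0, 2.5, 200_000
gap = rng.exponential(1.0 / lam, size=n_sim)   # T' - T
N = rng.poisson(mu * gap)                      # Y-events falling in the gap
for m in range(5):
    exact = lam / (lam + mu) * (mu / (lam + mu)) ** m
    print(m, round((N == m).mean(), 4), round(exact, 4))
```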

Exercise 23.5 (Linear birth and death process, expected number of individuals) Consider a birth and death process \(\{X(t):t\geq 0\}\) with \(\lambda_n=n\lambda+a\) and \(\mu_n=n\mu\), with \(\lambda,\mu,a>0\). If the population starts with \(i\) individuals, prove that \[\begin{equation} M(t)=E[X(t)]=\left\{\begin{aligned} &at+i & \lambda=\mu\\ & \frac{a}{\lambda-\mu}\{e^{(\lambda-\mu)t}-1\}+ie^{(\lambda-\mu)t} & \lambda\neq \mu\end{aligned}\right. \tag{23.18} \end{equation}\]

Proof. If we define \(p_n(t)=P(X(t)=n)\), then, as shown in class, \[\begin{equation} p_n^{\prime}(t)=-(\lambda_n+\mu_n)p_n(t)+\lambda_{n-1}p_{n-1}(t)+\mu_{n+1}p_{n+1}(t) \tag{23.19} \end{equation}\]

Substituting \(\lambda_n=n\lambda+a\) and \(\mu_n=n\mu\), we have \[\begin{equation} p_n^{\prime}(t)=-(n\lambda+a+n\mu)p_n(t)+((n-1)\lambda+a)p_{n-1}(t)+(n+1)\mu p_{n+1}(t) \tag{23.20} \end{equation}\]

Since \[\begin{equation} M(t)=E(X(t))=\sum_{n=1}^{\infty}np_n(t) \tag{23.21} \end{equation}\] taking the derivative with respect to \(t\) and substituting (23.20), we get \[\begin{equation} \begin{split} M^{\prime}(t)&=\sum_{n=1}^{\infty}np_n^{\prime}(t)\\ &=\sum_{n=1}^{\infty}n[-n(\lambda+\mu)p_n(t)-ap_n(t)+(n-1)\lambda p_{n-1}(t)\\ &+ap_{n-1}(t)+(n+1)\mu p_{n+1}(t)]\\ &=-(\lambda+\mu)\sum_{n=1}^{\infty}n^2p_n(t)+\lambda\sum_{n=1}^{\infty}n(n-1)p_{n-1}(t)+\mu\sum_{n=1}^{\infty}n(n+1)p_{n+1}(t)\\ &-a\sum_{n=1}^{\infty}np_n(t)+a\sum_{n=1}^{\infty}np_{n-1}(t) \end{split} \tag{23.22} \end{equation}\] Using the result from the class notes (Equation (13.18)), we have \[\begin{equation} \begin{split} M^{\prime}(t)&=\lambda M(t)-\mu M(t)+a\sum_{n=0}^{\infty}p_n(t)\\ &=\lambda M(t)-\mu M(t)+a \end{split} \tag{23.23} \end{equation}\]

First consider the special case \(\lambda=\mu\): the differential equation reduces to \[\begin{equation} M^{\prime}(t)=a \tag{23.24} \end{equation}\] which has the solution \[\begin{equation} M(t)=M(0)+at \tag{23.25} \end{equation}\] Note that \(M(0)=E(X(0))=i\) for this linear birth and death process. Thus, \[\begin{equation} M(t)=at+i \tag{23.26} \end{equation}\]

Now suppose \(\lambda\neq \mu\); the differential equation is \[\begin{equation} M^{\prime}(t)-(\lambda-\mu)M(t)=a \tag{23.27} \end{equation}\] Recall from ODE theory that a first-order linear equation of the form \[\begin{equation} \frac{dy}{dx}+P(x)y=Q(x),\quad Q(x)\not\equiv 0 \tag{23.28} \end{equation}\] has general solution \[\begin{equation} y=Ce^{-\int P(x)dx}+e^{-\int P(x)dx}\int Q(x)e^{\int P(x)dx}dx \tag{23.29} \end{equation}\]

For our specific problem, this gives \[\begin{equation} \begin{split} M(t)&=Ce^{(\lambda-\mu)t}+e^{(\lambda-\mu)t}\int_0^tae^{-(\lambda-\mu)s}ds\\ &=Ce^{(\lambda-\mu)t}+ae^{(\lambda-\mu)t}\int_0^te^{-(\lambda-\mu)s}ds\\ &=Ce^{(\lambda-\mu)t}+ae^{(\lambda-\mu)t}\Big[\frac{1}{\lambda-\mu}-\frac{1}{\lambda-\mu}e^{-(\lambda-\mu)t}\Big]\\ &=Ce^{(\lambda-\mu)t}+\frac{a}{\lambda-\mu}\{e^{(\lambda-\mu)t}-1\} \end{split} \tag{23.30} \end{equation}\]

The constant \(C\) is determined by the initial condition \(M(0)=i\): substituting \(t=0\) in (23.30) gives \(C=i\). Thus, we finally get \[\begin{equation} M(t)=ie^{(\lambda-\mu)t}+\frac{a}{\lambda-\mu}\{e^{(\lambda-\mu)t}-1\} \tag{23.31} \end{equation}\] as desired.
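As a check of (23.18) (not required for the solution), the following Python sketch runs a simple Gillespie-type simulation of the birth and death process with rates \(\lambda_n=n\lambda+a\) and \(\mu_n=n\mu\) and compares the sample mean of \(X(t)\) with the \(\lambda\neq\mu\) formula. It assumes numpy, and all parameter values are arbitrary.

```python
import numpy as np

# Monte Carlo check of (23.18): Gillespie-style simulation of the birth-death
# process with birth rate n*lam + a and death rate n*mu, started at i0 individuals.
rng = np.random.default_rng(4)
lam, mu, a, i0, t_end, n_sim = 0.8, 1.1, 0.5, 3, 2.0, 50_000

total = 0
for _ in range(n_sim):
    t, n = 0.0, i0
    while True:
        rate = n * lam + a + n * mu            # total jump rate in state n
        t += rng.exponential(1.0 / rate)
        if t > t_end:
            break
        if rng.random() < (n * lam + a) / rate:
            n += 1                             # birth (or immigration)
        else:
            n -= 1                             # death
    total += n

m_hat = total / n_sim
m_exact = a / (lam - mu) * (np.exp((lam - mu) * t_end) - 1) + i0 * np.exp((lam - mu) * t_end)
print(m_hat, m_exact)   # should agree up to Monte Carlo error
```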

Exercise 23.6 (Poisson process, arrival time) A cable car starts off with \(n\) riders. The times between successive stops of the car are independent exponential random variables with rate \(\lambda\). At each stop one rider gets off. This takes no time and no additional riders get on. After a rider gets off the car, he or she walks home. Independently of all else, the walk takes an exponential time with rate \(\lambda\).

  1. What is the distribution of the time at which the last rider departs the car?

  2. Suppose the last rider departs the car at time \(t\). What is the probability that the \(i\)th rider is home by that time, \(i=1,\cdots,n\)?

Proof. For part (a), let \(T\) denote the time at which the last rider departs the car. Then \(T\) is the sum of \(n\) independent exponential random variables with rate \(\lambda\). Therefore, \(T\sim Gamma(n,\lambda)\).

For part (b), suppose a rider gets off at time \(s\leq t\); then the probability that this rider has not arrived home by time \(t\) is \[\begin{equation} \int_{t-s}^{\infty}\lambda e^{-\lambda x}dx=e^{-\lambda(t-s)} \tag{23.32} \end{equation}\] Given that the last rider departs the car at time \(t\), the departure times of the other \(n-1\) riders, considered without regard to order, are i.i.d. uniform on \([0,t]\). Hence, the probability that such a rider is not home by time \(t\) is \[\begin{equation} \int_0^{t}e^{-\lambda(t-s)}\cdot\frac{1}{t}ds=\frac{1-e^{-\lambda t}}{\lambda t} \tag{23.33} \end{equation}\] Thus, the probability that the \(i\)th rider is home by that time, for \(i=1,\cdots,n\), is \[\begin{equation} P=\left\{\begin{aligned} & 1-\frac{1-e^{-\lambda t}}{\lambda t} & i=1,\cdots,n-1\\ & 0 & i=n\end{aligned}\right. \tag{23.34} \end{equation}\]
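The Python sketch below (assuming numpy; \(\lambda\), \(n\), the target time \(t_0\), and the conditioning window are arbitrary) checks part (a) through the mean of \(T\), and checks the part (b) formula for a randomly chosen one of the first \(n-1\) riders by conditioning on the last departure time falling near \(t_0\).

```python
import numpy as np

# Exercise 23.6 check: T ~ Gamma(n, lam), and a randomly chosen one of the first
# n-1 riders is home by time t0 with probability 1 - (1 - exp(-lam*t0)) / (lam*t0).
rng = np.random.default_rng(5)
lam, n, n_sim = 1.0, 6, 200_000

stops = np.cumsum(rng.exponential(1.0 / lam, size=(n_sim, n)), axis=1)
T = stops[:, -1]                                  # last departure time
print(T.mean(), n / lam)                          # Gamma(n, lam) has mean n / lam

walks = rng.exponential(1.0 / lam, size=(n_sim, n - 1))
home = stops[:, :-1] + walks <= T[:, None]        # is rider k home by time T?
t0 = 5.0
near = np.abs(T - t0) < 0.1                       # condition on T being close to t0
print(home[near].mean(), 1 - (1 - np.exp(-lam * t0)) / (lam * t0))
```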

Exercise 23.7 (Compound Poisson process and its application) Let \(\{N(t):t\geq 0\}\) be a Poisson process with rate \(\lambda\). Define a sequence of nonnegative, independent and identically distributed random variables \(\{D_i:i\geq 1\}\) independent from \(N(t)\). A stochastic process \(\{Y(t):t\geq 0\}\) is a compound Poisson process if \(Y(t)=\sum_{i=1}^{N(t)}D_i\).

  1. Find \(E[Y(t)]\) and \(Cov(Y(t),Y(s))\).

  2. Customers arrive at an automatic teller machine (ATM) in accordance with a Poisson process with rate 12 per hour. The amount of money withdrawn on each transaction is a random variable with mean $30 and standard deviation $50. (A negative withdrawal means that money was deposited.) Suppose that the machine is in use 15 hours per day. Formulate this scenario as a compound Poisson process to find the expected amount of money withdrawn per day. Can you provide an approximate probability that the total daily withdrawal is greater than $5000?

Proof. For part (a), by Wald's identity, we have \[\begin{equation} E(Y(t))=E(D_1+\cdots+D_{N(t)})=E(N(t))E(D_1)=\lambda tE(D_1) \tag{23.35} \end{equation}\]

As for the covariance, firstly notice that \[\begin{equation} \begin{split} Var(Y(t))&=E(Var(Y(t)|N(t)))+Var(E(Y(t)|N(t)))\\ &=E(N(t)Var(D))+Var(N(t)E(D))\\ &=Var(D)E(N(t))+E(D)^2Var(N(t))\\ &=Var(D)\lambda t+E(D)^2\lambda t\\ &=\lambda tE(D^2) \end{split} \tag{23.36} \end{equation}\]

Assume w.l.o.g. that \(t<s\). Since \(Y(s)-Y(t)\) is independent of \(Y(t)\) by independent increments, we have \[\begin{equation} \begin{split} Cov(Y(t),Y(s))&=Cov(Y(t),Y(s)-Y(t)+Y(t))\\ &=Cov(Y(t),Y(s)-Y(t))+Var(Y(t))=Var(Y(t))=\lambda tE(D^2) \end{split} \tag{23.37} \end{equation}\] Therefore, in general we have \[\begin{equation} Cov(Y(t),Y(s))=\lambda \min\{s,t\}E(D^2) \tag{23.38} \end{equation}\]
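A Monte Carlo check of (23.35) and (23.38), assuming numpy and taking \(D_i\sim Exp(1)\) so that \(E(D)=1\) and \(E(D^2)=2\); the rate, times, and sample size below are arbitrary.

```python
import numpy as np

# Compound Poisson with Exp(1) jumps: compare E Y(s) and Cov(Y(s), Y(t))
# with lam*s*E(D) and lam*min(s,t)*E(D^2).
rng = np.random.default_rng(6)
lam, s, t, n_sim = 2.0, 1.0, 3.0, 200_000

counts_s = rng.poisson(lam * s, size=n_sim)                    # N(s)
counts_t = counts_s + rng.poisson(lam * (t - s), size=n_sim)   # N(t), via an independent increment
jumps = rng.exponential(1.0, size=(n_sim, counts_t.max()))     # the D_i for each path
cums = np.hstack([np.zeros((n_sim, 1)), np.cumsum(jumps, axis=1)])
Ys = cums[np.arange(n_sim), counts_s]                          # Y(s) = D_1 + ... + D_{N(s)}
Yt = cums[np.arange(n_sim), counts_t]                          # Y(t) = D_1 + ... + D_{N(t)}
print(Ys.mean(), lam * s * 1.0)                                # E Y(s) = lam*s*E(D)
print(np.cov(Ys, Yt)[0, 1], lam * min(s, t) * 2.0)             # Cov = lam*min(s,t)*E(D^2)
```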

For part (b), let \(Y(t)\) be the total amount of money withdrawn in the interval \([0,t]\), where the time \(t\) is measured in hours. Assuming that the successive amounts withdrawn are independent and identically distributed random variables, the stochastic process \(\{Y(t):t\geq 0\}\) is a compound Poisson process, and \(Y(15)\) is the daily withdrawal. Its mean and variance can be computed using (23.35) and (23.36): \[\begin{equation} \begin{split} &E(Y(15))=12\times 15\times30=5400\\ &Var(Y(15))=12\times 15\times(30^2+50^2)=612000 \end{split} \tag{23.39} \end{equation}\] Therefore, by the CLT we have \[\begin{equation} P(Y(15)\geq 5000)=P\Big(Z\geq \frac{5000-5400}{\sqrt{612000}}\Big)\approx 0.6954 \end{equation}\] Thus, the approximate probability that the total daily withdrawal is greater than $5000 is 0.6954.
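The approximate probability can be reproduced in Python using the standard normal CDF \(\Phi(x)=\frac{1}{2}(1+\operatorname{erf}(x/\sqrt{2}))\):

```python
from math import erf, sqrt

# Normal approximation for the daily withdrawal: mean 5400, variance 612000.
mean, var = 12 * 15 * 30, 12 * 15 * (30**2 + 50**2)
z = (5000 - mean) / sqrt(var)
print(round(0.5 * (1 - erf(z / sqrt(2))), 4))   # P(Z >= z) ~ 0.6954
```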

Exercise 23.8 (Process that describes the decay of particles) At each (discrete) time point \(n=0,1,2,\cdots\), a number \(Y_n\) of particles enters a chamber, where the \(Y_n\), \(n\geq 0\), are independent and identically Poisson distributed with parameter \(\lambda\), that is, for all \(n\geq 0\), \(Pr(Y_n=y)=\lambda^y\exp(-\lambda)/y!\), for \(y\in\{0,1,2,\cdots\}\). Each particle in the chamber may decay between two successive time points with probability \(q\) (where \(0<q<1\)), which is constant over time and the same for all particles. Moreover, particles decay independently of each other, and \(Y_n\) is independent of the number of particles that decay between time points \(n-1\) and \(n\). Let \(X_n\) be the number of particles in the chamber at time \(n\), where \(X_n\) includes particles present just after the entry of the \(Y_n\) particles.

  1. Show that \(X=\{X_n:n\geq 0\}\) is a time homogeneous Markov chain, and prove that its transition probabilities are given by \[\begin{equation} P(X_{n+1}=j|X_n=i)=\exp(-\lambda)\sum_{r=0}^{\min\{i,j\}}\frac{i!}{(i-r)!r!(j-r)!}(1-q)^rq^{i-r}\lambda^{j-r} \tag{23.40} \end{equation}\]

  2. Show that the limiting probabilities for the number of particles in the chamber are given by \[\begin{equation} \lim_{n\to\infty}P(X_n=j)=\exp(-\lambda q^{-1})\frac{(\lambda q^{-1})^j}{j!},\quad j\in\{0,1,2,\cdots\} \tag{23.41} \end{equation}\]

Proof. For part (a), note that \(X_{n+1}\) equals the number of the \(X_n\) particles that survive to time \(n+1\) plus the new arrivals \(Y_{n+1}\); since the survival indicators and \(Y_{n+1}\) are independent of the past given \(X_n\), the process \(X\) is a Markov chain. Let \(Z_n\) denote the number of the \(X_n\) particles that survive to time \(n+1\), so that \(Z_n|X_n=i\sim Bin(i,1-q)\). Then we have \[\begin{equation} \begin{split} P(X_{n+1}=j|X_n=i)&=\sum_{r=0}^{\min\{i,j\}}P(X_{n+1}=j|Z_n=r,X_n=i)P(Z_n=r|X_n=i)\\ &=\sum_{r=0}^{\min\{i,j\}}P(Y_{n+1}=j-r)P(Z_n=r|X_n=i)\\ &=\sum_{r=0}^{\min\{i,j\}}\lambda^{j-r}\exp(-\lambda)\frac{1}{(j-r)!}\frac{i!}{r!(i-r)!}(1-q)^rq^{i-r}\\ &=\exp(-\lambda)\sum_{r=0}^{\min\{i,j\}}\frac{i!}{(i-r)!r!(j-r)!}(1-q)^rq^{i-r}\lambda^{j-r} \end{split} \tag{23.42} \end{equation}\] as desired. Since \(P(X_{n+1}=j|X_n=i)\) does not depend on \(n\), the Markov chain is time homogeneous.

For part (b), since \(X\) is an irreducible Markov chain, we only need to check that (23.40) and (23.41) satisfy the detailed balance condition (Proposition 10.2). In fact, for any \(i,j\), \[\begin{equation} \begin{split} \pi_ip_{ij}&=\exp(-\lambda q^{-1})\frac{(\lambda q^{-1})^i}{i!}\exp(-\lambda)\sum_{r=0}^{\min\{i,j\}}\frac{i!}{(i-r)!r!(j-r)!}(1-q)^rq^{i-r}\lambda^{j-r}\\ &=\exp(-\lambda (q^{-1}+1))\sum_{r=0}^{\min\{i,j\}}\frac{(1-q)^rq^{-r}\lambda^{i+j-r}}{(i-r)!r!(j-r)!} \end{split} \tag{23.43} \end{equation}\]
and similarly, \[\begin{equation} \begin{split} \pi_jp_{ji}&=\exp(-\lambda q^{-1})\frac{(\lambda q^{-1})^j}{j!}\exp(-\lambda)\sum_{r=0}^{\min\{i,j\}}\frac{j!}{(i-r)!r!(j-r)!}(1-q)^rq^{j-r}\lambda^{i-r}\\ &=\exp(-\lambda (q^{-1}+1))\sum_{r=0}^{\min\{i,j\}}\frac{(1-q)^rq^{-r}\lambda^{i+j-r}}{(i-r)!r!(j-r)!} \end{split} \tag{23.44} \end{equation}\] Thus \(\pi_ip_{ij}=\pi_jp_{ji}\), so (23.41) gives a stationary distribution of the chain. Since the chain is irreducible, aperiodic (for instance, \(p_{00}=e^{-\lambda}>0\)), and admits a stationary distribution, every state is non-null persistent and the stationary distribution is also the limiting distribution.
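As a numerical illustration of (23.41) (assuming numpy; \(\lambda\), \(q\), and the run length are arbitrary), the chain \(X_{n+1}=Bin(X_n,1-q)+Y_{n+1}\) can be simulated and its long-run empirical distribution compared with \(Pois(\lambda/q)\):

```python
import numpy as np
from math import exp, factorial

# Simulate X_{n+1} = Binomial(X_n, 1-q) + Y_{n+1}, Y ~ Pois(lam), and compare
# the empirical long-run distribution with the claimed Pois(lam / q) limit.
rng = np.random.default_rng(7)
lam, q, n_steps, burn_in = 2.0, 0.4, 200_000, 1_000

x, states = 0, []
for n in range(n_steps):
    x = rng.binomial(x, 1 - q) + rng.poisson(lam)
    if n >= burn_in:                   # discard a short burn-in period
        states.append(x)
states = np.array(states)

print(states.mean(), lam / q)          # limiting mean should be lam / q = 5
for j in range(4):
    exact = exp(-lam / q) * (lam / q) ** j / factorial(j)
    print(j, round((states == j).mean(), 4), round(exact, 4))
```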