
Consider a Markov chain with the \(N+1\) states \(0,1, \ldots, N\) and transition probabilities $$ \begin{aligned} &P_{i j}=\binom{N}{j} \pi_{i}^{j}\left(1-\pi_{i}\right)^{N-j}, \quad 0 \leq i, \ j \leq N, \\ &\pi_{i}=\frac{1-e^{-2 a i / N}}{1-e^{-2 a}}, \quad a>0. \end{aligned} $$ Note that 0 and \(N\) are absorbing states. Verify that \(\exp \left(-2 a X_{t}\right)\) is a martingale [or, equivalently, prove the identity \(E\left(\exp \left(-2 a X_{t+1}\right) \mid X_{t}\right)=\exp \left(-2 a X_{t}\right)\)], where \(X_{t}\) is the state at time \(t\) \((t=0,1,2, \ldots)\). Using this property, show that the probability \(P_{N}(k)\) of absorption into state \(N\) starting at state \(k\) is given by $$ P_{N}(k)=\frac{1-e^{-2 a k}}{1-e^{-2 a N}} $$

Short Answer

In this exercise, we verified that the given expression involving the state $X_t$ is a martingale by proving the identity \[E\left(e^{-2aX_{t+1}}|X_t = i \right) = e^{-2ai}\]. We then used the martingale property to find the probability $P_N(k)$ of absorption into state $N$ starting at state $k$, which is given by \[P_N(k) = \frac{1-e^{-2ak}}{1-e^{-2aN}}.\]

Step by step solution

01

Prove the identity (martingale property)

We are given the transition probabilities \(P_{ij}\) and must show that \[E\left(e^{-2aX_{t+1}}\,\middle|\,X_t = i \right) = e^{-2ai}.\] Start by computing the conditional expectation on the left-hand side using the transition probabilities: \[E\left(e^{-2aX_{t+1}}\,\middle|\,X_t = i \right) = \sum_{j=0}^{N} e^{-2aj} P_{ij}.\] Substituting \( P_{i j}=\binom{N}{j} \pi_{i}^{j}\left(1-\pi_{i}\right)^{N-j} \) and applying the binomial theorem gives \[\sum_{j=0}^{N} \binom{N}{j}\left(\pi_i e^{-2a}\right)^{j}\left(1-\pi_i\right)^{N-j} = \left(\pi_i e^{-2a} + 1 - \pi_i\right)^N = \left(1 - \pi_i\left(1-e^{-2a}\right)\right)^N.\] Finally, substituting \(\pi_{i}=\frac{1-e^{-2 a i / N}}{1-e^{-2 a}}\) yields \[\left(1 - \left(1-e^{-2ai/N}\right)\right)^N = \left(e^{-2ai/N}\right)^N = e^{-2ai},\] which is the claimed martingale property.
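The identity above can be checked numerically. The sketch below (with arbitrary test values for \(N\) and \(a\); the function name `martingale_check` is ours, not from the text) evaluates the sum \(\sum_j e^{-2aj} P_{ij}\) directly and compares it with \(e^{-2ai}\) for every state \(i\):

```python
from math import comb, exp

def martingale_check(N=10, a=0.7):
    """Verify E[exp(-2a X_{t+1}) | X_t = i] == exp(-2a i) for all states i.

    Uses the binomial transition law P_ij = C(N,j) pi_i^j (1-pi_i)^(N-j)
    with pi_i = (1 - e^{-2ai/N}) / (1 - e^{-2a}).
    """
    for i in range(N + 1):
        pi = (1 - exp(-2 * a * i / N)) / (1 - exp(-2 * a))
        # Left-hand side: sum over all possible next states j
        lhs = sum(exp(-2 * a * j) * comb(N, j) * pi**j * (1 - pi)**(N - j)
                  for j in range(N + 1))
        assert abs(lhs - exp(-2 * a * i)) < 1e-12, (i, lhs)
    return True
```

Note that the absorbing boundaries come out automatically: \(\pi_0 = 0\) and \(\pi_N = 1\), so states 0 and \(N\) transition to themselves with probability 1.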
02

Finding the probability of absorption into state N (P_N(k))

To find the probability of absorption into state \(N\) starting from state \(k\), we use the martingale property proved in Step 1. Let \(T\) be the (almost surely finite) random time at which the chain is absorbed into either state 0 or state \(N\). Applying the optional stopping theorem to the bounded martingale \(e^{-2aX_t}\), we get: \[E\left(e^{-2aX_{T}}\,\middle|\,X_0 = k \right) = e^{-2ak}.\] At time \(T\) the chain is in state \(N\) with probability \(P_N(k)\) and in state 0 with probability \(1-P_N(k)\). Hence, by the law of total probability (using \(e^{-2a \cdot 0} = 1\)): \[E\left(e^{-2aX_{T}}\,\middle|\,X_0 = k \right) = P_N(k) \cdot e^{-2aN} + (1-P_N(k))\cdot 1.\] Equating the two expressions, \[e^{-2ak} = P_N(k) \cdot e^{-2aN} + 1 - P_N(k).\] Solving for \(P_N(k)\): \[P_N(k)\left(1 - e^{-2aN}\right) = 1 - e^{-2ak} \implies P_N(k) = \frac{1 - e^{-2ak}}{1 - e^{-2aN}}.\] Thus, the probability of absorption into state \(N\) starting from state \(k\) is \[P_N(k) = \frac{1-e^{-2ak}}{1-e^{-2aN}}.\]
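The closed form can also be checked against the absorption probabilities computed directly from the chain. A minimal sketch (the helper names `absorption_probs` and `closed_form` are ours): the absorption probability \(h(i)\) into state \(N\) is harmonic, \(h(i) = \sum_j P_{ij}h(j)\), with boundary values \(h(0)=0\), \(h(N)=1\), so iterating this fixed-point equation converges to \(P_N(i)\):

```python
from math import comb, exp

def absorption_probs(N=8, a=0.5, iters=5000):
    """Absorption probabilities into state N from each state 0..N,
    computed by iterating h(i) <- sum_j P_ij h(j) with h(0)=0, h(N)=1."""
    P = []
    for i in range(N + 1):
        pi = (1 - exp(-2 * a * i / N)) / (1 - exp(-2 * a))
        P.append([comb(N, j) * pi**j * (1 - pi)**(N - j)
                  for j in range(N + 1)])
    h = [i / N for i in range(N + 1)]  # any start with h(0)=0, h(N)=1
    for _ in range(iters):
        h = ([0.0]
             + [sum(P[i][j] * h[j] for j in range(N + 1))
                for i in range(1, N)]
             + [1.0])
    return h

def closed_form(k, N=8, a=0.5):
    """The derived formula P_N(k) = (1 - e^{-2ak}) / (1 - e^{-2aN})."""
    return (1 - exp(-2 * a * k)) / (1 - exp(-2 * a * N))
```

The two should agree for every starting state \(k\), confirming the martingale argument.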


