
A transition probability matrix \(\mathbf{P}\) is said to be doubly stochastic if the sum over each column equals one; that is, $$ \sum_{i} P_{i j}=1, \quad \text { for all } j $$ If such a chain is irreducible and aperiodic and consists of \(M+1\) states \(0,1, \ldots, M\), show that the limiting probabilities are given by $$ \pi_{j}=\frac{1}{M+1}, \quad j=0,1, \ldots, M $$

Short Answer

The limiting probabilities for the given irreducible, aperiodic, and doubly stochastic Markov chain with \(M+1\) states are \(\pi_j = \frac{1}{M+1}\) for \(j = 0,1, \ldots, M\).

Step by step solution

01

Understanding the properties of the given Markov chain

It is given that the transition probability matrix \(\mathbf{P}\) is doubly stochastic: each row sums to 1 (as for any transition matrix) and, in addition, each column sums to 1. The chain is also irreducible and aperiodic: every state can be reached from every other state in a finite number of steps, and the return times to each state have no common period greater than 1. Together these properties guarantee that the limiting probabilities \(\pi_j\) exist and form the unique stationary distribution.
02

Writing the balance equations

In order to find the limiting probabilities \(\pi_j\), we need the balance equations, $$ \pi_j = \sum_{i=0}^M \pi_i P_{ij}, \quad j = 0,1, \ldots, M $$ together with the normalization condition \(\sum_{j=0}^M \pi_j = 1\).
03

Proving that all limiting probabilities are equal

Since the matrix is doubly stochastic, we know that: $$ \sum_{i=0}^M P_{ij} = 1, \quad j = 0,1, \ldots, M $$ Try a solution in which all limiting probabilities are equal, \(\pi_i = \pi\) for every \(i\), and substitute it into the balance equations: $$ \sum_{i=0}^M \pi_i P_{ij} = \pi \sum_{i=0}^M P_{ij} = \pi = \pi_j, \quad j = 0,1, \ldots, M $$ So the constant vector satisfies every balance equation; that is, the uniform distribution is stationary. Because the chain is irreducible and aperiodic, the stationary distribution is unique and coincides with the limiting probabilities, so we may write: $$ \pi_j = \pi, \quad j = 0,1, \ldots, M $$
04

Finding the limiting probabilities

Since we know all limiting probabilities are equal, we can sum up all the probabilities and set them equal to 1. This will give us the value of \(\pi\): $$ (M+1) \pi = 1 $$ Solve for \(\pi\): $$ \pi = \frac{1}{M+1} $$ Therefore, the limiting probabilities are given by: $$ \pi_j = \frac{1}{M+1}, \quad j = 0,1, \ldots, M $$
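As a quick numerical sanity check (not part of the proof), one can run power iteration on a small doubly stochastic chain. The matrix below is a hypothetical example with \(M+1 = 4\) states, chosen so that all entries are positive (hence the chain is irreducible and aperiodic):

```python
import numpy as np

# A hypothetical 4-state doubly stochastic transition matrix (M + 1 = 4):
# every row and every column sums to 1, and all entries are positive,
# so the chain is irreducible and aperiodic.
P = np.array([
    [0.1, 0.2, 0.3, 0.4],
    [0.4, 0.3, 0.2, 0.1],
    [0.2, 0.1, 0.4, 0.3],
    [0.3, 0.4, 0.1, 0.2],
])
assert np.allclose(P.sum(axis=0), 1) and np.allclose(P.sum(axis=1), 1)

# Power iteration: any starting distribution converges to the limiting
# distribution because the chain is irreducible and aperiodic.
pi = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(200):
    pi = pi @ P

print(pi)  # each component is approximately 1/4 = 1/(M+1)
```

Any other doubly stochastic matrix with positive entries gives the same uniform limit, which is exactly the claim being proved.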

Unlock Step-by-Step Solutions & Ace Your Exams!

  • Full Textbook Solutions

    Get detailed explanations and key concepts

  • Unlimited Al creation

    Al flashcards, explanations, exams and more...

  • Ads-free access

    To over 500 millions flashcards

  • Money-back guarantee

    We refund you if you fail your exam.

Over 30 million students worldwide already upgrade their learning with Vaia!

One App. One Place for Learning.

All the tools & learning materials you need for study success - in one app.

Get started for free

Most popular questions from this chapter

Each day, one of \(n\) possible elements is requested, the \(i\)th one with probability \(P_{i}, i \geqslant 1, \sum_{1}^{n} P_{i}=1\). These elements are at all times arranged in an ordered list that is revised as follows: The element selected is moved to the front of the list with the relative positions of all the other elements remaining unchanged. Define the state at any time to be the list ordering at that time and note that there are \(n!\) possible states. (a) Argue that the preceding is a Markov chain. (b) For any state \(i_{1}, \ldots, i_{n}\) (which is a permutation of \(1,2, \ldots, n\)), let \(\pi\left(i_{1}, \ldots, i_{n}\right)\) denote the limiting probability. In order for the state to be \(i_{1}, \ldots, i_{n}\), it is necessary for the last request to be for \(i_{1}\), the last non-\(i_{1}\) request for \(i_{2}\), the last non-\(i_{1}\) or \(i_{2}\) request for \(i_{3}\), and so on. Hence, it appears intuitive that $$ \pi\left(i_{1}, \ldots, i_{n}\right)=P_{i_{1}} \frac{P_{i_{2}}}{1-P_{i_{1}}} \frac{P_{i_{3}}}{1-P_{i_{1}}-P_{i_{2}}} \cdots \frac{P_{i_{n-1}}}{1-P_{i_{1}}-\cdots-P_{i_{n-2}}} $$ Verify when \(n=3\) that the preceding are indeed the limiting probabilities.
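For a concrete check of part (b), one can build the \(3! = 6\)-state move-to-front chain explicitly and compare its limiting distribution against the product formula. The request probabilities below are a hypothetical choice:

```python
import itertools
import numpy as np

# Hypothetical request probabilities for n = 3 elements.
p = {1: 0.5, 2: 0.3, 3: 0.2}
states = list(itertools.permutations([1, 2, 3]))  # the 3! = 6 list orderings
index = {s: k for k, s in enumerate(states)}

# Requesting element e moves it to the front, keeping the relative
# order of the remaining elements.
P = np.zeros((6, 6))
for s in states:
    for e, prob in p.items():
        t = (e,) + tuple(x for x in s if x != e)
        P[index[s], index[t]] += prob

# Limiting probabilities via power iteration.
pi = np.full(6, 1 / 6)
for _ in range(500):
    pi = pi @ P

# Compare against the conjectured product formula (for n = 3 it reads
# pi(i1, i2, i3) = P_{i1} * P_{i2} / (1 - P_{i1})).
for s in states:
    guess = p[s[0]] * p[s[1]] / (1 - p[s[0]])
    print(s, round(pi[index[s]], 6), round(guess, 6))
```

The two columns of output agree, which is the \(n=3\) verification the exercise asks for (done numerically rather than algebraically).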

A Markov chain is said to be a tree process if (i) \(\quad P_{i j}>0\) whenever \(P_{j i}>0\), (ii) for every pair of states \(i\) and \(j, i \neq j\), there is a unique sequence of distinct states \(i=i_{0}, i_{1}, \ldots, i_{n-1}, i_{n}=j\) such that $$ P_{i_{k}, i_{k+1}}>0, \quad k=0,1, \ldots, n-1 $$ In other words, a Markov chain is a tree process if for every pair of distinct states \(i\) and \(j\) there is a unique way for the process to go from \(i\) to \(j\) without reentering a state (and this path is the reverse of the unique path from \(j\) to \(i\) ). Argue that an ergodic tree process is time reversible.

Consider the Ehrenfest urn model in which \(M\) molecules are distributed between two urns, and at each time point one of the molecules is chosen at random and is then removed from its urn and placed in the other one. Let \(X_{n}\) denote the number of molecules in urn 1 after the \(n\) th switch and let \(\mu_{n}=E\left[X_{n}\right]\). Show that (a) \(\mu_{n+1}=1+(1-2 / M) \mu_{n}\). (b) Use (a) to prove that $$ \mu_{n}=\frac{M}{2}+\left(\frac{M-2}{M}\right)^{n}\left(E\left[X_{0}\right]-\frac{M}{2}\right) $$
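Both the recursion in (a) and the closed form in (b) can be checked against the exact distribution of \(X_n\) for a small urn. The choices \(M = 10\) and \(X_0 = 0\) below are hypothetical:

```python
import numpy as np

M = 10     # number of molecules (hypothetical choice)
mu0 = 0.0  # E[X_0]: start with urn 1 empty

# Exact distribution of X_n over the states 0..M.
dist = np.zeros(M + 1)
dist[0] = 1.0

mu = mu0
for n in range(1, 21):
    new = np.zeros(M + 1)
    for k in range(M + 1):
        # From state k, the chosen molecule is in urn 1 w.p. k/M
        # (urn 1 loses one), otherwise urn 1 gains one.
        if k > 0:
            new[k - 1] += dist[k] * k / M
        if k < M:
            new[k + 1] += dist[k] * (M - k) / M
    dist = new
    # Recursion (a): mu_{n+1} = 1 + (1 - 2/M) mu_n
    mu = 1 + (1 - 2 / M) * mu
    # Closed form (b)
    closed = M / 2 + ((M - 2) / M) ** n * (mu0 - M / 2)
    assert abs(dist @ np.arange(M + 1) - mu) < 1e-12
    assert abs(mu - closed) < 1e-12

print("recursion, closed form, and exact mean all agree")
```

The assertions confirm that the recursion, its closed-form solution, and the mean computed from the exact distribution coincide at every step.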

A certain town never has two sunny days in a row. Each day is classified as being either sunny, cloudy (but dry), or rainy. If it is sunny one day, then it is equally likely to be either cloudy or rainy the next day. If it is rainy or cloudy one day, then there is one chance in two that it will be the same the next day, and if it changes then it is equally likely to be either of the other two possibilities. In the long run, what proportion of days are sunny? What proportion are cloudy?
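One way to check an answer to this question is to encode the three-state weather chain directly from the description and power-iterate. The state ordering below is a chosen convention:

```python
import numpy as np

# States: 0 = sunny, 1 = cloudy, 2 = rainy.
P = np.array([
    [0.0,  0.5,  0.5 ],   # sunny: never sunny twice in a row
    [0.25, 0.5,  0.25],   # cloudy: same w.p. 1/2, else equally likely
    [0.25, 0.25, 0.5 ],   # rainy: same w.p. 1/2, else equally likely
])

pi = np.full(3, 1 / 3)
for _ in range(200):
    pi = pi @ P

print(pi)  # long-run proportions of sunny, cloudy, rainy days
```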

A professor continually gives exams to her students. She can give three possible types of exams, and her class is graded as either having done well or badly. Let \(p_{i}\) denote the probability that the class does well on a type \(i\) exam, and suppose that \(p_{1}=0.3\), \(p_{2}=0.6\), and \(p_{3}=0.9\). If the class does well on an exam, then the next exam is equally likely to be any of the three types. If the class does badly, then the next exam is always type 1. What proportion of exams are type \(i\), \(i=1,2,3\)?
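As with the weather chain, the exam chain can be checked numerically: the state is the type of the current exam, and each row of the transition matrix mixes the "did well" and "did badly" cases.

```python
import numpy as np

p = np.array([0.3, 0.6, 0.9])  # P(class does well on a type-i exam)

# If the class does well (prob p_i), the next exam is equally likely to
# be any of the three types; if badly (prob 1 - p_i), it is type 1.
P = np.zeros((3, 3))
for i in range(3):
    P[i] = p[i] / 3            # well: uniform over the three types
    P[i, 0] += 1 - p[i]        # badly: always type 1

pi = np.full(3, 1 / 3)
for _ in range(500):
    pi = pi @ P

print(pi)  # long-run proportion of each exam type
```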
