
Suppose \(P=\left\|P_{ij}\right\|\) is the transition probability matrix of an irreducible recurrent Markov chain \(\left\{X_{n}\right\}\). Use the supermartingale convergence theorem (see Remark 5.1) to show that every nonnegative solution \(y=\{y(i)\}\) to the system of inequalities \(y(i) \geq \sum_{j=0}^{\infty} P_{ij}\, y(j)\), for all \(i\), is constant.

Short Answer

We defined the sequence of random variables \(Y_n = y(X_n)\), where \(y\) is a nonnegative solution of the inequalities and \(\{X_n\}\) is the Markov chain. The inequalities and the Markov property give \(E\left[Y_{n+1} \mid X_0,\ldots,X_n\right] \leq Y_n\), so \(\{Y_n\}\) is a nonnegative supermartingale. By the supermartingale convergence theorem, \(Y_n\) converges almost surely to a finite limit \(Y_\infty\). Since the chain is irreducible and recurrent, it visits every state \(j\) infinitely often with probability one, so the convergent sequence \(Y_n = y(X_n)\) equals \(y(j)\) infinitely often and hence \(y(j) = Y_\infty\) for every \(j\). Therefore every nonnegative solution \(y\) of the given system of inequalities is constant.

Step by step solution

01

Formulate the supermartingale

Let \(Y = \{Y_n\}\) be the sequence of random variables defined by \(Y_n = y(X_n)\), where \(y\) is a nonnegative solution of the system of inequalities and \(\{X_n\}\) is the Markov chain. By the Markov property, for any state \(i\), $$ E\left[Y_{n+1} \mid X_0,\ldots,X_{n-1}, X_n = i\right] = E\left[y(X_{n+1}) \mid X_n = i\right] = \sum_{j} P_{ij}\, y(j) \leq y(i). $$ Equivalently, \(E\left[Y_{n+1} \mid X_0,\ldots,X_n\right] \leq y(X_n) = Y_n\). Since \(y \geq 0\), each \(Y_n\) is nonnegative, so \(\{Y_n\}\) is a nonnegative supermartingale.
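The inequality above simply says that \(y\) is superharmonic for \(P\). As a quick sanity check (not part of the textbook solution; the 3-state transition matrix and the trial vectors below are made-up illustrations), one can verify the system of inequalities entrywise for a concrete chain:

```python
import numpy as np

# Hypothetical 3-state irreducible chain (illustration only, not from the text).
P = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.4, 0.3],
              [0.6, 0.2, 0.2]])

def is_superharmonic(y, P):
    """Check y(i) >= sum_j P[i, j] * y(j) for every state i."""
    return bool(np.all(P @ y <= y + 1e-12))

y_const = np.array([2.0, 2.0, 2.0])      # constant function
y_nonconst = np.array([1.0, 2.0, 3.0])   # non-constant trial function

print(is_superharmonic(y_const, P))      # True: rows of P sum to one
print(is_superharmonic(y_nonconst, P))   # False: the inequality fails at state 0
```

Any constant vector satisfies the system with equality because each row of \(P\) sums to one; the point of the exercise is that, for an irreducible recurrent chain, constants are the only nonnegative solutions.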
02

Apply the supermartingale convergence theorem

By the supermartingale convergence theorem (Remark 5.1), a nonnegative supermartingale converges almost surely to a finite limit. Starting the chain at a fixed state \(i\) gives \(E[Y_0] = y(i) < \infty\), so there is a finite random variable \(Y_\infty\) with $$ P\left(\lim_{n \to \infty} Y_n = Y_{\infty}\right) = 1. $$ Note that irreducibility and recurrence are not needed for this step; they enter in Step 3. What the theorem provides is that along almost every sample path the sequence \(Y_n = y(X_n)\) settles down to a single limiting value rather than oscillating forever.
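To see what the convergence theorem asserts, here is a small simulation of a generic nonnegative supermartingale (an illustrative example chosen here, unrelated to the chain in the exercise): products of i.i.d. nonnegative factors with mean one. Each sample path settles to a limit, in this case 0 almost surely.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative nonnegative (super)martingale: Y_n = U_1 * ... * U_n with
# U_k i.i.d. uniform on [0, 2], so E[U_k] = 1 and E[Y_{n+1} | Y_n] = Y_n.
n_paths, n_steps = 5, 2000
U = rng.uniform(0.0, 2.0, size=(n_paths, n_steps))
Y = np.cumprod(U, axis=1)

# Each path converges almost surely; for this example the limit is 0
# because E[log U_k] = log 2 - 1 < 0.
print(Y[:, -1])
```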
03

Conclude that y is constant

Because the chain is irreducible and recurrent, with probability one it visits every state infinitely often. Fix any two states \(i\) and \(j\). Along the infinite subsequence of times at which \(X_n = i\) we have \(Y_n = y(i)\), and along the infinite subsequence of times at which \(X_n = j\) we have \(Y_n = y(j)\). Since \(Y_n \to Y_\infty\) almost surely, both subsequences converge to the same limit, so \(y(i) = Y_\infty = y(j)\). As \(i\) and \(j\) were arbitrary, \(y\) is constant. Thus every nonnegative solution \(y\) to the given system of inequalities must be constant.
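The fact doing the work in this step is recurrence: every state is revisited infinitely often along almost every path. A short simulation (again with a made-up 3-state chain, for illustration only) shows the visit counts of all states growing without bound, which is why the convergent sequence \(y(X_n)\) must take every value \(y(j)\) infinitely often:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up 3-state irreducible (hence recurrent) chain, illustration only.
P = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.4, 0.3],
              [0.6, 0.2, 0.2]])

def simulate(P, n_steps, start=0):
    """Return one sample path X_0, ..., X_{n_steps-1} of the chain."""
    path = np.empty(n_steps, dtype=int)
    state = start
    for t in range(n_steps):
        path[t] = state
        state = rng.choice(len(P), p=P[state])
    return path

path = simulate(P, 20_000)
print(np.bincount(path, minlength=len(P)))  # every state is visited many times
```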


Most popular questions from this chapter

The Haar functions on \([0,1)\) are defined by $$ \begin{gathered} H_{1}(t)=1, \\ H_{2}(t)= \begin{cases} 1, & 0 \leq t<\frac{1}{2}, \\ -1, & \frac{1}{2} \leq t<1, \end{cases} \\ H_{2^{n}+1}(t)= \begin{cases} 2^{n / 2}, & 0 \leq t<2^{-(n+1)}, \\ -2^{n / 2}, & 2^{-(n+1)} \leq t<2^{-n}, \\ 0, & \text{otherwise}, \end{cases} \qquad n=1,2,\ldots, \\ H_{2^{n}+j}(t)=H_{2^{n}+1}\left(t-\frac{j-1}{2^{n}}\right), \qquad j=1, \ldots, 2^{n}. \end{gathered} $$ It helps to plot the first five. Let \(f(z)\) be an arbitrary function on \([0,1]\) satisfying $$ \int_{0}^{1}|f(z)| \, dz<\infty. $$ Define \(a_{k}=\int_{0}^{1} f(t) H_{k}(t) \, dt\) and let \(Z\) be uniformly distributed on \([0,1]\). Show that $$ f(Z)=\lim_{n \rightarrow \infty} \sum_{k=1}^{n} a_{k} H_{k}(Z) \quad \text{with probability one,} $$ and $$ \lim_{n \rightarrow \infty} \int_{0}^{1}\left|f(t)-\sum_{k=1}^{n} a_{k} H_{k}(t)\right| dt=0. $$

Let \(\{X_{n}\}\) be a submartingale. Show that $$ \lambda \operatorname{Pr}\left\{\min_{0 \leq k \leq n} X_{k}<-\lambda\right\} \leq E\left[X_{n}^{+}\right]-E\left[X_{0}\right], \quad \lambda>0. $$

Let \(X, X_{1}, X_{2}, \ldots\) be independent identically distributed random variables having negative mean \(\mu\) and finite variance \(\sigma^{2}\). With \(S_{0}=0\) and \(S_{n}=X_{1}+\cdots+X_{n}\), set \(M=\max_{n \geq 0} S_{n}\). In view of \(\mu<0\), we know that \(M<\infty\). Assume \(E[M]<\infty\). (In fact, it can be shown that this is a consequence of \(\sigma^{2}<\infty\).) Define \(r(x)=x^{+}=\max\{x, 0\}\) and \(f(x)=E\left[(x+M-E[M])^{+}\right]\). (a) Show \(f(x) \geq r(x)\) for all \(x\). (b) Show \(f(x) \geq E[f(x+X)]\) for all \(x\), so that \(\left\{f\left(x+S_{n}\right)\right\}\) is a nonnegative supermartingale. [Hint: Verify and use the fact that \(M\) and \((X+M)^{+}\) have the same distribution.] (c) Use (a) and (b) to show \(f(x) \geq E\left[\left(x+S_{T}\right)^{+}\right]\) for all Markov times \(T\). \(\left[\left(x+S_{\infty}\right)^{+}=\lim_{n \rightarrow \infty}\left(x+S_{n}\right)^{+}=0.\right]\)

Let \(\{X_{n}\}\) be a martingale for which \(E\left[X_{n}\right]=0\) and \(E\left[X_{n}^{2}\right]<\infty\) for all \(n\). Show that $$ \operatorname{Pr}\left\{\max_{0 \leq k \leq n} X_{k}>\lambda\right\} \leq \frac{E\left[X_{n}^{2}\right]}{E\left[X_{n}^{2}\right]+\lambda^{2}}, \quad \lambda>0. $$

Suppose \(X_{1}, X_{2}, \ldots\) are independent random variables having finite moment generating functions \(\varphi_{k}(t)=E\left[\exp\{t X_{k}\}\right]\). Show that if \(\Phi_{n}(t_{0})=\prod_{k=1}^{n} \varphi_{k}(t_{0}) \rightarrow \Phi(t_{0})\) as \(n \rightarrow \infty\) for some \(t_{0} \neq 0\) with \(0<\Phi(t_{0})<\infty\), then \(S_{n}=X_{1}+\cdots+X_{n}\) converges with probability one.
