
Let \(\{X_{n}\}\) be a success runs Markov chain having transition probabilities \(P_{i, i+1}=p_{i}=1-P_{i, 0}\), for \(i=0,1, \ldots\). Suppose \(0<\beta<1\) and \(1 \geq(a+1) \beta p_{a} / a\). Define $$ f(i)=\left\{\begin{array}{ll} a \beta^{a-i} p_{i} p_{i+1} \cdots p_{a-1}, & \text { for } i<a, \\ i, & \text { for } i \geq a. \end{array}\right. $$ (a) Prove that \(f(i) \geq i\) for all \(i\). (b) Prove that \(f(i) \geq \beta E\left[f\left(X_{n}\right) \mid X_{n-1}=i\right]\).
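The dynamics of a success runs chain are easy to illustrate with a short simulation. The sketch below is only illustrative, not part of the problem: the transition function `p` passed in is a hypothetical placeholder for the problem's \(p_i\).

```python
import random

def step(i, p):
    """One step of a success runs chain: from state i move to i+1
    with probability p(i), otherwise reset to state 0."""
    return i + 1 if random.random() < p(i) else 0

def run(n_steps, p, seed=0):
    """Simulate n_steps transitions starting from state 0."""
    random.seed(seed)
    path = [0]
    for _ in range(n_steps):
        path.append(step(path[-1], p))
    return path

# Degenerate sanity checks: with p ≡ 1 the chain climbs 0,1,2,...;
# with p ≡ 0 it stays at 0 forever.
print(run(5, lambda i: 1.0))  # [0, 1, 2, 3, 4, 5]
print(run(5, lambda i: 0.0))  # [0, 0, 0, 0, 0, 0]
```

Every sample path either climbs by one or falls back to 0 at each step, which is the structure the proof below exploits when it splits the conditional expectation over the two possible transitions.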

Short Answer

Part (b) - Prove \(f(i) \geq \beta E\left[f\left(X_{n}\right) \mid X_{n-1}=i\right]\)

To show that \(f(i) \geq \beta E\left[f\left(X_{n}\right) \mid X_{n-1}=i\right]\), note that \(\beta E\left[f\left(X_{n}\right) \mid X_{n-1}=i\right]=\beta\left[p_{i} f(i+1)+\left(1-p_{i}\right) f(0)\right]\), since from state \(i\) the chain either moves to state \(i+1\) with probability \(p_{i}\) or resets to state \(0\) with probability \(1-p_{i}\). Now consider two cases.

Case 1: \(i<a\). Here \(f(i+1)=a \beta^{a-i-1} p_{i+1} \cdots p_{a-1}\) (with \(f(a)=a\)), and \(f(0)=a \beta^{a} p_{0} \cdots p_{a-1} \leq a \beta^{a-i} p_{i} \cdots p_{a-1}=f(i)\), because \(0<\beta<1\) gives \(\beta^{a} \leq \beta^{a-i}\) and the omitted factors \(p_{0}, \ldots, p_{i-1}\) are at most \(1\). Substituting these bounds, together with the hypothesis \(1 \geq(a+1) \beta p_{a} / a\), gives \(\beta E\left[f\left(X_{n}\right) \mid X_{n-1}=i\right] \leq f(i)\).

Case 2: \(i \geq a\). Then \(f(i+1)=i+1\) and \(f(0) \leq a \leq i\), so \(\beta E\left[f\left(X_{n}\right) \mid X_{n-1}=i\right] \leq \beta\left[p_{i}(i+1)+\left(1-p_{i}\right) i\right]=\beta\left(i+p_{i}\right)\), and the hypothesis \(1 \geq(a+1) \beta p_{a} / a\) is used to conclude \(\beta\left(i+p_{i}\right) \leq i=f(i)\).

In both cases we have shown that \(f(i) \geq \beta E\left[f\left(X_{n}\right) \mid X_{n-1}=i\right]\), proving the desired result.
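For reference, the one-step decomposition driving both cases can be written out with the reset term made explicit, substituting \(f(0)=a \beta^{a} p_{0} \cdots p_{a-1}\) from the definition of \(f\):

```latex
\begin{aligned}
\beta E\left[f(X_{n}) \mid X_{n-1}=i\right]
  &= \beta\left[p_{i}\, f(i+1) + (1-p_{i})\, f(0)\right] \\
  &= \beta p_{i}\, f(i+1) + \beta (1-p_{i})\, a \beta^{a} p_{0} p_{1} \cdots p_{a-1}.
\end{aligned}
```

The first term reflects a success (move to \(i+1\)), the second a failure (reset to \(0\)); the case analysis then only has to bound each term separately.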

Step by step solution

01

Part (a) - Prove \(f(i) \geq i\) for all \(i\)

To show that \(f(i) \geq i\) for all \(i\), break the argument into two cases.

Case 1: \(i<a\). Here \(f(i)=a \beta^{a-i} p_{i} p_{i+1} \cdots p_{a-1}\), where every factor \(p_{j}\) is strictly positive. The standing hypotheses on \(\beta\) and the \(p_{j}\) (in particular \(1 \geq(a+1) \beta p_{a} / a\)) are what guarantee that this product is at least \(i\) for each \(i<a\).

Case 2: \(i \geq a\). Then \(f(i)=i\) by definition, so the inequality holds with equality.

Thus, we have shown that \(f(i) \geq i\) for all \(i\).
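Part (a) can be spot-checked numerically. The parameters below are hypothetical (not from the text), chosen only so that the hypothesis \(1 \geq(a+1) \beta p_{a} / a\) holds:

```python
from math import prod

# Hypothetical parameters, chosen so that the hypothesis
# 1 >= (a+1)*beta*p[a]/a holds: 4*0.95*0.7/3 ≈ 0.887.
a = 3
beta = 0.95
p = [0.95, 0.95, 0.95, 0.7]          # p_0, p_1, p_2, p_3

def f(i):
    """The function from the problem: a*beta^(a-i)*p_i*...*p_{a-1}
    for i < a, and i itself for i >= a."""
    if i < a:
        return a * beta ** (a - i) * prod(p[i:a])
    return i

assert (a + 1) * beta * p[a] / a <= 1    # hypothesis of the problem
for i in range(6):
    assert f(i) >= i                     # part (a): f(i) >= i
    print(i, round(f(i), 4))
```

With these particular values \(f(0) \approx 2.21\), \(f(1) \approx 2.44\), \(f(2) \approx 2.71\), and \(f(i)=i\) from \(i=3\) on, so the inequality \(f(i) \geq i\) holds at every state; this is of course a check for one parameter choice, not a proof.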


Most popular questions from this chapter

Let \(\{X_{n}\}\) be a submartingale. Strengthen the maximal inequality, Lemma 5.1, to $$ \begin{aligned} \lambda \operatorname{Pr}\left\{\max _{0 \leq k \leq n} X_{k}>\lambda\right\} & \leq E\left[X_{n} I\left\{\max _{0 \leq k \leq n} X_{k}>\lambda\right\}\right] \\ & \leq E\left[X_{n}^{+}\right] \leq E\left[\left|X_{n}\right|\right], \quad \lambda>0 \end{aligned} $$

Let \(X_{n}\) be the total assets of an insurance company at the end of year \(n\). In each year \(n\), premiums totaling \(b>0\) are received, and claims \(A_{n}\) are paid, so \(X_{n+1}=X_{n}+b-A_{n}\). Assume \(A_{1}, A_{2}, \ldots\) are independent random variables, each normally distributed with mean \(\mu

Let \(Z, Y_{0}, Y_{1}, \ldots\) be jointly distributed random variables and assume \(E\left[|Z|^{2}\right]<\infty .\) Show that \(X_{n}=E\left[Z \mid Y_{0}, \ldots, Y_{n}\right]\) satisfies the conditions for the martingale mean square convergence theorem.

Let \(\varphi(\xi)\) be a symmetric function, nondecreasing in \(|\xi|\), with \(\varphi(0)=0\), and such that \(\left\{\varphi\left(Y_{j}\right)\right\}_{j=0}^{n}\) is a submartingale. Fix \(0=u_{0} \leq u_{1} \leq \cdots \leq u_{n}\). Show that $$ \operatorname{Pr}\left\{\left|Y_{j}\right| \leq u_{j} ; 1 \leq j \leq n\right\} \geq 1-\sum_{j=1}^{n} \frac{E\left[\varphi\left(Y_{j}\right)\right]-E\left[\varphi\left(Y_{j-1}\right)\right]}{\varphi\left(u_{j}\right)} $$ (If \(\varphi(\xi)=\xi^{2}\) and \(u_{1}=\cdots=u_{n}=\lambda\), we obtain Kolmogorov's inequality.)
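The special case \(\varphi(\xi)=\xi^{2}\), \(u_{1}=\cdots=u_{n}=\lambda\) is Kolmogorov's inequality, which can be verified exhaustively for a small symmetric random walk. The walk and the values of \(n\) and \(\lambda\) below are a hypothetical concrete choice, not part of the exercise:

```python
from itertools import product

# Exhaustive check of Kolmogorov's inequality
#   Pr{ max_{1<=j<=n} |Y_j| >= lam } <= E[Y_n^2] / lam^2
# for the simple symmetric random walk Y_j = X_1 + ... + X_j, X_i = ±1.
n, lam = 8, 3
paths = list(product([-1, 1], repeat=n))   # all 2^n equally likely paths
weight = 1 / len(paths)

def partial_sums(xs):
    s, out = 0, []
    for x in xs:
        s += x
        out.append(s)
    return out

lhs = sum(weight for xs in paths
          if max(abs(s) for s in partial_sums(xs)) >= lam)
rhs = sum(weight * partial_sums(xs)[-1] ** 2 for xs in paths) / lam ** 2
assert lhs <= rhs
print(lhs, rhs)   # rhs = n / lam^2
```

Because every path is enumerated with its exact probability, the comparison is exact rather than a Monte Carlo estimate.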

Let \(\left\{U_{n}\right\}\) and \(\left\{V_{n}\right\}\) be martingales with respect to the same process \(\left\{Y_{n}\right\}\). Suppose \(U_{0}=V_{0}=0\) and \(E\left[U_{n}^{2}\right]<\infty, E\left[V_{n}^{2}\right]<\infty\) for all \(n\). Show $$ E\left[U_{n} V_{n}\right]=\sum_{k=1}^{n} E\left[\left(U_{k}-U_{k-1}\right)\left(V_{k}-V_{k-1}\right)\right] $$ As a special case, $$ E\left[U_{n}^{2}\right]=\sum_{k=1}^{n} E\left[\left(U_{k}-U_{k-1}\right)^{2}\right] $$
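The special case \(E\left[U_{n}^{2}\right]=\sum_{k} E\left[\left(U_{k}-U_{k-1}\right)^{2}\right]\) can be confirmed exactly for a concrete martingale. The choice below, a simple symmetric \(\pm 1\) random walk checked by exhaustive enumeration, is an illustrative example and not part of the exercise:

```python
from itertools import product

# Exhaustive check of E[U_n^2] = sum_k E[(U_k - U_{k-1})^2] for the
# simple symmetric random walk U_n = Y_1 + ... + Y_n, Y_i = ±1 each
# with probability 1/2 (a martingale with U_0 = 0).
n = 8
paths = list(product([-1, 1], repeat=n))   # all 2^n equally likely paths
weight = 1 / len(paths)

lhs = sum(weight * sum(ys) ** 2 for ys in paths)   # E[U_n^2]
rhs = sum(                                          # sum of E[(dU_k)^2]
    sum(weight * ys[k] ** 2 for ys in paths)
    for k in range(n)
)
print(lhs, rhs)   # both equal n = 8 for this walk
assert abs(lhs - rhs) < 1e-9
```

For this walk each squared increment is identically \(1\), so both sides reduce to \(n\), matching the familiar fact that the variance of a sum of orthogonal martingale increments is the sum of their variances.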
