
Let \(\left\{X_{n}\right\}\) be a family of r.v.'s and let \(\varphi(\xi)\) be a positive function defined for \(\xi>0\) satisfying $$ \frac{\varphi(\xi)}{\xi} \rightarrow \infty \quad \text { as } \quad \xi \rightarrow \infty. $$ Suppose that $$ \sup _{m \geq 1} E\left[\varphi\left(\left|X_{m}\right|\right)\right] \leq K<\infty. $$ Show that \(\left\{X_{n}\right\}\) is uniformly integrable.

Short Answer

To show that the family of random variables \(\left\{X_{n}\right\}\) is uniformly integrable, we use the growth condition on \(\varphi\): given \(\varepsilon > 0\), choose \(a\) so large that \(\varphi(\xi) \geq (K/\varepsilon)\,\xi\) for all \(\xi \geq a\). On the event \(\{\lvert X_n \rvert \geq a\}\) this gives \(\lvert X_n \rvert \leq (\varepsilon/K)\,\varphi(\lvert X_n \rvert)\), and hence \(E[\lvert X_n \rvert \mathbf{1}_{\{\lvert X_n \rvert \geq a\}}] \leq (\varepsilon/K)\,E[\varphi(\lvert X_n \rvert)] \leq \varepsilon\) uniformly in \(n\), using the condition \(\sup_{m \geq 1} E[\varphi(\lvert X_m \rvert)] \leq K\).

Step by step solution

01

Use the condition on \(\varphi(\xi)\)

Since \(\frac{\varphi(\xi)}{\xi} \rightarrow \infty\) as \(\xi \rightarrow \infty\), for every constant \(C > 0\) there exists a threshold \(a = a(C)\) such that \(\varphi(\xi) \geq C\xi\) for all \(\xi \geq a\). In words, \(\varphi(\xi)\) eventually dominates every linear function of \(\xi\). We use this property below to bound the tail expectation of \(\lvert X_n \rvert\) by the expectation of \(\varphi(\lvert X_n \rvert)\).
02

Use the supremum condition

We are given that \(\sup_{m \geq 1} E[\varphi(\lvert X_m \rvert)] \leq K < \infty\); in particular, \(E[\varphi(\lvert X_n \rvert)] \leq K\) for every \(n\). Combined with Step 1, this uniform bound lets us choose, for any given \(\varepsilon > 0\), a single truncation level \(a\) that works for all \(n\) simultaneously.
03

Finding \(a\) for a given \(\varepsilon\)

Let \(\varepsilon > 0\) be given. Applying Step 1 with \(C = K/\varepsilon\), choose \(a > 0\) such that \(\varphi(\xi) \geq \frac{K}{\varepsilon}\,\xi\) for all \(\xi \geq a\); equivalently, \(\xi \leq \frac{\varepsilon}{K}\,\varphi(\xi)\) whenever \(\xi \geq a\). We will use this threshold \(a\) to show that the family \(\left\{X_{n}\right\}\) is uniformly integrable.
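As a concrete illustration (an assumed example, not part of the exercise): if \(\varphi(\xi) = \xi^{2}\), the threshold can be written down explicitly, since \(\varphi(\xi)/\xi = \xi\):

```latex
% Illustration with \varphi(\xi) = \xi^2 (an assumed example):
% here \varphi(\xi)/\xi = \xi, so the required inequality
% \varphi(\xi) \ge (K/\varepsilon)\,\xi holds as soon as \xi \ge K/\varepsilon.
\varphi(\xi) = \xi^{2}
  \quad\Longrightarrow\quad
  \frac{\varphi(\xi)}{\xi} = \xi \ge \frac{K}{\varepsilon}
  \quad\text{for all } \xi \ge a := \frac{K}{\varepsilon}.
```

With this choice of \(\varphi\), the hypothesis \(\sup_m E[\varphi(\lvert X_m \rvert)] \leq K\) is exactly a uniform second-moment bound, recovering the familiar fact that \(L^2\)-bounded families are uniformly integrable.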
04

Show that \(\sup_n E[\lvert X_n \rvert \mathbf{1}_{\{\lvert X_n \rvert \geq a\}}] \leq \varepsilon\)

Recall that \(\left\{X_{n}\right\}\) is uniformly integrable if \(\sup_n E[\lvert X_n \rvert \mathbf{1}_{\{\lvert X_n \rvert \geq a\}}] \rightarrow 0\) as \(a \rightarrow \infty\). Fix \(n\) and let \(a\) be the threshold from Step 3. On the event \(\{\lvert X_n \rvert \geq a\}\) we have \(\lvert X_n \rvert \leq \frac{\varepsilon}{K}\,\varphi(\lvert X_n \rvert)\), so \[ E\big[\lvert X_n \rvert \mathbf{1}_{\{\lvert X_n \rvert \geq a\}}\big] \leq \frac{\varepsilon}{K}\, E\big[\varphi(\lvert X_n \rvert)\mathbf{1}_{\{\lvert X_n \rvert \geq a\}}\big] \leq \frac{\varepsilon}{K}\, E\big[\varphi(\lvert X_n \rvert)\big] \leq \frac{\varepsilon}{K}\cdot K = \varepsilon. \] The bound holds uniformly in \(n\) because the threshold \(a\) depends only on \(\varepsilon\), \(K\), and \(\varphi\), not on \(n\). Since \(\varepsilon > 0\) was arbitrary, \(\sup_n E[\lvert X_n \rvert \mathbf{1}_{\{\lvert X_n \rvert \geq a\}}] \rightarrow 0\) as \(a \rightarrow \infty\), and therefore the family \(\left\{X_{n}\right\}\) is uniformly integrable.
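The chain of inequalities above can be checked numerically. A minimal sketch, assuming \(\varphi(\xi) = \xi^{2}\) and standard normal samples (both are illustrative choices, not part of the exercise); with this \(\varphi\), the proof's bound reads \(E[\lvert X \rvert \mathbf{1}_{\{\lvert X \rvert > a\}}] \leq E[X^{2}]/a\):

```python
import numpy as np

# Illustrative check of the tail bound with phi(x) = x^2 and X ~ N(0, 1);
# both choices are assumptions for this sketch.  With phi(x) = x^2 we have
# phi(x)/x = x >= a on the tail, so the proof's estimate becomes
#     E[|X| 1{|X| > a}]  <=  E[phi(|X|)] / a  =  E[X^2] / a.
rng = np.random.default_rng(0)
x = np.abs(rng.standard_normal(1_000_000))
K = np.mean(x**2)  # empirical E[phi(|X|)]; close to 1 for N(0, 1)

for a in (1.0, 2.0, 4.0, 8.0):
    tail = np.mean(np.where(x > a, x, 0.0))  # empirical E[|X| 1{|X| > a}]
    print(f"a = {a}: tail = {tail:.6f} <= bound K/a = {K / a:.6f}")
```

The pointwise inequality \(\lvert x \rvert \mathbf{1}_{\{\lvert x \rvert > a\}} \leq x^{2}/a\) survives sample averaging exactly, so the printed tails always sit below \(K/a\) and shrink to \(0\) as \(a\) grows, which is uniform integrability in this one-variable illustration.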


Most popular questions from this chapter

Let \(\left\{X_{n}\right\}\) be a martingale satisfying \(E\left[X_{n}^{2}\right] \leq K<\infty\) for all \(n\). Suppose $$ \lim _{n \rightarrow \infty} \sup _{m \geq 1}\left|E\left[X_{n} X_{n+m}\right]-E\left[X_{n}\right] E\left[X_{n+m}\right]\right|=0 $$ Show that \(X=\lim _{n \rightarrow \infty} X_{n}\) is a constant, i.e., nonrandom.

Let \(Y_{1}, Y_{2}, \ldots\) be independent identically distributed positive random variables having finite mean \(\mu\). For fixed \(0<\beta<1\), let \(a\) be the smallest value \(u\) for which \(u \geq \beta E\left[u \vee Y_{1}\right]=\beta E\left[\max \left\{u, Y_{1}\right\}\right]\). Set \(f(x)=a \vee x\). Show that \(\left\{\beta^{n} f\left(M_{n}\right)\right\}\) is a nonnegative supermartingale, where \(M_{n}=\max \left\{Y_{1}, \ldots, Y_{n}\right\}\), whence \(a=f(0) \geq E\left[\beta^{T} f\left(M_{T}\right)\right]\) for all Markov times \(T\). Finally, establish that \(a=E\left[\beta^{T^{*}} M_{T^{*}}\right]\) for \(T^{*}=\min \left\{n \geq 1: Y_{n} \geq a\right\}\). Thus, \(T^{*}\) maximizes \(E\left[\beta^{T} M_{T}\right]\) over all Markov times \(T\).

Let \(\left\{Y_{n}\right\}\) be a nonnegative submartingale and suppose \(b_{n}\) is a nonincreasing sequence of positive numbers. Suppose \(\sum_{n=1}^{\infty}\left(b_{n}-b_{n+1}\right) E\left[Y_{n}\right]<\infty\). Prove that $$ \lambda \operatorname{Pr}\left\{\sup _{k \geq 1} b_{k} Y_{k}>\lambda\right\}<\sum_{k=1}^{\infty}\left(b_{k}-b_{k+1}\right) E\left[Y_{k}\right] $$

Let \(\xi_{n}\) be nonnegative random variables satisfying $$ E\left[\xi_{n+1} \mid \xi_{1}, \ldots, \xi_{n}\right] \leq \delta_{n}+\xi_{n} $$ where \(\delta_{n} \geq 0\) are constants and \(\Delta=\sum_{n=1}^{\infty} \delta_{n}<\infty .\) Show that with probability one, \(\xi_{n}\) converges to a finite random variable \(\xi\) as \(n \rightarrow \infty\).

Let \(\left\{X_{n}\right\}\) be a martingale for which \(Y=\sup _{n}\left|X_{n+1}-X_{n}\right|\) has a finite mean. Let \(A_{1}\) be the event that \(\left\{X_{n}\right\}\) converges and \(A_{2}\) the event that \(\limsup X_{n}=+\infty\) and \(\liminf X_{n}=-\infty\). Show that \(\operatorname{Pr}\left\{A_{1}\right\}+\operatorname{Pr}\left\{A_{2}\right\}=1\). In words, \(\left\{X_{n}\right\}\) either converges, or oscillates very greatly indeed.
