
Let \(\varphi(\xi)\) be a symmetric function, nondecreasing in \(|\xi|\), with \(\varphi(0)=0\), and such that \(\left\{\varphi\left(Y_{j}\right)\right\}_{j=0}^{n}\) is a submartingale. Fix \(0=u_{0} \leq u_{1} \leq \cdots \leq u_{n}\). Show that $$ \operatorname{Pr}\left\{\left|Y_{j}\right| \leq u_{j} ; 1 \leq j \leq n\right\} \geq 1-\sum_{j=1}^{n} \frac{E\left[\varphi\left(Y_{j}\right)\right]-E\left[\varphi\left(Y_{j-1}\right)\right]}{\varphi\left(u_{j}\right)} $$ (If \(\varphi(\xi)=\xi^{2}\) and \(u_{1}=\cdots=u_{n}=\lambda\), we obtain Kolmogorov's inequality.)
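The parenthetical remark can be checked by telescoping: with \(\varphi(\xi)=\xi^{2}\), \(u_{1}=\cdots=u_{n}=\lambda\), and \(Y_{0}=0\), the sum on the right-hand side collapses, $$ \sum_{j=1}^{n} \frac{E\left[Y_{j}^{2}\right]-E\left[Y_{j-1}^{2}\right]}{\lambda^{2}}=\frac{E\left[Y_{n}^{2}\right]}{\lambda^{2}}, $$ so the inequality becomes \( \operatorname{Pr}\left\{\max _{1 \leq j \leq n}\left|Y_{j}\right|>\lambda\right\} \leq E\left[Y_{n}^{2}\right] / \lambda^{2} \), which is Kolmogorov's inequality.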

Short Answer

The bound \( \operatorname{Pr}\left\{\left|Y_{j}\right| \leq u_{j} ; 1 \leq j \leq n\right\} \geq 1-\sum_{j=1}^{n} \frac{E\left[\varphi\left(Y_{j}\right)\right]-E\left[\varphi\left(Y_{j-1}\right)\right]}{\varphi\left(u_{j}\right)} \) follows by splitting the complementary event according to the first index at which \( \left|Y_{j}\right| > u_{j} \), bounding each piece with a Markov-type estimate, and telescoping via the submartingale property.

Step by step solution

01

Write out the properties of the function and the submartingale

From the given conditions, \( \varphi(\xi) \) is symmetric, nondecreasing in \( |\xi| \), and satisfies \( \varphi(0)=0 \); in particular \( \varphi \geq 0 \). Since \( \left\{\varphi\left(Y_{j}\right)\right\}_{j=0}^{n} \) is a submartingale, conditioning on the past does not decrease its expected value: \( E\left[\varphi\left(Y_{j}\right) \mid Y_{0}, \ldots, Y_{j-1}\right] \geq \varphi\left(Y_{j-1}\right) \). Consequently, for any event \( C \) determined by \( Y_{0}, \ldots, Y_{j-1} \), \( E\left[\left(\varphi\left(Y_{j}\right)-\varphi\left(Y_{j-1}\right)\right) 1_{C}\right] \geq 0 \). As is implicit from \( u_{0}=0 \) and \( \varphi(0)=0 \), we take \( Y_{0}=0 \), so \( E\left[\varphi\left(Y_{0}\right)\right]=0 \).
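As an illustrative aside (not part of the textbook argument): for a simple symmetric random walk with \( \pm 1 \) steps and \( \varphi(\xi)=\xi^{2} \), \( \left\{\varphi\left(Y_{j}\right)\right\} \) is a submartingale, so \( E\left[\varphi\left(Y_{j}\right)\right] \) must be nondecreasing in \( j \). The following Python sketch (with hypothetical parameters `n` and `trials`) estimates \( E\left[Y_{j}^{2}\right] \) by Monte Carlo and checks monotonicity:

```python
import random

random.seed(1)

# For a simple symmetric random walk Y_j and phi(xi) = xi^2, {phi(Y_j)}
# is a submartingale, so E[phi(Y_j)] should be nondecreasing in j.
# Estimate E[Y_j^2] for j = 0..n by Monte Carlo.
n, trials = 8, 50_000            # illustrative parameters
second_moments = [0.0] * (n + 1)  # second_moments[j] ~ E[Y_j^2]

for _ in range(trials):
    s = 0
    for j in range(1, n + 1):
        s += random.choice((-1, 1))
        second_moments[j] += s * s

second_moments = [m / trials for m in second_moments]
print(second_moments)  # for this walk, E[Y_j^2] = j exactly

# Nondecreasing up to Monte Carlo noise:
assert all(second_moments[j] <= second_moments[j + 1] + 0.1 for j in range(n))
```

Here \( E\left[Y_{j}^{2}\right]=j \) exactly, so the estimates increase by roughly 1 per step, in line with the submartingale property.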
02

Decompose the complementary event

We must bound \( \operatorname{Pr}\left\{\left|Y_{j}\right| \leq u_{j} ; 1 \leq j \leq n\right\} \) from below, or equivalently bound the complementary event from above. Let \( B_{j}=\left\{\left|Y_{i}\right| \leq u_{i}, 1 \leq i \leq j\right\} \) (with \( B_{0} \) the whole sample space), and let \( A_{j}=B_{j-1} \backslash B_{j} \) be the event that the bound fails for the first time at index \( j \). The \( A_{j} \) are disjoint and \( \bigcup_{j=1}^{n} A_{j}=B_{n}^{c} \). On \( A_{j} \) we have \( \left|Y_{j}\right|>u_{j} \), and since \( \varphi \) is symmetric and nondecreasing in \( |\xi| \), \( \varphi\left(Y_{j}\right) \geq \varphi\left(u_{j}\right) \) there. Hence, by a Markov-type estimate, $$ \operatorname{Pr}\left(A_{j}\right) \leq \frac{E\left[\varphi\left(Y_{j}\right) 1_{A_{j}}\right]}{\varphi\left(u_{j}\right)} $$
03

Apply the submartingale property

Set \( c_{j}=1 / \varphi\left(u_{j}\right) \); since \( u_{1} \leq \cdots \leq u_{n} \) and \( \varphi \) is nondecreasing, \( c_{1} \geq c_{2} \geq \cdots \geq c_{n}>0 \). Using \( 1_{A_{j}}=1_{B_{j-1}}-1_{B_{j}} \) and summing by parts, $$ \sum_{j=1}^{n} c_{j} E\left[\varphi\left(Y_{j}\right) 1_{A_{j}}\right]=c_{1} E\left[\varphi\left(Y_{1}\right)\right]+\sum_{j=2}^{n}\left(c_{j} E\left[\varphi\left(Y_{j}\right) 1_{B_{j-1}}\right]-c_{j-1} E\left[\varphi\left(Y_{j-1}\right) 1_{B_{j-1}}\right]\right)-c_{n} E\left[\varphi\left(Y_{n}\right) 1_{B_{n}}\right] $$ The final term is nonnegative and may be dropped. For \( j \geq 2 \), \( c_{j} \leq c_{j-1} \) and \( \varphi \geq 0 \) give \( c_{j} E\left[\varphi\left(Y_{j}\right) 1_{B_{j-1}}\right]-c_{j-1} E\left[\varphi\left(Y_{j-1}\right) 1_{B_{j-1}}\right] \leq c_{j} E\left[\left(\varphi\left(Y_{j}\right)-\varphi\left(Y_{j-1}\right)\right) 1_{B_{j-1}}\right] \). Finally, \( B_{j-1}^{c} \) is determined by \( Y_{1}, \ldots, Y_{j-1} \), so the submartingale property yields \( E\left[\left(\varphi\left(Y_{j}\right)-\varphi\left(Y_{j-1}\right)\right) 1_{B_{j-1}^{c}}\right] \geq 0 \), and therefore \( E\left[\left(\varphi\left(Y_{j}\right)-\varphi\left(Y_{j-1}\right)\right) 1_{B_{j-1}}\right] \leq E\left[\varphi\left(Y_{j}\right)\right]-E\left[\varphi\left(Y_{j-1}\right)\right] \).
04

Take complements to conclude

Combining the last two steps, and using \( E\left[\varphi\left(Y_{0}\right)\right]=\varphi(0)=0 \) to write the first term as \( c_{1}\left(E\left[\varphi\left(Y_{1}\right)\right]-E\left[\varphi\left(Y_{0}\right)\right]\right) \), $$ \operatorname{Pr}\left(B_{n}^{c}\right)=\sum_{j=1}^{n} \operatorname{Pr}\left(A_{j}\right) \leq \sum_{j=1}^{n} \frac{E\left[\varphi\left(Y_{j}\right)\right]-E\left[\varphi\left(Y_{j-1}\right)\right]}{\varphi\left(u_{j}\right)} $$ Taking complements gives $$ \operatorname{Pr}\left\{\left|Y_{j}\right| \leq u_{j} ; 1 \leq j \leq n\right\} \geq 1-\sum_{j=1}^{n} \frac{E\left[\varphi\left(Y_{j}\right)\right]-E\left[\varphi\left(Y_{j-1}\right)\right]}{\varphi\left(u_{j}\right)} $$ which is the desired inequality.
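As a numerical sanity check (illustrative only, not part of the proof), the Kolmogorov special case \( \varphi(\xi)=\xi^{2} \), \( u_{1}=\cdots=u_{n}=\lambda \) can be simulated for a simple symmetric random walk. For \( \pm 1 \) steps, \( E\left[Y_{j}^{2}\right]-E\left[Y_{j-1}^{2}\right]=1 \), so the right-hand side reduces to \( 1-n / \lambda^{2} \). The parameters `n`, `lam`, and `trials` below are hypothetical choices:

```python
import random

random.seed(0)

# Monte Carlo check of the inequality in the Kolmogorov special case:
# phi(xi) = xi^2, u_1 = ... = u_n = lam, Y_j a +/-1 random walk.
# Then E[Y_j^2] - E[Y_{j-1}^2] = 1, so the bound is 1 - n / lam^2.
n, lam, trials = 10, 8.0, 20_000   # illustrative parameters

stayed_inside = 0
for _ in range(trials):
    s, inside = 0, True
    for _ in range(n):
        s += random.choice((-1, 1))
        if abs(s) > lam:
            inside = False
            break
    stayed_inside += inside

lhs = stayed_inside / trials   # Pr{ |Y_j| <= u_j for 1 <= j <= n }
rhs = 1 - n / lam**2           # the lower bound from the inequality
print(lhs, rhs)
assert lhs >= rhs
```

With these parameters the empirical probability of staying inside the band is close to 1, comfortably above the bound \( 1-10/64 \approx 0.844 \).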


