
Prove: if \(\left\{X_{n}\right\}\) is a submartingale and \(\varphi(x)\) is a convex, increasing function, then \(\left\{\varphi\left(X_{n}\right)\right\}\) is a submartingale whenever \(E\left|\varphi^{+}\left(X_{n}\right)\right|<\infty\) for all \(n\) (cf. Lemma 2.2).

Short Answer

Given that \(\{X_n\}\) is a submartingale and \(\varphi(x)\) is convex and increasing, the conditional form of Jensen's inequality together with the monotonicity of \(\varphi\) yields \(\mathbb{E}[\varphi(X_{n+1}) \mid X_1, \ldots, X_n] \geq \varphi(X_n)\). Combined with the integrability condition \(\mathbb{E}\left|\varphi^+\left(X_{n}\right)\right| < \infty\) for all \(n\), this shows that \(\{\varphi(X_n)\}\) is a submartingale.

Step by step solution

01

Recall the definition of a submartingale

A sequence of random variables \(\{X_n\}\) is a submartingale if \(\mathbb{E}[X_n^+] < \infty\) and, for every \(n \in \mathbb{N}\), $$ \mathbb{E}[X_{n+1} \mid X_1, \ldots, X_n] \geq X_n. $$ Our goal is to show that \(\{\varphi(X_n)\}\) satisfies these same properties.
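The definition above can be illustrated numerically. As a hypothetical example (not taken from the text), consider a random walk whose i.i.d. Gaussian increments \(Z\) have positive mean, so that \(\mathbb{E}[X_{n+1} \mid X_1, \ldots, X_n] = X_n + \mathbb{E}[Z] \geq X_n\):

```python
import numpy as np

rng = np.random.default_rng(0)

# A biased random walk: X_{n+1} = X_n + Z_{n+1} with E[Z] = 0.2 > 0.
# Since E[X_{n+1} | X_1, ..., X_n] = X_n + E[Z] >= X_n, {X_n} is a submartingale.
def simulate_walk(n_steps, n_paths, drift=0.2):
    steps = rng.normal(loc=drift, scale=1.0, size=(n_paths, n_steps))
    return np.cumsum(steps, axis=1)

paths = simulate_walk(n_steps=50, n_paths=100_000)

# Monte Carlo check: the average increment E[X_{n+1} - X_n] should be >= 0
# at every step, reflecting the submartingale inequality on average.
increments = np.diff(paths, axis=1).mean(axis=0)
print(np.all(increments > 0))
```

With 100,000 paths, the sample-mean increments concentrate around 0.2, so all of them come out positive with very high probability.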
02

Use Jensen's inequality

Since \(\varphi(x)\) is convex, the conditional form of Jensen's inequality applies. For any \(n \in \mathbb{N}\), we have $$ \mathbb{E}[\varphi(X_{n+1}) \mid X_1, \ldots, X_n] \geq \varphi(\mathbb{E}[X_{n+1} \mid X_1, \ldots, X_n]). $$ Convexity alone suffices here; the monotonicity of \(\varphi\) is used in the next step.
03

Apply the submartingale property

As \(\{X_n\}\) is a submartingale, we have $$ \mathbb{E}[X_{n+1} \mid X_1, \ldots, X_n] \geq X_n. $$ Since \(\varphi\) is increasing, applying it to both sides preserves the inequality: \(\varphi(\mathbb{E}[X_{n+1} \mid X_1, \ldots, X_n]) \geq \varphi(X_n)\). Chaining this with the result of Step 2 gives $$ \mathbb{E}[\varphi(X_{n+1}) \mid X_1, \ldots, X_n] \geq \varphi(X_n). $$
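The chained inequality can be checked on a concrete case. As an illustrative assumption (not the text's \(\varphi\)), take \(\varphi(x) = e^x\), which is convex and increasing, applied to a Gaussian random walk with drift \(0.2\): then \(\mathbb{E}[\varphi(X_{n+1}) \mid X_n = x] = e^x \, \mathbb{E}[e^Z]\), and Jensen gives \(\mathbb{E}[e^Z] \geq e^{\mathbb{E}[Z]} = e^{0.2} > 1\), so the submartingale inequality survives the transformation:

```python
import numpy as np

rng = np.random.default_rng(1)

# phi(x) = exp(x): convex and increasing. For a walk X_{n+1} = X_n + Z
# with Z ~ N(0.2, 1), we have E[phi(X_{n+1}) | X_n = x] = e^x * E[e^Z].
# By Jensen, E[e^Z] >= e^{E[Z]} = e^{0.2} > 1, so
# E[phi(X_{n+1}) | X_n] >= phi(X_n). Estimate E[e^Z] by Monte Carlo:
z = rng.normal(loc=0.2, scale=1.0, size=1_000_000)
mc_estimate = np.exp(z).mean()
exact = np.exp(0.2 + 0.5)   # lognormal mean: e^{mu + sigma^2 / 2}
print(mc_estimate, exact)   # both well above 1
```

Here the Gaussian increments make \(\mathbb{E}[e^Z]\) available in closed form, so the Monte Carlo estimate can be compared against the exact lognormal mean.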
04

Check the condition on the expected value of the positive part

We must verify the integrability requirement for \(\{\varphi(X_n)\}\). Here \(\varphi^+(X_n)\) denotes the positive part \((\varphi(X_n))^+ = \max(\varphi(X_n), 0)\). The hypothesis \(\mathbb{E}\left|\varphi^+\left(X_{n}\right)\right| < \infty\) is precisely the condition \(\mathbb{E}[(\varphi(X_n))^+] < \infty\) for all \(n\), which is the integrability requirement in the definition of a submartingale. It also ensures that the conditional expectations \(\mathbb{E}[\varphi(X_{n+1}) \mid X_1, \ldots, X_n]\) used in Steps 2 and 3 are well defined, since a conditional expectation is defined whenever the positive part of the random variable is integrable.
05

Conclude that \(\{\varphi(X_n)\}\) is a submartingale

Since \(\mathbb{E}[(\varphi(X_n))^+] < \infty\) for all \(n\) and the inequality $$ \mathbb{E}[\varphi(X_{n+1}) \mid X_1, \ldots, X_n] \geq \varphi(X_n) $$ holds for all \(n\), we conclude that \(\{\varphi(X_n)\}\) is a submartingale.


Key Concepts

These are the key concepts you need to understand to accurately answer the question.

Jensen's Inequality
At the heart of our submartingale problem lies Jensen's Inequality, a fundamental concept in probability and statistics. This inequality tells us about the behavior of a convex function when applied to expectations.

Specifically, if you have a convex function, \( \phi(x) \), and a random variable, \( X \), Jensen's Inequality guarantees that the expected value of \( \phi(X) \) is greater than or equal to \( \phi(\text{E}[X]) \). In mathematical terms:
\[ \mathbb{E}[\phi(X)] \geq \phi(\mathbb{E}[X]). \]

This is crucial when dealing with submartingales because it allows us to compare the expected value of the transformed sequence with the original sequence. Jensen's Inequality also holds with conditional expectations in place of \( \mathbb{E}[\cdot] \), which is the form applied in Step 2. By ensuring that the function \( \phi \) does not 'reverse' the inequality present in the submartingale definition, we can retain the submartingale property even after the transformation.
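A quick Monte Carlo sanity check of the unconditional inequality, using \( \phi(x) = x^2 \) and an exponentially distributed random variable (both illustrative choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(2)

# Jensen's inequality for the convex phi(x) = x**2:
# E[phi(X)] >= phi(E[X]) whenever both sides are finite.
x = rng.exponential(scale=2.0, size=1_000_000)  # Exponential with mean 2

lhs = np.mean(x ** 2)     # E[X^2]; exact value is 2 * 2^2 = 8
rhs = np.mean(x) ** 2     # (E[X])^2; exact value is 2^2 = 4
print(lhs >= rhs)         # True: the gap E[X^2] - (E[X])^2 is Var(X)
```

The gap between the two sides is exactly the variance of \( X \), which is one way to see why equality holds only for degenerate (constant) random variables when \( \phi \) is strictly convex.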
Convex Function
Understanding what makes a function convex is critical for applying Jensen's inequality. A function \( f(x) \) is considered convex if, for any two points on the graph of \( f \) and any \( \lambda \) such that \( 0 \leq \lambda \leq 1 \), the following inequality holds:
\[ f(\lambda x + (1-\lambda) y) \leq \lambda f(x) + (1-\lambda) f(y). \]

Visually, this means that if you take any two points on the graph of \( f \), the line segment (chord) connecting them lies above or on the graph. In the context of our submartingale problem, convexity is exactly what Jensen's Inequality requires: because the graph of \( \phi \) never rises above its chords, averaging inside \( \phi \) can only produce a smaller value than averaging outside, which maintains the condition needed for a submartingale.
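The defining inequality can be spot-checked numerically. The sketch below uses a hypothetical helper `is_convex_on_samples` (an illustration, not a rigorous proof of convexity) that tests the chord inequality at random points and weights:

```python
import numpy as np

rng = np.random.default_rng(3)

def is_convex_on_samples(f, n_trials=10_000):
    """Spot-check f(lam*x + (1-lam)*y) <= lam*f(x) + (1-lam)*f(y)
    at randomly sampled points x, y and weights lam in [0, 1]."""
    x = rng.uniform(-10, 10, n_trials)
    y = rng.uniform(-10, 10, n_trials)
    lam = rng.uniform(0, 1, n_trials)
    lhs = f(lam * x + (1 - lam) * y)
    rhs = lam * f(x) + (1 - lam) * f(y)
    # Allow a small relative tolerance for floating-point rounding.
    tol = 1e-6 * np.maximum(1.0, np.abs(rhs))
    return bool(np.all(lhs <= rhs + tol))

print(is_convex_on_samples(np.exp))  # e^x is convex
print(is_convex_on_samples(np.sin))  # sin is not convex on [-10, 10]
```

Passing such a random check does not prove convexity, but a single violating triple \((x, y, \lambda)\) does disprove it, which is why the non-convex `np.sin` fails almost immediately.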
Expected Value
The expected value, often denoted as \( \mathbb{E}[X] \), is the probability-weighted average of all possible values for a random variable \( X \). It's a measure of the 'center' of the random variable's distribution, or intuitively, what you would expect on average if you could repeat an experiment an infinite number of times.

In submartingales, the expected value plays a central role. By definition, for a sequence of random variables to be a submartingale, the expected value of the next variable in the sequence given all previous values must be at least as large as the present value. This implies a sort of 'increasing trend' on average, which is formalized as:
\[ \mathbb{E}[X_{n+1} \mid X_1, \ldots, X_n] \geq X_n. \]

When assessing if a transformed process via a convex function retains the submartingale property, we examine if this inequality remains true under the transformation. Hence, the expected value provides a convenient and insightful comparison point.
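As a concrete illustration (a standard example, not from the text), the expected value of a fair die roll is the probability-weighted average \( \sum_{k=1}^{6} k \cdot \tfrac{1}{6} = 3.5 \), which the sample mean of many simulated rolls approaches:

```python
import numpy as np

# Expected value as a probability-weighted average: a fair die roll.
values = np.arange(1, 7)
probs = np.full(6, 1 / 6)
expected = np.sum(values * probs)
print(expected)  # 3.5

# Law of large numbers: the sample mean approaches E[X].
rng = np.random.default_rng(4)
rolls = rng.integers(1, 7, size=1_000_000)  # uniform on {1, ..., 6}
print(abs(rolls.mean() - expected) < 0.01)
```

The second check reflects the "repeat the experiment many times" intuition: with a million rolls, the sample mean sits within about \(0.002\) of \(3.5\) with high probability.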
Stochastic Processes
A stochastic process is a sequence of random variables representing a system evolving over time. These variables can represent anything from stock prices to weather patterns, as long as there's inherent randomness in their evolution.

Submartingales are a special kind of stochastic process. They are characterized by the property that, given the present state, the expected future value is not less than the current value. It reflects a tendency for the process to 'drift' upwards over time.

The concept of a submartingale is deeply connected to the expectations and the conditions a random process must satisfy. To determine whether a transformed process \( \{\phi(X_n)\} \) is a submartingale, we ensure that each term, when given previous information, carries the expectation of being larger or equal to the current term, maintaining the essential characteristic of a submartingale even under transformation.


Most popular questions from this chapter

Let \(\left\{X_{n}\right\}\) be a martingale satisfying \(E\left[X_{n}^{2}\right] \leq K<\infty\) for all \(n\). Suppose $$ \lim _{n \rightarrow \infty} \sup _{m \geq 1}\left|E\left[X_{n} X_{n+m}\right]-E\left[X_{n}\right] E\left[X_{n+m}\right]\right|=0. $$ Show that \(X=\lim _{n \rightarrow \infty} X_{n}\) is a constant, i.e., nonrandom.

Fix \(\lambda>0\). Suppose \(X_{1}, X_{2}, \ldots\) are jointly distributed random variables whose joint distributions satisfy $$ E\left[\exp \left\{\lambda X_{n+1}\right\} \mid X_{1}, \ldots, X_{n}\right] \leq 1, \quad \text { for all } n. $$ Let \(S_{n}=X_{1}+\cdots+X_{n}\) \(\left(S_{0}=0\right)\). Establish $$ \operatorname{Pr}\left\{\sup _{n \geq 0}\left(x+S_{n}\right)>l\right\} \leq e^{-\lambda(l-x)}, \quad \text { for } x \leq l. $$

Let \(\Omega=\left\{\omega_{1}, \omega_{2}, \ldots\right\}\) be a countable set and \(\mathscr{F}\) the \(\sigma\)-field of all subsets of \(\Omega\). For a fixed \(N\), let \(X_{0}, X_{1}, \ldots, X_{N}\) be random variables defined on \(\Omega\) and let \(T\) be a Markov time with respect to \(\left\{X_{n}\right\}\) satisfying \(0 \leq T \leq N\). Let \(\mathscr{F}_{n}\) be the \(\sigma\)-field generated by \(X_{0}, X_{1}, \ldots, X_{n}\) and define \(\mathscr{F}_{T}\) to be the collection of sets \(A\) in \(\mathscr{F}\) for which \(A \cap\{T=n\}\) is in \(\mathscr{F}_{n}\) for \(n=0, \ldots, N\). That is, $$ \mathscr{F}_{T}=\left\{A: A \in \mathscr{F} \quad \text { and } \quad A \cap\{T=n\} \in \mathscr{F}_{n}, \quad n=0, \ldots, N\right\}. $$ Show: (a) \(\mathscr{F}_{T}\) is a \(\sigma\)-field, (b) \(T\) is measurable with respect to \(\mathscr{F}_{T}\), (c) \(\mathscr{F}_{T}\) is the \(\sigma\)-field generated by \(\left\{X_{0}, \ldots, X_{T}\right\}\), where \(\left\{X_{0}, \ldots, X_{T}\right\}\) is considered to be a variable-dimensional vector-valued function defined on \(\Omega\).

Suppose \(P=\left\|P_{i j}\right\|\) is the transition probability matrix of an irreducible recurrent Markov chain \(\left\{X_{n}\right\}\). Use the supermartingale convergence theorem (see Remark 5.1) to show that every nonnegative solution \(y=\{y(i)\}\) to the system of inequalities \(y(i) \geq \sum_{j=0}^{\infty} P_{i j} y(j), \quad\) for all \(i\), is constant.

Let \(\left\{X_{n}\right\}\) be a family of r.v.'s and let \(\varphi(\xi)\) be a positive function defined for \(\xi>0\) satisfying $$ \frac{\varphi(\xi)}{\xi} \rightarrow \infty \quad \text { as } \quad \xi \rightarrow \infty. $$ Suppose that $$ \sup _{m \geq 1} E\left[\varphi\left(\left|X_{m}\right|\right)\right] \leq K<\infty. $$ Show that \(\left\{X_{n}\right\}\) is uniformly integrable.
