Chapter 6: Problem 14
Suppose \(X_{1}\) and \(X_{2}\) are \(\mathscr{B}\)-measurable random variables. Show that \(a_{1} X_{1}+a_{2} X_{2}\) is \(\mathscr{B}\)-measurable for all real \(a_{1}, a_{2}\).
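One possible line of attack (a sketch, not necessarily the intended solution): first check that \(aX\) is \(\mathscr{B}\)-measurable whenever \(X\) is (treat \(a>0\), \(a=0\), \(a<0\) separately), then reduce the sum to a countable union over rational cutoffs:

```latex
\{\omega : a_{1}X_{1}(\omega)+a_{2}X_{2}(\omega) < x\}
  \;=\; \bigcup_{r \in \mathbb{Q}}
        \{a_{1}X_{1} < r\} \cap \{a_{2}X_{2} < x - r\},
```

a countable union of intersections of sets in \(\mathscr{B}\), hence in \(\mathscr{B}\).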
The Haar functions on \([0,1)\) are defined by $$ \begin{gathered} H_{1}(t)=1, \\ H_{2}(t)= \begin{cases}1, & 0 \leq t<\frac{1}{2}, \\ -1, & \frac{1}{2} \leq t<1,\end{cases} \\ H_{2^{n}+1}(t)= \begin{cases}2^{n / 2}, & 0 \leq t<2^{-(n+1)}, \\ -2^{n / 2}, & 2^{-(n+1)} \leq t<2^{-n}, \\ 0, & \text { otherwise, }\end{cases} \quad n=1,2, \ldots, \\ H_{2^{n}+j}(t)=H_{2^{n}+1}\left(t-\frac{j-1}{2^{n}}\right), \quad j=1, \ldots, 2^{n} . \end{gathered} $$ It helps to plot the first five. Let \(f\) be an arbitrary function on \([0,1]\) satisfying $$ \int_{0}^{1}|f(t)| d t<\infty . $$ Define \(a_{k}=\int_{0}^{1} f(t) H_{k}(t) d t\), and let \(Z\) be uniformly distributed on \([0,1]\). Show that $$ f(Z)=\lim _{n \rightarrow \infty} \sum_{k=1}^{n} a_{k} H_{k}(Z) \quad \text { with probability one, } $$ and $$ \lim _{n \rightarrow \infty} \int_{0}^{1}\left|f(t)-\sum_{k=1}^{n} a_{k} H_{k}(t)\right| d t=0 . $$
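As a numerical sanity check (my own sketch, not part of the exercise): the helper below decodes the index \(k=2^{n}+j\), computes the coefficients \(a_{k}\) by a midpoint rule, and verifies that the \(L^{1}\) error of the partial sums shrinks for the sample choice \(f(t)=t^{2}\).

```python
import numpy as np

def haar(k, t):
    """Haar function H_k on [0,1): H_1 = 1; for k = 2^n + j (1 <= j <= 2^n),
    H_k equals 2^(n/2) on the left half and -2^(n/2) on the right half of
    the dyadic interval [(j-1)/2^n, j/2^n), and 0 elsewhere."""
    t = np.asarray(t, dtype=float)
    if k == 1:
        return np.ones_like(t)
    n = int(np.floor(np.log2(k - 1)))      # decode k = 2^n + j
    j = k - 2**n
    left, right = (j - 1) / 2**n, j / 2**n
    mid = left + 2.0 ** (-(n + 1))
    out = np.zeros_like(t)
    out[(t >= left) & (t < mid)] = 2.0 ** (n / 2)
    out[(t >= mid) & (t < right)] = -(2.0 ** (n / 2))
    return out

t = (np.arange(4096) + 0.5) / 4096         # midpoint grid on [0,1)
f = t**2                                    # sample integrable f (my choice)

def l1_error(N):
    """L^1 distance between f and its N-term Haar partial sum."""
    s = np.zeros_like(t)
    for k in range(1, N + 1):
        a_k = np.mean(f * haar(k, t))       # midpoint rule for int f H_k
        s += a_k * haar(k, t)
    return np.mean(np.abs(f - s))

errors = [l1_error(N) for N in (2, 8, 32)]
print(errors)                               # decreasing toward 0
```

The first \(2^{m}\) partial sums reproduce the conditional expectation of \(f\) on dyadic intervals of length \(2^{-m}\), which is what drives the \(L^{1}\) convergence in the problem.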
Suppose \(S_{n}=X_{1}+\cdots+X_{n}\) is a zero-mean martingale for which \(E\left[X_{n}^{2}\right]<\infty\) for all \(n\). Show that \(S_{n} / b_{n} \rightarrow 0\) with probability one for any monotonic real sequence \(b_{1} \leq \cdots \leq b_{n} \leq b_{n+1} \uparrow \infty\), provided \(\sum_{n=1}^{\infty} E\left[X_{n}^{2}\right] / b_{n}^{2}<\infty\).
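For intuition (a toy simulation, my own choices of increments and norming): with i.i.d. mean-zero bounded increments and \(b_{n}=n\), the summability condition \(\sum E[X_{n}^{2}]/n^{2}<\infty\) holds, and \(S_{n}/b_{n}\) visibly drifts to \(0\).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
X = rng.uniform(-1.0, 1.0, size=N)    # i.i.d. mean-zero increments, E[X_n^2] = 1/3
S = np.cumsum(X)                       # S_n is then a zero-mean martingale
b = np.arange(1, N + 1)                # b_n = n, so sum (1/3)/n^2 < infinity
ratio = S / b
print(abs(ratio[-1]))                  # close to 0, as the problem asserts
```

This special case is just the strong law of large numbers; the exercise generalizes it via Kronecker's lemma applied to the martingale \(\sum X_{k}/b_{k}\).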
Let \(\xi_{n}\) be nonnegative random variables satisfying $$ E\left[\xi_{n+1} \mid \xi_{1}, \ldots, \xi_{n}\right] \leq \delta_{n}+\xi_{n} $$ where \(\delta_{n} \geq 0\) are constants and \(\Delta=\sum_{n=1}^{\infty} \delta_{n}<\infty .\) Show that with probability one, \(\xi_{n}\) converges to a finite random variable \(\xi\) as \(n \rightarrow \infty\).
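For intuition (my own toy construction, not from the text): take \(\xi_{n+1}=M_{n}\xi_{n}+\delta_{n}\) with \(M_{n}=2U_{n}\) and \(U_{n}\) uniform on \([0,1]\). Then \(E[\xi_{n+1}\mid \xi_{1},\ldots,\xi_{n}]=\xi_{n}+\delta_{n}\), so the hypothesis holds with equality, and a sample path does settle down to a finite limit.

```python
import numpy as np

rng = np.random.default_rng(1)
xi = 1.0
for n in range(2000):
    delta_n = 2.0 ** (-n)        # summable: Delta = sum delta_n = 2 < infinity
    M = 2.0 * rng.uniform()      # nonnegative multiplier with E[M] = 1
    xi = M * xi + delta_n        # E[xi_{n+1} | past] = xi_n + delta_n
print(xi)                        # the path has converged (here to ~0)
```

Since \(E[\log(2U)]=\log 2-1<0\), the multiplicative part dies out almost surely, so for this particular construction the limit is \(0\); the theorem only guarantees a finite limit in general.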
Let \(\left\{X_{n}\right\}\) be a martingale satisfying \(E\left[X_{n}^{2}\right] \leq K<\infty\) for all \(n\). Suppose $$ \lim _{n \rightarrow \infty} \sup _{m \geq 1}\left|E\left[X_{n} X_{n+m}\right]-E\left[X_{n}\right] E\left[X_{n+m}\right]\right|=0 . $$ Show that \(X=\lim _{n \rightarrow \infty} X_{n}\) is a constant, i.e., nonrandom.
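A possible route (my sketch): for a martingale, \(E[X_{n}X_{n+m}]=E\left[X_{n}\,E[X_{n+m}\mid \mathscr{F}_{n}]\right]=E[X_{n}^{2}]\), while \(E[X_{n}]E[X_{n+m}]=(E[X_{1}])^{2}\), so the hypothesis collapses to

```latex
\sup_{m \geq 1}\bigl|E[X_{n}X_{n+m}] - E[X_{n}]E[X_{n+m}]\bigr|
  \;=\; E[X_{n}^{2}] - (E[X_{1}])^{2}
  \;=\; \operatorname{Var}(X_{n}) \;\longrightarrow\; 0 .
```

Since an \(L^{2}\)-bounded martingale converges in \(L^{2}\) as well as almost surely, \(\operatorname{Var}(X)=\lim_{n}\operatorname{Var}(X_{n})=0\), i.e. \(X=E[X_{1}]\) a.s.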
Suppose \(Y\) is \(\mathscr{B}\)-measurable, and \(E[|Y|]<\infty\). Show that \(E[Y Z] \geq 0\) for all bounded nonnegative \(\mathscr{B}\)-measurable random variables \(Z\) implies \(P[\{\omega: Y(\omega) \geq 0\}]=1\).
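One standard approach (a sketch): for \(\varepsilon>0\), take \(Z=\mathbf{1}_{\{Y<-\varepsilon\}}\), which is bounded, nonnegative, and \(\mathscr{B}\)-measurable; then

```latex
0 \;\leq\; E[YZ]
  \;=\; E\bigl[\,Y\,\mathbf{1}_{\{Y<-\varepsilon\}}\,\bigr]
  \;\leq\; -\varepsilon\, P[\,Y < -\varepsilon\,],
```

forcing \(P[Y<-\varepsilon]=0\) for every \(\varepsilon>0\); letting \(\varepsilon \downarrow 0\) along \(1/k\) gives \(P[Y<0]=0\).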