Chapter 6: Problem 31
Let \(X, X_{1}, X_{2}, \ldots\) be independent, identically distributed random variables having negative mean \(\mu\) and finite variance \(\sigma^{2}\). With \(S_{0}=0\) and \(S_{n}=X_{1}+\cdots+X_{n}\), set \(M=\max _{n \geq 0} S_{n}\). In view of \(\mu<0\), we know that \(M<\infty\). Assume \(E[M]<\infty\). (In fact, it can be shown that this is a consequence of \(\sigma^{2}<\infty\).) Define \(r(x)=x^{+}=\max \{x, 0\}\) and \(f(x)=E\left[(x+M-E[M])^{+}\right]\).

(a) Show \(f(x) \geq r(x)\) for all \(x\).

(b) Show \(f(x) \geq E[f(x+X)]\) for all \(x\), so that \(\left\{f\left(x+S_{n}\right)\right\}\) is a nonnegative supermartingale. [Hint: Verify and use the fact that \(M\) and \((X+M)^{+}\) have the same distribution.]

(c) Use (a) and (b) to show \(f(x) \geq E\left[\left(x+S_{T}\right)^{+}\right]\) for all Markov times \(T\). \(\left[\right.\)Here \(\left(x+S_{\infty}\right)^{+}=\lim _{n \rightarrow \infty}\left(x+S_{n}\right)^{+}=0 .\left.\right]\)
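A minimal sketch of how the hints can be assembled (an outline under the problem's hypotheses, not the text's own worked solution; the variable \(M^{\prime}\) below is introduced only for this sketch):

For (a), the map \(t \mapsto t^{+}\) is convex, so Jensen's inequality applied to \(x+M-E[M]\) gives
\[
f(x)=E\left[(x+M-E[M])^{+}\right] \geq\left(E[x+M-E[M]]\right)^{+}=x^{+}=r(x).
\]

For (b), write \(M=\max \left(0, X_{1}+M^{\prime}\right)=\left(X_{1}+M^{\prime}\right)^{+}\), where \(M^{\prime}=\max _{n \geq 1}\left(S_{n}-X_{1}\right)\) is independent of \(X_{1}\) and has the same distribution as \(M\); hence \(M\) and \((X+M)^{+}\) are equal in distribution when \(X\) is independent of \(M\). Since \((X+M)^{+} \geq X+M\) and \(t \mapsto(x+t-E[M])^{+}\) is nondecreasing,
\[
f(x)=E\left[\left(x+(X+M)^{+}-E[M]\right)^{+}\right] \geq E\left[(x+X+M-E[M])^{+}\right]=E[f(x+X)],
\]
the last equality using the independence of \(X\) and \(M\). Because the \(X_{i}\) are i.i.d. and \(f \geq 0\), this inequality makes \(\left\{f\left(x+S_{n}\right)\right\}\) a nonnegative supermartingale.

For (c), optional sampling for nonnegative supermartingales yields \(f(x) \geq E\left[f\left(x+S_{T}\right)\right]\) for every Markov time \(T\); by (a), \(f\left(x+S_{T}\right) \geq\left(x+S_{T}\right)^{+}\), where on \(\{T=\infty\}\) one uses \(\left(x+S_{\infty}\right)^{+}=\lim _{n \rightarrow \infty}\left(x+S_{n}\right)^{+}=0\), valid because \(S_{n} / n \rightarrow \mu<0\) almost surely. Combining the two gives \(f(x) \geq E\left[\left(x+S_{T}\right)^{+}\right]\).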