Problem 1

Suppose that \(S_{n} \stackrel{P}{\longrightarrow} s_{0}\), and that the function \(h\) is continuous at \(s_{0}\), that is, for any \(\varepsilon>0\) there exists a \(\delta>0\) such that \(|x-s_{0}|<\delta\) implies that \(|h(x)-h(s_{0})|<\varepsilon\). Explain why this implies that $$ \operatorname{Pr}\left(\left|S_{n}-s_{0}\right|<\delta\right) \leq \operatorname{Pr}\left\{\left|h\left(S_{n}\right)-h\left(s_{0}\right)\right|<\varepsilon\right\} \leq 1 $$ and deduce that \(\operatorname{Pr}\left\{\left|h\left(s_{0}\right)-h\left(S_{n}\right)\right|<\varepsilon\right\} \rightarrow 1\) as \(n \rightarrow \infty\). That is, \(h\left(S_{n}\right) \stackrel{P}{\longrightarrow} h\left(s_{0}\right)\).
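
A quick Monte Carlo sketch of this result (not part of the original problem; the choices \(S_n\) = mean of \(n\) Uniform(0,1) draws, so \(s_0 = 1/2\), and \(h = \exp\) are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
s0, eps = 0.5, 0.05
h = np.exp  # any function continuous at s0

for n in [10, 100, 1000]:
    # 5000 replicates of S_n = mean of n Uniform(0,1) draws, so S_n ->P 0.5
    S_n = rng.random((5000, n)).mean(axis=1)
    p = np.mean(np.abs(h(S_n) - h(s0)) < eps)
    print(n, p)  # estimated Pr(|h(S_n) - h(s0)| < eps), increasing toward 1
```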

Problem 2

Suppose that conditional on \(\mu\), \(X\) and \(Y\) are independent Poisson variables with means \(\mu\), but that \(\mu\) is a realization of a random variable with density \(\lambda^{\nu} \mu^{\nu-1} e^{-\lambda \mu} / \Gamma(\nu)\), \(\mu>0\), \(\nu, \lambda>0\). Show that the joint moment-generating function of \(X\) and \(Y\) is $$ \mathrm{E}\left(e^{s X+t Y}\right)=\lambda^{\nu}\left\{\lambda-\left(e^{s}-1\right)-\left(e^{t}-1\right)\right\}^{-\nu} $$ and hence find the mean and covariance matrix of \((X, Y)\). What happens if \(\lambda=\nu / \xi\) and \(\nu \rightarrow \infty\)?
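
The moments implied by this mixture can be checked by simulation. Iterated expectation gives \(\mathrm{E}(X)=\nu/\lambda\), \(\operatorname{var}(X)=\nu/\lambda+\nu/\lambda^{2}\), and \(\operatorname{cov}(X,Y)=\operatorname{var}(\mu)=\nu/\lambda^{2}\); the sketch below verifies these for the illustrative values \(\nu=3\), \(\lambda=2\):

```python
import numpy as np

rng = np.random.default_rng(1)
nu, lam = 3.0, 2.0  # shape nu, rate lam (illustrative values)
mu = rng.gamma(shape=nu, scale=1 / lam, size=200_000)
X = rng.poisson(mu)
Y = rng.poisson(mu)  # conditionally independent given mu

print(X.mean(), nu / lam)                     # E(X) = nu/lam
print(X.var(), nu / lam + nu / lam**2)        # var(X) = nu/lam + nu/lam^2
print(np.cov(X, Y)[0, 1], nu / lam**2)        # cov(X, Y) = var(mu) = nu/lam^2
```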

Problem 2

Let \(s_{0}\) be a constant. By writing $$ \operatorname{Pr}\left(\left|S_{n}-s_{0}\right| \leq \varepsilon\right)=\operatorname{Pr}\left(S_{n} \leq s_{0}+\varepsilon\right)-\operatorname{Pr}\left(S_{n} \leq s_{0}-\varepsilon\right) $$ for \(\varepsilon>0\), show that \(S_{n} \stackrel{D}{\longrightarrow} s_{0}\) implies that \(S_{n} \stackrel{P}{\longrightarrow} s_{0}\).
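
A small numerical illustration (an assumption for illustration only: take \(S_{n} \sim N(s_{0}, 1/n)\), which converges in distribution to the constant \(s_{0}\)):

```python
from math import erf, sqrt

def Phi(z):
    # standard normal CDF
    return 0.5 * (1 + erf(z / sqrt(2)))

s0, eps = 2.0, 0.1
for n in [10, 100, 1000]:
    sd = 1 / sqrt(n)
    # Pr(|S_n - s0| <= eps) = Pr(S_n <= s0+eps) - Pr(S_n <= s0-eps)
    p = Phi(eps / sd) - Phi(-eps / sd)
    print(n, p)  # tends to 1 as n grows
```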

Problem 3

Show that a binomial random variable \(R\) with denominator \(m\) and probability \(\pi\) has cumulant-generating function \(K(t)=m \log \left(1-\pi+\pi e^{t}\right)\). Find \(\lim K(t)\) as \(m \rightarrow \infty\) and \(\pi \rightarrow 0\) in such a way that \(m \pi \rightarrow \lambda>0\). Show that $$ \operatorname{Pr}(R=r) \rightarrow \frac{\lambda^{r}}{r !} e^{-\lambda} $$ and hence establish that \(R\) converges in distribution to a Poisson random variable. This yields the Poisson approximation to the binomial distribution, sometimes called the law of small numbers. For a numerical check in the S language, try `y <- 0:10; lambda <- 1; m <- 10; p <- lambda/m; round(cbind(y, pbinom(y, size = m, prob = p), ppois(y, lambda)), digits = 3)` with various other values of \(m\) and \(\lambda\).
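
An equivalent check in Python (assuming SciPy is available) compares the binomial and Poisson distribution functions side by side, mirroring the S call above:

```python
import numpy as np
from scipy.stats import binom, poisson

lam, m = 1.0, 10
y = np.arange(11)
# columns: y, binomial CDF with m*p = lam, Poisson CDF with mean lam
table = np.column_stack([y, binom.cdf(y, n=m, p=lam / m), poisson.cdf(y, lam)])
print(np.round(table, 3))
```

Increasing `m` while holding `lam` fixed should make the two CDF columns agree more closely.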

Problem 3

The Cauchy density (2.16) has no moment-generating function, but its characteristic function is \(\mathrm{E}\left(e^{i t Y}\right)=\exp (i t \theta-|t|)\), where \(i^{2}=-1\). Show that the average \(\bar{Y}\) of a random sample \(Y_{1}, \ldots, Y_{n}\) of such variables has the same characteristic function as \(Y_{1}\). What does this imply?
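
A simulation sketch of this stability property (illustrative values \(\theta=0\), \(n=50\)): the empirical quantiles of \(\bar{Y}\) should match the quantiles \(\tan\{\pi(p-\tfrac12)\}\) of a single standard Cauchy variable, however large \(n\) is.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
# 100_000 replicates of the mean of n standard Cauchy draws
Ybar = rng.standard_cauchy((100_000, n)).mean(axis=1)
qs = np.array([0.1, 0.25, 0.5, 0.75, 0.9])
print(np.quantile(Ybar, qs))           # empirical quantiles of the sample mean
print(np.tan(np.pi * (qs - 0.5)))      # quantiles of a single Cauchy variable
```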

Problem 3

(a) Let \(X\) and \(Y\) be two random variables with finite positive variances. Use the fact that \(\operatorname{var}(a X+Y) \geq 0\), with equality if and only if the linear combination \(a X+Y\) is constant with probability one, to show that \(\operatorname{cov}(X, Y)^{2} \leq \operatorname{var}(X) \operatorname{var}(Y)\); this is a version of the Cauchy-Schwarz inequality. Hence show that \(-1 \leq \operatorname{corr}(X, Y) \leq 1\), and say under what conditions equality is attained. (b) Show that if \(X\) and \(Y\) are independent, \(\operatorname{corr}(X, Y)=0\). Show that the converse is false by considering the variables \(X\) and \(Y=X^{2}-1\), where \(X\) has mean zero, variance one, and \(\mathrm{E}\left(X^{3}\right)=0\).
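
The counterexample in (b) is easy to see numerically (taking \(X\) standard normal, which satisfies the stated moment conditions):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal(1_000_000)  # mean 0, variance 1, E(X^3) = 0
Y = X**2 - 1
c = np.corrcoef(X, Y)[0, 1]
print(c)  # near 0, even though Y is a deterministic function of X
```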

Problem 3

Let \(X_{1}, \ldots, X_{n}\) be independent exponential variables with rates \(\lambda_{j}\). Show that \(Y=\min \left(X_{1}, \ldots, X_{n}\right)\) is also exponential, with rate \(\lambda_{1}+\cdots+\lambda_{n}\), and that \(\operatorname{Pr}\left(Y=X_{j}\right)=\lambda_{j} /\left(\lambda_{1}+\cdots+\lambda_{n}\right)\).
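
Both claims can be checked by simulation (illustrative rates \(\lambda = (1, 2, 3)\), so the minimum should be exponential with rate 6 and \(\operatorname{Pr}(Y=X_{1}) = 1/6\)):

```python
import numpy as np

rng = np.random.default_rng(4)
rates = np.array([1.0, 2.0, 3.0])
X = rng.exponential(1 / rates, size=(200_000, 3))  # columns are X_1, X_2, X_3
Y = X.min(axis=1)
p1 = (X.argmin(axis=1) == 0).mean()
print(Y.mean(), 1 / rates.sum())    # mean of an exponential with rate 6
print(p1, rates[0] / rates.sum())   # Pr(Y = X_1) = lambda_1 / sum(lambda)
```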

Problem 4

(a) Let \(X\) be the number of trials up to and including the first success in a sequence of independent Bernoulli trials having success probability \(\pi\). Show that \(\operatorname{Pr}(X=k)=\pi(1-\pi)^{k-1}\), \(k=1,2, \ldots\), and deduce that \(X\) has moment-generating function \(\pi e^{t} /\left\{1-(1-\pi) e^{t}\right\}\); hence find its mean and variance. \(X\) has the geometric distribution. (b) Now let \(Y_{n}\) be the number of trials up to and including the \(n\)th success in such a sequence of trials. Show that $$ \operatorname{Pr}\left(Y_{n}=k\right)=\binom{k-1}{n-1} \pi^{n}(1-\pi)^{k-n}, \quad k=n, n+1, \ldots; $$ this is the negative binomial distribution. Find the mean and variance of \(Y_{n}\), and show that as \(n \rightarrow \infty\) the sequence \(\left\{Y_{n}\right\}\) satisfies the conditions of the Central Limit Theorem. Deduce that $$ \lim _{n \rightarrow \infty} 2^{1-n} \sum_{k=0}^{n}\binom{k+n-1}{n-1} \frac{1}{2^{k}}=1. $$ (c) Find the limiting cumulant-generating function of \(\pi Y_{n} /(1-\pi)\) as \(\pi \rightarrow 0\), and hence show that the limiting distribution is gamma.
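
The limit in (b) can be evaluated directly. A numerically stable way is to generate the summands by the term-ratio recurrence rather than computing huge binomial coefficients (a sketch, not part of the problem):

```python
# Evaluate 2^{1-n} * sum_{k=0}^{n} C(k+n-1, n-1) / 2^k using the term
# recurrence t_0 = 2^{1-n}, t_{k+1} = t_k * (n+k) / (2*(k+1)).
def nb_limit(n):
    t, total = 2.0 ** (1 - n), 0.0
    for k in range(n + 1):
        total += t
        t *= (n + k) / (2.0 * (k + 1))
    return total

for n in [10, 100, 1000]:
    print(n, nb_limit(n))  # approaches 1 as n grows
```

The slow \(O(n^{-1/2})\) approach to 1 reflects the Central Limit Theorem rate.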

Problem 4

Let \(X_{1}, \ldots, X_{n}\) and \(Y_{1}, \ldots, Y_{n}\) be independent random samples from the exponential densities \(\lambda e^{-\lambda x}, x>0\), and \(\lambda^{-1} e^{-y / \lambda}, y>0\), with \(\lambda>0\). If \(\bar{X}\) and \(\bar{Y}\) are the sample averages, show that \(\bar{X} \bar{Y} \stackrel{P}{\longrightarrow} 1\) as \(n \rightarrow \infty\).
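
A simulation sketch of the claim (illustrative \(\lambda = 2\); since \(\bar{X} \stackrel{P}{\longrightarrow} 1/\lambda\) and \(\bar{Y} \stackrel{P}{\longrightarrow} \lambda\), the product should settle near 1):

```python
import numpy as np

rng = np.random.default_rng(5)
lam = 2.0
for n in [10, 100, 10_000]:
    X = rng.exponential(1 / lam, n)  # density lam * exp(-lam x), mean 1/lam
    Y = rng.exponential(lam, n)      # density lam^{-1} * exp(-y/lam), mean lam
    prod = X.mean() * Y.mean()
    print(n, prod)                   # -> 1 in probability
```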

Problem 5

Let \(Y_{1}, \ldots, Y_{n}\) be a random sample from a distribution with mean \(\mu\) and variance \(\sigma^{2}\). Find the mean of $$ T=\frac{1}{2 n(n-1)} \sum_{j \neq k}\left(Y_{j}-Y_{k}\right)^{2} $$ and by writing \(Y_{j}-Y_{k}=Y_{j}-\bar{Y}-\left(Y_{k}-\bar{Y}\right)\), show that \(T=S^{2}\).
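
The identity \(T=S^{2}\) can be confirmed numerically on an arbitrary sample (the full double sum over \(j \neq k\) against the usual one-pass sample variance):

```python
import numpy as np

rng = np.random.default_rng(6)
Y = rng.standard_normal(30)
n = Y.size
# all pairwise squared differences; the j == k terms contribute zero anyway
T = ((Y[:, None] - Y[None, :]) ** 2).sum() / (2 * n * (n - 1))
S2 = Y.var(ddof=1)  # sample variance with divisor n - 1
print(T, S2)        # identical up to floating-point rounding
```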
