Problem 1
Two balls are drawn successively without replacement from an urn containing three white and two red balls. Are the outcomes of the first and second draws independent? Are they exchangeable?
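The joint distribution of the two draws can be enumerated exactly. A quick check (Python sketch, exact rational arithmetic over the 20 equally likely ordered pairs; everything here is implied by the problem):

```python
from fractions import Fraction
from itertools import permutations

# Urn: 3 white (W), 2 red (R); draw two balls without replacement.
balls = ["W"] * 3 + ["R"] * 2
outcomes = list(permutations(range(5), 2))  # 20 equally likely ordered pairs
total = Fraction(len(outcomes))

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o))) / total

p_w1 = prob(lambda o: balls[o[0]] == "W")    # first draw white
p_w2 = prob(lambda o: balls[o[1]] == "W")    # second draw white
p_w1w2 = prob(lambda o: balls[o[0]] == "W" and balls[o[1]] == "W")
p_wr = prob(lambda o: balls[o[0]] == "W" and balls[o[1]] == "R")
p_rw = prob(lambda o: balls[o[0]] == "R" and balls[o[1]] == "W")

print(p_w1, p_w2)             # both 3/5: marginals agree
print(p_wr == p_rw)           # True: (W,R) and (R,W) equally likely (exchangeable)
print(p_w1w2 == p_w1 * p_w2)  # False: draws are not independent
```

The marginals coincide and order does not matter, yet the joint probability is not the product of the marginals.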
Problem 2
Consider estimating the success probability \(\theta\) for a binomial variable \(R\) with denominator \(m\), using a beta prior distribution with parameters \(a, b>0\). (a) Show that the marginal probability \(\operatorname{Pr}(R=r \mid \mu, v)\) has beta-binomial form $$ \frac{\Gamma(v)}{\Gamma(v \mu) \Gamma\\{v(1-\mu)\\}}\left(\begin{array}{c} m \\ r \end{array}\right) \frac{\Gamma(r+v \mu) \Gamma\\{m-r+v(1-\mu)\\}}{\Gamma(m+v)}, \quad r=0, \ldots, m $$ where \(\mu=a /(a+b)\) and \(v=a+b\), and deduce that $$ \mathrm{E}(R / m)=\mu, \quad \operatorname{var}(R / m)=\frac{\mu(1-\mu)}{m}\left(1+\frac{m-1}{v+1}\right) $$ (b) Show that method of moments estimators based on a random sample \(R_{1}, \ldots, R_{n}\), all with denominator \(m\), are $$ \widehat{\mu}=\bar{R}, \quad \widehat{v}=\frac{\widehat{\mu}(1-\widehat{\mu})-S^{2}}{S^{2}-\widehat{\mu}(1-\widehat{\mu}) / m} $$ where \(\bar{R}\) and \(S^{2}\) are the sample average and variance of the \(R_{j} / m\). (c) Find the mean and variance of the conditional distribution of \(\theta\) given \(R\), and show that the mean can be written as a shrinking of \(R / m\) towards \(\mu\). Hence give the empirical Bayes estimates of the \(\theta_{j}\).
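As a numerical sanity check of the moment formulas in (a), one can sum the beta-binomial pmf directly. The values of \(m, \mu, v\) below are illustrative, not from the problem:

```python
from math import lgamma, exp, comb

def beta_binom_pmf(r, m, mu, v):
    """Beta-binomial pmf of part (a), with mu = a/(a+b), v = a+b."""
    return comb(m, r) * exp(
        lgamma(v) - lgamma(v * mu) - lgamma(v * (1 - mu))
        + lgamma(r + v * mu) + lgamma(m - r + v * (1 - mu)) - lgamma(m + v))

m, mu, v = 10, 0.3, 5.0  # illustrative values only
pmf = [beta_binom_pmf(r, m, mu, v) for r in range(m + 1)]
mean = sum(r / m * p for r, p in zip(range(m + 1), pmf))
var = sum((r / m - mean) ** 2 * p for r, p in zip(range(m + 1), pmf))
print(mean, var)  # should match mu and mu*(1-mu)/m * (1 + (m-1)/(v+1))
```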
Problem 3
Let \(Y_{1}, \ldots, Y_{n}\) be a random sample from the uniform distribution on \((0, \theta)\), and take as prior the Pareto density with parameters \(\beta\) and \(\lambda\), $$ \pi(\theta)=\beta \lambda^{\beta} \theta^{-\beta-1}, \quad \theta>\lambda, \quad \beta, \lambda>0 $$ (a) Find the prior distribution function and quantiles for \(\theta\), and hence give prior one- and two-sided credible intervals for \(\theta\). If \(\beta>1\), find the prior mean of \(\theta\). (b) Show that the posterior density of \(\theta\) is Pareto with parameters \(n+\beta\) and \(\max \left\\{Y_{1}, \ldots, Y_{n}, \lambda\right\\}\), and hence give posterior credible intervals and the posterior mean for \(\theta\). (c) Interpret \(\lambda\) and \(\beta\) in terms of a prior sample from the uniform density.
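The prior quantiles in (a) and the conjugate update in (b) are easy to tabulate numerically, using the Pareto distribution function \(F(\theta) = 1-(\lambda/\theta)^{\beta}\) for \(\theta > \lambda\). The hyperparameters and data below are illustrative only:

```python
# Pareto(beta, lam) prior: F(theta) = 1 - (lam/theta)**beta for theta > lam.
def pareto_quantile(p, beta, lam):
    return lam * (1 - p) ** (-1 / beta)

beta, lam = 3.0, 2.0  # illustrative hyperparameters
# Equi-tailed 95% prior credible interval for theta:
lo, hi = pareto_quantile(0.025, beta, lam), pareto_quantile(0.975, beta, lam)
prior_mean = beta * lam / (beta - 1)  # exists because beta > 1

# Posterior after data y is Pareto(n + beta, max{max(y), lam}):
y = [1.2, 2.7, 0.8, 3.1]
n = len(y)
post_beta, post_lam = n + beta, max(max(y), lam)
post_mean = post_beta * post_lam / (post_beta - 1)
print(lo, hi, prior_mean, post_mean)
```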
Problem 4
(a) Let \(y_{1}, \ldots, y_{n}\) be a Poisson random sample with mean \(\theta\), and suppose that the prior density for \(\theta\) is gamma, $$ \pi(\theta)=g(\theta ; \alpha, \lambda)=\frac{\lambda^{\alpha} \theta^{\alpha-1}}{\Gamma(\alpha)} \exp (-\lambda \theta), \quad \theta>0, \lambda, \alpha>0 $$ Show that the posterior density of \(\theta\) is \(g\left(\theta ; \alpha+\sum y_{j}, \lambda+n\right)\), and find conditions under which the posterior density remains proper as \(\alpha \downarrow 0\) even though the prior density becomes improper in the limit. (b) Show that \(\int \theta g(\theta ; \alpha, \lambda) d \theta=\alpha / \lambda\). Find the prior and posterior means \(\mathrm{E}(\theta)\) and \(\mathrm{E}(\theta \mid y)\), and hence give an interpretation of the prior parameters. (c) Let \(Z\) be a new Poisson variable independent of \(Y_{1}, \ldots, Y_{n}\), also with mean \(\theta\). Find its posterior predictive density. To what density does this converge as \(n \rightarrow \infty\)? Does this make sense?
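The conjugate update in (a) and the predictive in (c) can be checked numerically: integrating the Poisson pmf against the gamma posterior should reproduce the closed-form negative binomial predictive. Prior parameters and data below are illustrative:

```python
from math import exp, lgamma, log

def gamma_pdf(t, a, lam):
    return exp(a * log(lam) + (a - 1) * log(t) - lam * t - lgamma(a))

y = [2, 0, 3, 1, 4]
alpha, lam = 1.5, 2.0               # illustrative prior parameters
a_post, l_post = alpha + sum(y), lam + len(y)
post_mean_exact = a_post / l_post   # posterior mean (alpha + sum y)/(lambda + n)

def predictive(z, grid_n=40000, upper=30.0):
    """Pr(Z = z | y) by midpoint integration of Poisson(z; t) * posterior(t)."""
    h = upper / grid_n
    ts = [(i + 0.5) * h for i in range(grid_n)]
    return h * sum(exp(z * log(t) - t - lgamma(z + 1)) * gamma_pdf(t, a_post, l_post)
                   for t in ts)

def negbin(z):
    """Closed-form negative binomial predictive from the gamma-Poisson mixture."""
    return exp(lgamma(z + a_post) - lgamma(a_post) - lgamma(z + 1)
               + a_post * log(l_post / (l_post + 1)) - z * log(l_post + 1))

print(predictive(2), negbin(2))  # should agree closely
```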
Problem 5
Show that if \(y_{1}, \ldots, y_{n}\) is a random sample from an exponential family with conjugate prior \(\pi(\theta \mid \lambda, m)\), any finite mixture of conjugate priors, $$ \sum_{j=1}^{k} p_{j} \pi\left(\theta \mid \lambda_{j}, m_{j}\right), \quad \sum_{j} p_{j}=1, p_{j} \geq 0 $$ is also conjugate. Check the details when \(y_{1}, \ldots, y_{n}\) is a random sample from the Bernoulli distribution with probability \(\theta\).
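For the Bernoulli case the posterior of a beta-mixture prior is again a beta mixture: each component is updated conjugately and the weights are reweighted by the component marginal likelihoods. A sketch with illustrative components and data:

```python
from math import lgamma, exp

def log_beta(a, b):
    """log of the beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

# Mixture of Beta(a_j, b_j) priors with weights p_j (illustrative values).
comps = [(1.0, 1.0), (5.0, 2.0)]
weights = [0.6, 0.4]

y = [1, 0, 1, 1, 0, 1]   # Bernoulli data
s, n = sum(y), len(y)

# Posterior: mixture of Beta(a_j + s, b_j + n - s) with weights
# p_j' proportional to p_j * B(a_j + s, b_j + n - s) / B(a_j, b_j).
new = [w * exp(log_beta(a + s, b + n - s) - log_beta(a, b))
       for w, (a, b) in zip(weights, comps)]
tot = sum(new)
post_weights = [w / tot for w in new]
post_comps = [(a + s, b + n - s) for a, b in comps]
print(post_weights, post_comps)
```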
Problem 6
Two independent samples \(Y_{1}, \ldots, Y_{n} \stackrel{\text { iid }}{\sim} N\left(\mu, \sigma^{2}\right)\) and \(X_{1}, \ldots, X_{m} \stackrel{\text { iid }}{\sim} N\left(\mu, c \sigma^{2}\right)\) are available, where \(c>0\) is known. Find posterior densities for \(\mu\) and \(\sigma\) based on prior \(\pi(\mu, \sigma) \propto 1 / \sigma\).
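A crude grid approximation to the joint posterior (illustrative data; the exact marginal of \(\mu\) is a scaled \(t\)) suggests that the posterior mean of \(\mu\) should be the precision-weighted average \(\left(\sum y_{j}+\sum x_{i}/c\right)/(n+m/c)\):

```python
from math import exp, log

y = [4.1, 5.0, 3.8, 4.6]   # illustrative data with variance sigma^2
x = [4.9, 4.2, 5.4]        # illustrative data with variance c * sigma^2
c, n, m = 2.0, len(y), len(x)

def log_post(mu, sigma):
    """log pi(mu, sigma | y, x) up to a constant, prior pi(mu, sigma) ~ 1/sigma."""
    ssy = sum((v - mu) ** 2 for v in y)
    ssx = sum((v - mu) ** 2 for v in x)
    return (-(n + m + 1) * log(sigma)
            - ssy / (2 * sigma ** 2) - ssx / (2 * c * sigma ** 2))

# Marginalise sigma numerically on a grid, then average mu.
mus = [3.0 + i * 0.01 for i in range(400)]
sigmas = [0.05 + i * 0.05 for i in range(100)]
marg = [sum(exp(log_post(mu, s)) for s in sigmas) for mu in mus]
tot = sum(marg)
post_mean_mu = sum(mu * w for mu, w in zip(mus, marg)) / tot
print(post_mean_mu)  # near (sum(y) + sum(x)/c) / (n + m/c) = 4.5 for these data
```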
Problem 7
Consider predicting the outcome of a future random variable \(Z\) on the basis of a random sample \(Y_{1}, \ldots, Y_{n}\) from density \(\lambda^{-1} e^{-u / \lambda}, u>0, \lambda>0\). Show that \(\pi(\lambda) \propto \lambda^{-1}\) gives posterior predictive density $$ f(z \mid y)=\frac{\int f(z, y \mid \lambda) \pi(\lambda) d \lambda}{\int f(y \mid \lambda) \pi(\lambda) d \lambda}=n s^{n} /(s+z)^{n+1}, \quad z>0 $$ where \(s=y_{1}+\cdots+y_{n}\). Show that when Laplace's method is applied to each integral in the predictive density the result is proportional to the exact answer, and assess how close the approximation is to a density when \(n=5\).
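One version of the calculation (Laplace applied to each integral directly in the \(\lambda\) parametrization; the constant depends on this choice) multiplies the exact predictive by a \(z\)-free constant: maximising \(-(k+1)\log\lambda - t/\lambda\) at \(\hat\lambda = t/(k+1)\) for each integral gives approximate predictive \(C \cdot n s^{n}/(s+z)^{n+1}\) with \(C = (n+2)^{n+1/2}/\\{e\,n\,(n+1)^{n-1/2}\\}\). Its value measures how close the approximation is to a density:

```python
from math import exp

# Total mass of the Laplace-based approximate predictive density
# (exact density has mass 1); evaluated at n = 5 as the problem asks.
n = 5
C = (n + 2) ** (n + 0.5) / (exp(1) * n * (n + 1) ** (n - 0.5))
print(C)  # slightly above 1, so the approximation is nearly a proper density
```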
Problem 8
How would you express prior ignorance about an angle? About the position of a star in the firmament?
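One standard answer, sketched numerically: use the rotation-invariant (uniform) distribution in each case. For an angle this is uniform on \([0, 2\pi)\); for a direction on the celestial sphere it is uniform in longitude and uniform in the cosine of the colatitude, not uniform in the colatitude itself. The sample sizes and seed below are arbitrary:

```python
import random
from math import pi, acos, cos

rng = random.Random(1)

# Ignorance about an angle: uniform on [0, 2*pi).
angles = [rng.uniform(0, 2 * pi) for _ in range(50000)]

# Ignorance about a point on the sphere: uniform longitude phi, and
# colatitude theta drawn so that cos(theta) is uniform on [-1, 1].
points = [(rng.uniform(0, 2 * pi), acos(rng.uniform(-1, 1)))
          for _ in range(50000)]
mean_cos = sum(cos(th) for _, th in points) / len(points)
print(mean_cos)  # near 0: no preferred hemisphere
```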
Problem 9
Let \(X_{1}, \ldots, X_{n}\) be a Poisson random sample with mean \(\mu\). Previous experience suggests prior density $$ \pi(\mu)=\frac{1}{\Gamma(v)} \mu^{v-1} e^{-\mu}, \quad 0<\mu<\infty, v>0 $$ If the loss function for an estimator \(\tilde{\mu}\) of \(\mu\) is \((\tilde{\mu}-\mu)^{2}\), determine an estimator that minimizes the expected loss and compare its bias and variance with those of the maximum likelihood estimator.
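Under squared-error loss the Bayes estimator is the posterior mean, here \((v+\sum X_{j})/(n+1)\) since the prior is gamma with shape \(v\) and unit rate. A seeded simulation with illustrative \(v, \mu, n\) compares its bias and variance with the MLE \(\bar{X}\):

```python
import random
from math import exp

rng = random.Random(7)

def poisson(mu):
    """Poisson sampler via products of uniforms (fine for small mu)."""
    L, k, p = exp(-mu), 0, rng.random()
    while p > L:
        p *= rng.random()
        k += 1
    return k

v, mu, n, reps = 2.0, 3.0, 10, 20000   # illustrative values
bayes, mle = [], []
for _ in range(reps):
    s = sum(poisson(mu) for _ in range(n))
    bayes.append((v + s) / (1 + n))    # posterior mean of Gamma(v + s, 1 + n)
    mle.append(s / n)

mb = sum(bayes) / reps
bias_bayes = mb - mu                                  # theory: (v - mu)/(n + 1)
var_bayes = sum((b - mb) ** 2 for b in bayes) / reps  # theory: n*mu/(n + 1)**2
mm = sum(mle) / reps
bias_mle = mm - mu                                    # theory: 0
var_mle = sum((e - mm) ** 2 for e in mle) / reps      # theory: mu/n
print(bias_bayes, var_bayes, bias_mle, var_mle)
```

The Bayes estimator trades a small bias towards the prior mean for a smaller variance than the unbiased MLE.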
Problem 10
According to the principle of insufficient reason, probabilities should be ascribed uniformly to finite sets unless there is some definite reason to do otherwise. Thus the most natural way to express prior ignorance for a parameter \(\theta\) that inhabits a finite parameter space \(\left\\{\theta_{1}, \ldots, \theta_{k}\right\\}\) is to set \(\pi\left(\theta_{1}\right)=\cdots=\pi\left(\theta_{k}\right)=1 / k\). Let \(\pi_{i}=\pi\left(\theta_{i}\right)\). Consider a parameter space \(\left\\{\theta_{1}, \theta_{2}\right\\}\), where \(\theta_{1}\) denotes that there is life in orbit around the star Sirius and \(\theta_{2}\) that there is not. Can you see any reason not to take \(\pi_{1}=\pi_{2}=1 / 2\)? Now consider the parameter space \(\left\\{\omega_{1}, \omega_{2}, \omega_{3}\right\\}\), where \(\omega_{1}, \omega_{2}\), and \(\omega_{3}\) denote the events that there is life around Sirius, that there are planets but no life, and that there are no planets. With this parameter space the principle of insufficient reason gives Pr(life around Sirius) \(=1 / 3\). Discuss this partitioning paradox. What solutions do you see? (Shafer, 1976, pp. 23–24)