Chapter 11: Problem 2
In the proof of \(11.1.1\), we considered the case in which \(p_{3}\) … [statement truncated in source]
Consider the Bayes model $$ \begin{aligned} X_{i} \mid \theta & \sim \operatorname{iid} \Gamma\left(1, \frac{1}{\theta}\right) \\ \Theta \mid \beta & \sim \Gamma(2, \beta) \end{aligned} $$ By performing the following steps obtain the empirical Bayes estimate of \(\theta\). (a) Obtain the likelihood function $$ m(\mathbf{x} \mid \beta)=\int_{0}^{\infty} f(\mathbf{x} \mid \theta) h(\theta \mid \beta) d \theta $$ (b) Obtain the mle \(\widehat{\beta}\) of \(\beta\) for the likelihood \(m(\mathbf{x} \mid \beta)\). (c) Show that the posterior distribution of \(\Theta\) given \(\mathbf{x}\) and \(\widehat{\beta}\) is a gamma distribution.
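The steps above can be checked numerically. Under the scale parameterization (so \(X_i \mid \theta\) has rate \(\theta\) and the prior has rate \(1/\beta\)), the marginal works out to \(m(\mathbf{x} \mid \beta) = \Gamma(n+2)\,\beta^{-2}(y + 1/\beta)^{-(n+2)}\) with \(y = \sum x_i\), whose maximizer is \(\widehat{\beta} = n/(2y)\), and the resulting posterior mean reduces to \(n/y = 1/\bar{x}\). The sketch below, using hypothetical data `x`, verifies the MLE by grid search and the algebraic simplification of the empirical Bayes estimate:

```python
import math

# Hypothetical data for illustration (not from the text).
x = [0.8, 1.3, 0.4, 2.1, 0.9]
n, y = len(x), sum(x)

def log_m(beta):
    # log m(x | beta) = log Gamma(n+2) - 2 log beta - (n+2) log(y + 1/beta)
    return math.lgamma(n + 2) - 2 * math.log(beta) - (n + 2) * math.log(y + 1 / beta)

# Closed-form stationary point of log_m: beta_hat = n / (2y).
beta_hat = n / (2 * y)

# Numerical sanity check: grid search around beta_hat recovers the maximizer.
grid = [beta_hat * (0.5 + 0.01 * k) for k in range(101)]
best = max(grid, key=log_m)
assert abs(best - beta_hat) < 0.02 * beta_hat

# Posterior Theta | x, beta_hat ~ Gamma(n + 2, 1/(y + 1/beta_hat));
# its mean, the empirical Bayes estimate of theta, simplifies to n / y.
eb_estimate = (n + 2) / (y + 1 / beta_hat)
assert abs(eb_estimate - n / y) < 1e-9
print(round(eb_estimate, 4))
```

The data values and variable names here are illustrative assumptions; only the formulas follow from the model above.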
Consider the Bayes model \(X_{i} \mid \theta, i=1,2, \ldots, n \sim\) iid with distribution \(\Gamma(1, \theta), \theta>0\) $$ \Theta \sim h(\theta) \propto \frac{1}{\theta} $$ (a) Show that \(h(\theta)\) is in the class of Jeffreys priors. (b) Show that the posterior pdf is $$ h(\theta \mid y) \propto\left(\frac{1}{\theta}\right)^{n+2-1} e^{-y / \theta} $$ where \(y=\sum_{i=1}^{n} x_{i}\). (c) Show that if \(\tau=\theta^{-1}\) then the posterior \(k(\tau \mid y)\) is the pdf of a \(\Gamma(n, 1 / y)\) distribution. (d) Determine the posterior pdf of \(2 y \tau\). Use it to obtain a \((1-\alpha) 100 \%\) credible interval for \(\theta\). (e) Use the posterior pdf in Part (d) to determine a Bayesian test for the hypotheses \(H_{0}: \theta \geq \theta_{0}\) versus \(H_{1}: \theta<\theta_{0}\), where \(\theta_{0}\) is specified.
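Since Part (c) gives \(\tau \mid y \sim \Gamma(n, 1/y)\), a credible interval for \(\theta = 1/\tau\) can be checked by simulation, without chi-square tables. A minimal Monte Carlo sketch, assuming hypothetical values of \(n\) and \(y\):

```python
import random

random.seed(0)
n, y = 8, 12.4   # hypothetical sample size and y = sum of the x_i
alpha = 0.05

# Part (c): tau | y ~ Gamma(n, scale 1/y); equivalently 2*y*tau ~ chi^2(2n).
# Simulate the posterior of theta = 1/tau and take equal-tail quantiles.
draws = sorted(1.0 / random.gammavariate(n, 1.0 / y) for _ in range(100_000))
lo = draws[int((alpha / 2) * len(draws))]
hi = draws[int((1 - alpha / 2) * len(draws))]
print(f"{(1 - alpha) * 100:.0f}% credible interval for theta: ({lo:.3f}, {hi:.3f})")
```

The endpoints should agree with the closed form \(\left(2y/\chi^{2}_{1-\alpha/2}(2n),\; 2y/\chi^{2}_{\alpha/2}(2n)\right)\) from Part (d); the numbers used are assumptions, not data from the text.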
Let \(X_{1}, X_{2}, \ldots, X_{10}\) be a random sample of size \(n=10\) from a gamma distribution with \(\alpha=3\) and \(\beta=1 / \theta\). Suppose we believe that \(\theta\) has a gamma distribution with \(\alpha=10\) and \(\beta=2\). (a) Find the posterior distribution of \(\theta\). (b) If the observed \(\bar{x}=18.2\), what is the Bayes point estimate associated with the squared-error loss function? (c) What is the Bayes point estimate using the mode of the posterior distribution? (d) Comment on an HDR interval estimate for \(\theta\). Would it be easier to find one having equal tail probabilities? Hint: Can the posterior distribution be related to a chi-square distribution?
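With this conjugate setup the posterior is \(\Gamma\big(n\alpha + 10,\; 1/(\sum x_i + 1/2)\big)\), so Parts (b) and (c) are a short arithmetic check. A sketch under that reading of the scale parameterization:

```python
# Conjugate gamma update: shape n*alpha + 10, rate sum_x + 1/2.
n, alpha, xbar = 10, 3, 18.2
sum_x = n * xbar                       # 182.0
post_shape = alpha * n + 10            # 40
post_rate = sum_x + 1 / 2              # 182.5

mean = post_shape / post_rate          # Part (b): squared-error-loss estimate
mode = (post_shape - 1) / post_rate    # Part (c): posterior mode
print(round(mean, 5), round(mode, 5))
```

The mode sits just below the mean, as expected for a right-skewed gamma posterior; the closed-form posterior parameters are derived here, not quoted from the text.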
Consider the Bayes model \(X_{i} \mid \theta, i=1,2, \ldots, n \sim\) iid with distribution Poisson \((\theta), \theta>0\) $$ \Theta \sim h(\theta) \propto \theta^{-1 / 2} $$ (a) Show that \(h(\theta)\) is in the class of Jeffreys priors. (b) Show that the posterior pdf of \(2 n \theta\) is the pdf of a \(\chi^{2}(2 y+1)\) distribution, where \(y=\sum_{i=1}^{n} x_{i}\). (c) Use the posterior pdf of Part (b) to obtain a \((1-\alpha) 100 \%\) credible interval for \(\theta\). (d) Use the posterior pdf in Part (b) to determine a Bayesian test for the hypotheses \(H_{0}: \theta \geq \theta_{0}\) versus \(H_{1}: \theta<\theta_{0}\), where \(\theta_{0}\) is specified.
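Part (b) amounts to \(\theta \mid y \sim \Gamma(y + 1/2,\; 1/n)\), since then \(2n\theta \sim \Gamma\big(y + 1/2,\; 2\big) = \chi^{2}(2y+1)\). Both the credible interval of Part (c) and the posterior probability needed for the test in Part (d) can be sketched by simulation, assuming hypothetical values of \(n\), \(y\), and \(\theta_0\):

```python
import random

random.seed(1)
n, y = 20, 37          # hypothetical n and y = sum of the x_i
alpha = 0.10

# Part (b): theta | y ~ Gamma(y + 1/2, scale 1/n), i.e. 2*n*theta ~ chi^2(2y+1).
draws = sorted(random.gammavariate(y + 0.5, 1.0 / n) for _ in range(100_000))

# Part (c): equal-tail credible interval for theta.
lo = draws[int((alpha / 2) * len(draws))]
hi = draws[int((1 - alpha / 2) * len(draws))]
print(f"90% credible interval for theta: ({lo:.3f}, {hi:.3f})")

# Part (d): a Bayesian test rejects H0: theta >= theta0 when the posterior
# probability of H0 is small.
theta0 = 2.5
post_prob_h0 = sum(d >= theta0 for d in draws) / len(draws)
```

All numeric inputs here are illustrative assumptions; only the gamma/chi-square identity comes from the problem.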
In Example 11.2.2 let \(n=30, \alpha=10\), and \(\beta=5\) so that \(\delta(y)=(10+y) / 45\) is the Bayes' estimate of \(\theta\). (a) If \(Y\) has a binomial distribution \(b(30, \theta)\), compute the risk \(E\left\{[\theta-\delta(Y)]^{2}\right\}\). (b) Find values of \(\theta\) for which the risk of Part (a) is less than \(\theta(1-\theta) / 30\), the risk associated with the maximum likelihood estimator \(Y / n\) of \(\theta\).
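For Part (a), the risk decomposes as variance plus squared bias: \(\operatorname{Var}[\delta(Y)] = 30\theta(1-\theta)/45^2\) and \(\operatorname{bias} = (10+30\theta)/45 - \theta\). A short sketch that evaluates this risk and scans for the region of Part (b) where it beats the MLE risk:

```python
def bayes_risk(theta):
    # E{[theta - delta(Y)]^2} with delta(y) = (10 + y)/45 and Y ~ b(30, theta):
    # variance of delta(Y) plus squared bias.
    var = 30 * theta * (1 - theta) / 45**2
    bias = (10 + 30 * theta) / 45 - theta
    return var + bias**2

def mle_risk(theta):
    # Risk of the MLE Y/n.
    return theta * (1 - theta) / 30

# Grid scan over (0, 1) for where the Bayes estimator has smaller risk.
better = [t / 1000 for t in range(1, 1000) if bayes_risk(t / 1000) < mle_risk(t / 1000)]
print(min(better), max(better))
```

The scan should recover an interval around \(\theta = 1/2\); solving \(262.5\theta^2 - 337.5\theta + 100 < 0\) exactly gives roughly \((0.463,\ 0.823)\), a derivation done here rather than taken from the text.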