
Problem 1

Let \(Y\) have a binomial distribution in which \(n=20\) and \(p=\theta\). The prior probabilities on \(\theta\) are \(P(\theta=0.3)=2 / 3\) and \(P(\theta=0.5)=1 / 3\). If \(y=9\), what are the posterior probabilities for \(\theta=0.3\) and \(\theta=0.5\) ?
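A quick numerical check (a sketch; the values follow directly from Bayes' theorem with a binomial likelihood and the two-point prior above):

```python
from math import comb

# Two-point prior on theta; binomial likelihood with n = 20, y = 9.
n, y = 20, 9
prior = {0.3: 2 / 3, 0.5: 1 / 3}

# Unnormalized posterior: prior probability times the binomial pmf at y.
unnorm = {t: pr * comb(n, y) * t**y * (1 - t) ** (n - y) for t, pr in prior.items()}
total = sum(unnorm.values())
posterior = {t: u / total for t, u in unnorm.items()}

print(posterior)  # posterior probabilities for theta = 0.3 and theta = 0.5
```

Since \(y=9\) is close to \(n/2\), the data shift mass toward \(\theta=0.5\): the posteriors come out near 0.449 for \(\theta=0.3\) and 0.551 for \(\theta=0.5\).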

Problem 2

Suppose \(Y\) has a \(\Gamma(1,1)\) distribution while \(X\) given \(Y\) has the conditional pdf $$ f(x \mid y)=\left\{\begin{array}{ll} e^{-(x-y)} & 0<y<x<\infty \\ 0 & \text { elsewhere. } \end{array}\right. $$

Problem 3

Consider the Bayes model $$ \begin{aligned} X_{i} \mid \theta & \sim \operatorname{iid} \Gamma\left(1, \frac{1}{\theta}\right) \\ \Theta \mid \beta & \sim \Gamma(2, \beta) \end{aligned} $$ By performing the following steps, obtain the empirical Bayes estimate of \(\theta\). (a) Obtain the likelihood function $$ m(\mathbf{x} \mid \beta)=\int_{0}^{\infty} f(\mathbf{x} \mid \theta) h(\theta \mid \beta) d \theta $$ (b) Obtain the mle \(\widehat{\beta}\) of \(\beta\) for the likelihood \(m(\mathbf{x} \mid \beta)\). (c) Show that the posterior distribution of \(\Theta\) given \(\mathbf{x}\) and \(\widehat{\beta}\) is a gamma distribution. (d) Assuming squared-error loss, obtain the empirical Bayes estimator.
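The steps above can be checked numerically. Since \(\Gamma(1, 1/\theta)\) is the exponential distribution with rate \(\theta\), part (a) gives \(m(\mathbf{x} \mid \beta) = \Gamma(n+2)\,\beta^{-2}(y + 1/\beta)^{-(n+2)}\) with \(y=\sum x_i\), and calculus for part (b) yields \(\widehat{\beta}=n/(2y)\). The sketch below (with made-up data) confirms the mle by grid search and evaluates the resulting posterior mean:

```python
import math

# Hypothetical sample from the Gamma(1, 1/theta) (i.e., exponential) model.
x = [1.2, 0.5, 2.3, 0.8, 1.7]
n, y = len(x), sum(x)

# Part (a): m(x | beta) = Gamma(n+2) / (beta^2 * (y + 1/beta)^(n+2)), so up to
# constants  log m = -2*log(beta) - (n+2)*log(y + 1/beta).
def log_m(beta):
    return -2 * math.log(beta) - (n + 2) * math.log(y + 1 / beta)

# Part (b): maximize numerically; calculus gives beta_hat = n / (2y).
grid = [i / 10000 for i in range(1, 20000)]
beta_hat_numeric = max(grid, key=log_m)
beta_hat_closed = n / (2 * y)

# Parts (c)-(d): the posterior is Gamma(n+2, 1/(y + 1/beta_hat)); its mean
# (the Bayes estimate under squared-error loss) simplifies to n / y = 1/xbar.
eb_estimate = (n + 2) / (y + 1 / beta_hat_closed)

print(beta_hat_numeric, beta_hat_closed, eb_estimate)
```

Note the tidy conclusion: the empirical Bayes estimate collapses to \(n/y = 1/\bar{x}\), the usual mle of \(\theta\).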

Problem 4

Let \(X_{1}, X_{2}\) be a random sample from a Cauchy distribution with pdf $$ f\left(x ; \theta_{1}, \theta_{2}\right)=\left(\frac{1}{\pi}\right) \frac{\theta_{2}}{\theta_{2}^{2}+\left(x-\theta_{1}\right)^{2}}, \quad-\infty<x<\infty, $$ where \(-\infty<\theta_{1}<\infty\) and \(\theta_{2}>0\).

Problem 5

Consider the hierarchical Bayes model $$ \begin{aligned} Y \mid p & \sim b(n, p), \quad 0<p<1 \\ p \mid \theta & \sim h(p \mid \theta)=\theta p^{\theta-1}, \quad 0<p<1, \; \theta>0 \\ \theta & \sim \Gamma(1, a), \quad a>0 \text { is specified. } \end{aligned} $$ (a) Assuming squared-error loss, write the Bayes estimate of \(p\) as in expression (11.4.3). Integrate relative to \(\theta\) first. Show that both the numerator and denominator are expectations of a beta distribution with parameters \(y+1\) and \(n-y+1\). (b) Recall the discussion around expression (11.3.2). Write an explicit Monte Carlo algorithm to obtain the Bayes estimate in part (a).
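A Monte Carlo algorithm of the kind part (b) asks for can be sketched as follows, assuming the prior stages \(p \mid \theta \sim \theta p^{\theta-1}\) on \((0,1)\) and \(\theta \sim \Gamma(1,a)\); the values of \(n\), \(y\), and \(a\) below are made up for illustration. Sampling \(p \mid \theta\) uses inversion, since its CDF is \(p^{\theta}\):

```python
import random

random.seed(123)
n, y, a = 20, 12, 2.0   # hypothetical data and hyperparameter
N = 100_000             # Monte Carlo sample size

# Draw (theta, p) from the prior stages; weight by the binomial likelihood.
# The Bayes estimate E(p | y) is then a ratio of weighted averages.
num = den = 0.0
for _ in range(N):
    theta = random.gammavariate(1.0, a)   # theta ~ Gamma(1, a)
    p = random.random() ** (1.0 / theta)  # p | theta: CDF p^theta, inverse U^(1/theta)
    w = p**y * (1 - p) ** (n - y)         # binomial pmf up to a constant that cancels
    num += p * w
    den += w

bayes_estimate = num / den
print(bayes_estimate)
```

This is the self-normalized importance-sampling form of the ratio in part (a): both numerator and denominator are prior expectations estimated from the same draws.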

Problem 6

Let \(X_{1}, X_{2}, \ldots, X_{10}\) be a random sample of size \(n=10\) from a gamma distribution with \(\alpha=3\) and \(\beta=1 / \theta\). Suppose we believe that \(\theta\) has a gamma distribution with \(\alpha=10\) and \(\beta=2\). (a) Find the posterior distribution of \(\theta\). (b) If the observed \(\bar{x}=18.2\), what is the Bayes point estimate associated with the squared-error loss function? (c) What is the Bayes point estimate using the mode of the posterior distribution? (d) Comment on an HDR interval estimate for \(\theta\). Would it be easier to find one having equal tail probabilities? Hint: Can the posterior distribution be related to a chi-square distribution?
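The conjugate update can be sketched numerically. The likelihood contributes \(\theta^{3n}e^{-\theta\sum x_i}\) and the \(\Gamma(10,2)\) prior contributes \(\theta^{9}e^{-\theta/2}\), so the posterior is gamma with shape \(3n+10\) and rate \(n\bar{x}+\tfrac12\):

```python
# Conjugate update for this model: Gamma(3, 1/theta) data (n = 10) with a
# Gamma(10, 2) prior on theta gives posterior Gamma(3n + 10, rate n*xbar + 1/2).
n, xbar = 10, 18.2

shape = 3 * n + 10          # 40
rate = n * xbar + 1 / 2     # 182.5

post_mean = shape / rate         # part (b): Bayes estimate under squared-error loss
post_mode = (shape - 1) / rate   # part (c): posterior mode

# Part (d) hint: 2 * rate * theta | data ~ chi-square(2 * shape) = chi-square(80),
# so an equal-tailed interval follows directly from chi-square quantiles.
print(post_mean, post_mode)
```

Here the squared-error estimate is \(40/182.5 \approx 0.219\) and the posterior mode is \(39/182.5 \approx 0.214\).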

Problem 7

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a distribution that is \(N\left(\theta, \sigma^{2}\right)\), where \(-\infty<\theta<\infty\) and \(\sigma^{2}\) is a given positive number. Let \(Y=\bar{X}\) denote the mean of the random sample. Take the loss function to be \(\mathcal{L}[\theta, \delta(y)]=|\theta-\delta(y)|\). If \(\theta\) is an observed value of the random variable \(\Theta\) that is \(N\left(\mu, \tau^{2}\right)\), where \(\tau^{2}>0\) and \(\mu\) are known numbers, find the Bayes solution \(\delta(y)\) for a point estimate of \(\theta\).
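Under absolute-error loss the Bayes solution is the posterior median; here the posterior of \(\Theta\) given \(\bar{x}\) is normal, so the median equals the posterior mean, a precision-weighted average of \(\bar{x}\) and \(\mu\). A minimal sketch with made-up values for \(\mu\), \(\tau^2\), \(\sigma^2\), \(n\), and \(\bar{x}\), confirming the median numerically:

```python
import math

# Hypothetical values: prior N(mu, tau2), data mean ybar from N(theta, sigma2/n).
mu, tau2, sigma2, n, ybar = 0.0, 4.0, 9.0, 9, 1.5

# Posterior of Theta given ybar is normal with these moments.
post_var = (tau2 * sigma2 / n) / (tau2 + sigma2 / n)
delta = (ybar * tau2 + mu * sigma2 / n) / (tau2 + sigma2 / n)  # Bayes solution

# Sanity check: locate the posterior median by bisection on the normal CDF.
def cdf(t):
    return 0.5 * (1 + math.erf((t - delta) / math.sqrt(2 * post_var)))

lo, hi = delta - 10, delta + 10
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if cdf(mid) < 0.5 else (lo, mid)

print(delta, lo)  # bisection recovers the same value: median = mean
```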

Problem 8

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with mean \(\theta\), \(0<\theta<\infty\). Let \(Y=\sum_{1}^{n} X_{i}\). Use the loss function \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). Let \(\theta\) be an observed value of the random variable \(\Theta\). If \(\Theta\) has the prior pdf \(h(\theta)=\theta^{\alpha-1} e^{-\theta / \beta} /\left[\Gamma(\alpha) \beta^{\alpha}\right]\), for \(0<\theta<\infty\), zero elsewhere, where \(\alpha>0\) and \(\beta>0\) are known numbers, find the Bayes solution \(\delta(y)\) for a point estimate of \(\theta\).
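This is the standard Poisson-gamma conjugate pair: the posterior is gamma with shape \(y+\alpha\) and rate \(n+1/\beta\), so under squared-error loss the Bayes solution is the posterior mean. A sketch with made-up values, verified against the posterior kernel by numerical integration:

```python
import math

# Hypothetical values: n observations with sum y, and a Gamma(alpha, beta) prior.
n, alpha, beta, y = 10, 2.0, 1.0, 23

# Bayes solution under squared-error loss: posterior mean (y + alpha)/(n + 1/beta).
delta = (y + alpha) * beta / (n * beta + 1)

# Numerical check: mean of the kernel theta^(y+alpha-1) * exp(-theta*(n + 1/beta)).
shape, rate = y + alpha, n + 1 / beta
step, num, den = 0.001, 0.0, 0.0
for i in range(1, 10000):
    t = i * step
    k = t ** (shape - 1) * math.exp(-rate * t)
    num += t * k
    den += k

print(delta, num / den)
```

With these values \(\delta(y) = 25/11 \approx 2.273\), a compromise between the sample mean \(y/n = 2.3\) and the prior mean \(\alpha\beta = 2\).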

Problem 9

Let \(f(x \mid \theta), \theta \in \Omega\), be a pdf with Fisher information \(I(\theta)\); see expression (6.2.4). Consider the Bayes model $$ \begin{aligned} X \mid \theta & \sim f(x \mid \theta), \quad \theta \in \Omega \\ \Theta & \sim h(\theta) \propto \sqrt{I(\theta)} \end{aligned} $$ (a) Suppose we are interested in a parameter \(\tau=u(\theta)\). Use the chain rule to prove that $$ \sqrt{I(\tau)}=\sqrt{I(\theta)}\left|\frac{\partial \theta}{\partial \tau}\right| . $$ (b) Show that for the Bayes model (11.2.2), the prior pdf for \(\tau\) is proportional to \(\sqrt{I(\tau)}\). The class of priors given by expression (11.2.2) is often called the class of Jeffreys' priors; see Jeffreys (1961). This exercise shows that Jeffreys' priors exhibit an invariance in that the prior of a parameter \(\tau\), which is a function of \(\theta\), is also proportional to the square root of the information for \(\tau\).
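The chain-rule step in part (a) can be sketched as follows, writing \(\theta = u^{-1}(\tau)\) and differentiating the log likelihood with respect to \(\tau\):

```latex
I(\tau)
  = E\left[\left(\frac{\partial \log f(X \mid \theta)}{\partial \tau}\right)^{2}\right]
  = E\left[\left(\frac{\partial \log f(X \mid \theta)}{\partial \theta}
      \cdot \frac{\partial \theta}{\partial \tau}\right)^{2}\right]
  = \left(\frac{\partial \theta}{\partial \tau}\right)^{2} I(\theta),
```

and taking square roots gives the identity. Part (b) then follows from the change-of-variable formula: the prior of \(\tau\) is \(h(\theta(\tau))\left|\partial \theta / \partial \tau\right| \propto \sqrt{I(\theta)}\left|\partial \theta / \partial \tau\right| = \sqrt{I(\tau)}\).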

Problem 10

Consider the Bayes model \(X_{i} \mid \theta, i=1,2, \ldots, n \sim\) iid with distribution \(\Gamma(1, \theta)\), \(\theta>0\), and $$ \Theta \sim h(\theta) \propto \frac{1}{\theta}. $$ (a) Show that \(h(\theta)\) is in the class of Jeffreys' priors. (b) Show that the posterior pdf is $$ h(\theta \mid y) \propto\left(\frac{1}{\theta}\right)^{n+2-1} e^{-y / \theta}, $$ where \(y=\sum_{i=1}^{n} x_{i}\). (c) Show that if \(\tau=\theta^{-1}\), then the posterior \(k(\tau \mid y)\) is the pdf of a \(\Gamma(n, 1 / y)\) distribution. (d) Determine the posterior pdf of \(2 y \tau\). Use it to obtain a \((1-\alpha) 100 \%\) credible interval for \(\theta\). (e) Use the posterior pdf in part (d) to determine a Bayesian test for the hypotheses \(H_{0}: \theta \geq \theta_{0}\) versus \(H_{1}: \theta<\theta_{0}\), where \(\theta_{0}\) is specified.
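Parts (d) and (e) can be sketched numerically (the values of \(n\), \(y\), and \(\theta_0\) below are made up). Since \(\tau \mid y \sim \Gamma(n, 1/y)\), the quantity \(2y\tau\) is \(\chi^{2}(2n)\), and inverting its quantiles through \(\theta = 1/\tau\) gives an equal-tailed credible interval for \(\theta\):

```python
from scipy.stats import chi2

# Hypothetical data summary: n observations with sum y; level 1 - alpha.
n, y, alpha = 10, 12.5, 0.05
df = 2 * n   # 2*y*tau | y ~ chi-square(2n)

# Part (d): 2y*tau between the chi-square quantiles, so theta = 1/tau lies in
lower = 2 * y / chi2.ppf(1 - alpha / 2, df)
upper = 2 * y / chi2.ppf(alpha / 2, df)

# Part (e): since {theta >= theta0} = {2y*tau <= 2y/theta0}, the posterior
# probability of H0 is a chi-square CDF value; reject H0 when it is small.
theta0 = 2.0
post_prob_H0 = chi2.cdf(2 * y / theta0, df)

print((lower, upper), post_prob_H0)
```

Note the inversion reverses the endpoints: the upper chi-square quantile produces the lower limit for \(\theta\).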
