
Let \({X_1},\ldots,{X_n}\) be i.i.d. with the normal distribution having mean \(\mu \) and precision \(\tau \). Gibbs sampling allows one to use a prior distribution for \(\left( {\mu ,\tau } \right)\) in which \(\mu \) and \(\tau \) are independent. Let the prior distribution of \(\mu \) be the normal distribution with mean \({\mu _0}\) and precision \({\gamma _0}\), and let the prior distribution of \(\tau \) be the gamma distribution with parameters \({\alpha _0}\) and \({\beta _0}\).

a. Show that Table \(12.8\) specifies the appropriate conditional distribution for each parameter given the other.

b. Use the New Mexico nursing home data (Examples \(12.5.2\) and \(12.5.3\)). Let the prior hyperparameters be \({\alpha _0} = 2\), \({\beta _0} = 6300\), \({\mu _0} = 200\), and \({\gamma _0} = 6.35 \times {10^{ - 4}}\). Implement a Gibbs sampler to find the posterior distribution of \(\left( {\mu ,\tau } \right)\). In particular, calculate an interval containing \(95\) percent of the posterior distribution of \(\mu \).

Short Answer


The product of the likelihood and the two prior densities is proportional to the function \((1)\) below; viewed as a function of each parameter with the other held fixed, it yields the conditional distributions given in Table 12.8.

\(\begin{aligned}{l}{\tau ^{n/2}}\exp \left\{ { - \frac{\tau }{2}\left( {n{{\left( {{{\overline x }_n} - \mu } \right)}^2} + s_n^2} \right)} \right\}\exp \left\{ { - \frac{{{\gamma _0}}}{2}{{\left( {\mu - {\mu _0}} \right)}^2}} \right\}\\ \times \;{\tau ^{{\alpha _0} - 1}}\exp \left\{ { - {\beta _0}\tau } \right\}.\qquad (1)\end{aligned}\)

(a) Confirmed by examining the function proportional to the product of a constant, the prior densities, and the likelihood.

(b) \((154.1,\;216.4)\)

Step by step solution

01

(a) Definition of Mean and precision

The precision of a normal distribution is the reciprocal of its variance; the larger the precision, the more tightly the observations concentrate around the mean.

Assume that the given sample comes from a normal distribution with mean \(\mu \) and precision \(\tau \). The function of interest is proportional to the product of the prior probability density functions and the likelihood, that is, proportional to

\(\begin{aligned}{l}{\tau ^{n/2}}\exp \left\{ { - \frac{\tau }{2}\left( {n{{\left( {{{\overline x }_n} - \mu } \right)}^2} + s_n^2} \right)} \right\}\exp \left\{ { - \frac{{{\gamma _0}}}{2}{{\left( {\mu - {\mu _0}} \right)}^2}} \right\}\\ \times \;{\tau ^{{\alpha _0} - 1}}\exp \left\{ { - {\beta _0}\tau } \right\}.\qquad (1)\end{aligned}\)

Here \({\overline x _n}\) denotes the sample mean, \(s_n^2 = \sum\nolimits_{i = 1}^n {{{\left( {{x_i} - {{\overline x }_n}} \right)}^2}} \), and \({\gamma _0}\) is, as above, the precision of the prior distribution of \(\mu \).
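For reference, the three factors whose product yields \((1)\) are, up to constant factors, the likelihood and the two prior densities (this expansion is implicit in the text above):

\(\begin{aligned}{l}{\rm{likelihood:}}\;\;{\tau ^{n/2}}\exp \left\{ { - \frac{\tau }{2}\sum\nolimits_{i = 1}^n {{{\left( {{x_i} - \mu } \right)}^2}} } \right\} = {\tau ^{n/2}}\exp \left\{ { - \frac{\tau }{2}\left( {s_n^2 + n{{\left( {{{\overline x }_n} - \mu } \right)}^2}} \right)} \right\},\\{\rm{prior}}\;{\rm{of}}\;\mu :\;\;\exp \left\{ { - \frac{{{\gamma _0}}}{2}{{\left( {\mu - {\mu _0}} \right)}^2}} \right\},\\{\rm{prior}}\;{\rm{of}}\;\tau :\;\;{\tau ^{{\alpha _0} - 1}}\exp \left\{ { - {\beta _0}\tau } \right\}.\end{aligned}\)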

Viewed as a function of \(\tau \) with the other parameter held fixed, \((1)\) is proportional to a gamma probability density function with parameters

\({\alpha _0} + \frac{n}{2}\;\;{\rm{and}}\;\;{\beta _0} + \frac{1}{2}\sum\nolimits_{i = 1}^n {{{\left( {{x_i} - {{\overline x }_n}} \right)}^2}}  + \frac{n}{2}{\left( {{{\overline x }_n} - \mu } \right)^2}.\)

By completing the square, function \((1)\), viewed as a function of \(\mu \) with the other parameter held fixed, is proportional to a normal probability density function with

\(\begin{aligned}{l}{\rm{mean}} = \frac{{{\gamma _0}{\mu _0} + n\tau {{\overline x }_n}}}{{{\gamma _0} + n\tau }},\\{\rm{precision}} = {\gamma _0} + n\tau .\end{aligned}\)
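To make these two conditional distributions concrete, here is a minimal NumPy sketch of the corresponding draws. It is not part of the original solution: the function names are illustrative, `x` is assumed to be a NumPy array of the observations, and `rng` is assumed to be a `numpy.random.Generator`.

```python
import numpy as np

def draw_tau(x, mu, alpha0, beta0, rng):
    """Draw tau from its conditional: gamma with shape alpha0 + n/2
    and rate beta0 + (s_n^2 + n*(xbar - mu)^2)/2."""
    n = len(x)
    xbar = x.mean()
    rate = beta0 + 0.5 * np.sum((x - xbar) ** 2) + 0.5 * n * (xbar - mu) ** 2
    # NumPy's gamma generator is parameterized by shape and scale = 1/rate
    return rng.gamma(alpha0 + n / 2, 1.0 / rate)

def draw_mu(x, tau, mu0, gamma0, rng):
    """Draw mu from its conditional: normal with mean
    (gamma0*mu0 + n*tau*xbar)/(gamma0 + n*tau) and precision gamma0 + n*tau."""
    n = len(x)
    xbar = x.mean()
    precision = gamma0 + n * tau
    # NumPy's normal generator takes the standard deviation, i.e. precision**-0.5
    return rng.normal((gamma0 * mu0 + n * tau * xbar) / precision,
                      1.0 / np.sqrt(precision))
```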

02

(b) Gibbs sampling

Gibbs sampling is a Markov chain Monte Carlo method for sampling from a complex joint distribution; it iteratively draws each variable from its conditional distribution given the current values of the other variables.

For this part, a simulation should be used. As suggested, use the Gibbs algorithm to obtain an interval from the posterior distribution of \(\mu \). The conditional distributions are given above, so the simulation only needs initial values and a few Markov chains with an appropriate burn-in period.

The Gibbs sampling algorithm: the steps of the algorithm are as follows (a generic code sketch is given after the list).

\(\left( 1. \right)\) Pick a starting value \({x_2}^{\left( 0 \right)}\) for \({x_2}\), and let \(i = 0\).

\(\left( 2. \right)\) Let \({x_1}^{\left( {i + 1} \right)}\) be a simulated value from the conditional distribution of \({X_1}\) given that \({X_2} = {x_2}^{\left( i \right)}\).

\(\left( 3. \right)\) Let \({x_2}^{\left( {i + 1} \right)}\) be a simulated value from the conditional distribution of \({X_2}\) given that \({X_1} = {x_1}^{\left( {i + 1} \right)}\).

\(\left( 4. \right)\) Replace \(i\) by \(i + 1\) and repeat steps \(2.\) and \(3.\)
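These four steps translate directly into a loop. The following is a generic sketch in Python; the function and variable names are illustrative placeholders, not from the text.

```python
# Generic skeleton of the two-variable Gibbs sampler described in steps (1)-(4).
# draw_x1_given_x2 and draw_x2_given_x1 stand in for the two conditional
# samplers of the problem at hand.
def gibbs(draw_x1_given_x2, draw_x2_given_x1, x2_start, n_iter):
    samples = []
    x2 = x2_start                      # step (1): starting value, i = 0
    for _ in range(n_iter):
        x1 = draw_x1_given_x2(x2)      # step (2): draw x1 given x2^(i)
        x2 = draw_x2_given_x1(x1)      # step (3): draw x2 given x1^(i+1)
        samples.append((x1, x2))       # step (4): increment i and repeat
    return samples
```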

Do \(N = 100000\) simulations (10 Markov chains, each of length \(10000\)), giving \(N\) simulated values of \(\tau \) and \(\mu \). The only parameter of interest is \(\mu \), and the goal is a \(95\% \) interval for it. From the \(N = 100000\) simulated values of \(\mu \), take the \(\left( {0.025 \times N} \right)\)th and the \(\left( {0.975 \times N} \right)\)th sorted values. The interval in this case is

\(\left( {{\mu _{\left( {2500} \right)}},\;{\mu _{\left( {97500} \right)}}} \right).\)
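The whole simulation can be sketched in Python/NumPy as below. This is an illustrative sketch rather than the textbook's own code: it assumes the New Mexico nursing home observations from Examples 12.5.2 and 12.5.3 are supplied in an array `x` (the data values are not reproduced here), uses 10 chains of 10,000 retained draws each as described above, and treats the burn-in length as a free choice.

```python
import numpy as np

def gibbs_interval(x, alpha0=2.0, beta0=6300.0, mu0=200.0, gamma0=6.35e-4,
                   n_chains=10, n_iter=10_000, burn_in=1_000, seed=0):
    """Gibbs sampler for (mu, tau); returns an interval containing roughly
    95% of the simulated posterior distribution of mu."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    s2 = np.sum((x - xbar) ** 2)           # s_n^2 in the notation above
    mu_draws = []
    for _ in range(n_chains):
        mu = xbar                          # starting value; any reasonable value works
        for it in range(burn_in + n_iter):
            # draw tau from its conditional: gamma with shape alpha0 + n/2 and
            # rate beta0 + (s_n^2 + n*(xbar - mu)^2)/2 (NumPy uses scale = 1/rate)
            rate = beta0 + 0.5 * (s2 + n * (xbar - mu) ** 2)
            tau = rng.gamma(alpha0 + n / 2, 1.0 / rate)
            # draw mu from its conditional: normal with precision gamma0 + n*tau
            prec = gamma0 + n * tau
            mu = rng.normal((gamma0 * mu0 + n * tau * xbar) / prec,
                            1.0 / np.sqrt(prec))
            if it >= burn_in:              # keep only post-burn-in draws
                mu_draws.append(mu)
    mu_draws = np.sort(np.asarray(mu_draws))
    N = len(mu_draws)                      # 10 chains x 10000 draws = 100000
    # the (0.025*N)th and (0.975*N)th sorted values bound the central 95%
    return mu_draws[int(0.025 * N)], mu_draws[int(0.975 * N)]
```

With the actual observations substituted for `x`, a run of this sketch should give an interval close to the one reported below; the exact endpoints vary with the random seed and the burn-in choice.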

After sorting, the interval obtained from one simulation is \((154.1,\;216.4)\).

Hence,

(a) the claim is confirmed by examining the function proportional to the product of a constant, the prior densities, and the likelihood, and

(b) the resulting interval is \((154.1,\;216.4)\).


