
Let \({z_1},{z_2}, \ldots \) form a Markov chain, and assume that the distribution of \({z_1}\) is the stationary distribution. Show that the joint distribution of \(\left( {{z_1},{z_2}} \right)\) is the same as the joint distribution of \(\left( {{z_i},{z_{i + 1}}} \right)\) for all \(i > 1\). For convenience, you may assume that the Markov chain has finite state space, but the result holds in general.

Short Answer

Expert verified

For a Markov chain, assume that the distribution of \({z_1}\) is the stationary distribution.

The joint probability mass function of \(\left( {{z_1},{z_2}} \right)\) is

\({g_{1,2}}\left( {{z_1},{z_2}} \right) = g\left( {{z_1}} \right)h\left( {{z_2}\mid {z_1}} \right)\)

Use the fact that \({z_1}\) has the stationary distribution.

Step by step solution

01

Definition of the stationary distribution

A stationary distribution of a Markov chain is a distribution over the states that is left unchanged by one step of the transition operator: if \({z_i}\) has the stationary distribution, then so does \({z_{i + 1}}\).
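A minimal numerical illustration of this definition, using a hypothetical 3-state transition matrix (an assumption for the sketch, not taken from the text): the stationary distribution \(\pi\) satisfies \(\pi P = \pi\), so applying the transition matrix leaves it unchanged.

```python
import numpy as np

# Hypothetical transition matrix of a 3-state Markov chain (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# The stationary distribution solves pi @ P = pi with the entries summing to 1;
# find it as the left eigenvector of P associated with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Unchanged by one step of the chain:
assert np.allclose(pi @ P, pi)
```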

The distribution of \({z_1}\) is the stationary distribution. By proving that \({z_i}\) has the stationary distribution for every \(i\), it follows that \(\left( {{z_1},{z_2}} \right)\) has the same distribution as \(\left( {{z_i},{z_{i + 1}}} \right)\).

The joint probability mass function of \(\left( {{z_1},{z_2}} \right)\) is

\({g_{1,2}}\left( {{z_1},{z_2}} \right) = g\left( {{z_1}} \right)h\left( {{z_2}\mid {z_1}} \right)\)

02

Proof by simple induction

where \(g\) is the p.f. or p.d.f. of \({z_1}\),

and \(h\) is the conditional p.f. or p.d.f. of \({z_2}\) given that \({Z_1} = {z_1}\).

Because \({z_1}\) has the stationary distribution, \({z_2}\) has the stationary distribution as well.

That \({z_i}\) has the stationary distribution for all \(i\) can be proven by simple induction: the base case \(i = 1\) holds because \({z_1}\) is stationary, and if \({z_i}\) has the stationary distribution, then one step of the chain gives \({z_{i + 1}}\) the stationary distribution as well. Because \({z_i}\) has the stationary distribution, it follows that

\({g_{i,i + 1}}\left( {{z_i},{z_{i + 1}}} \right) = g\left( {{z_i}} \right)h\left( {{z_{i + 1}}\mid {z_i}} \right) = {g_{1,2}}\left( {{z_i},{z_{i + 1}}} \right)\)

for arbitrary \(i\), which was to be shown.
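The argument can be checked numerically for a finite state space. The sketch below uses a hypothetical 3-state transition matrix (the matrix is an assumption, not from the text): it computes the stationary distribution \(\pi\), forms the joint p.f. of \(\left( {{z_i},{z_{i + 1}}} \right)\) as \(g_i(a)\,h(b \mid a) = \pi(a)\,P[a,b]\) (since \(\pi P^{i-1} = \pi\)), and confirms it does not depend on \(i\).

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); any ergodic chain works.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution pi: the left eigenvector of P for eigenvalue 1,
# normalized to sum to 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()

def joint(i):
    """Joint p.f. of (z_i, z_{i+1}): marginal of z_i times the transition kernel."""
    g_i = pi @ np.linalg.matrix_power(P, i - 1)  # distribution of z_i
    return g_i[:, None] * P                      # g_i(a) * h(b | a)

# The joint of (z_1, z_2) matches the joint of (z_i, z_{i+1}) for every i.
for i in range(2, 10):
    assert np.allclose(joint(1), joint(i))
```

Because \(\pi\) is stationary, the marginal `g_i` never changes, which is exactly the induction step of the proof.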

Hence, the result follows from the fact that \({z_1}\) has the stationary distribution.


Most popular questions from this chapter

Use the data in Table 10.6 on page 640. We are interested in the bias of the sample median as an estimator of the median of the distribution.

a. Use the non-parametric bootstrap to estimate this bias.

b. How many bootstrap samples does it appear that you need in order to estimate the bias to within .05 with a probability of 0.99?
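For part (a), the non-parametric bootstrap estimate of bias is the average of \(\hat{\theta}^* - \hat{\theta}\) over resamples. A sketch, using synthetic data as a stand-in since Table 10.6 is not reproduced here (the data, seed, and sample size are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data standing in for Table 10.6 (not reproduced in this excerpt).
x = rng.exponential(scale=2.0, size=25)

# Non-parametric bootstrap: resample with replacement, recompute the median,
# and average the difference from the original sample median.
B = 10_000
boot_medians = np.array([np.median(rng.choice(x, size=x.size, replace=True))
                         for _ in range(B)])
bias_hat = boot_medians.mean() - np.median(x)
```

For part (b), the number of bootstrap samples needed follows from requiring the Monte Carlo standard error of `bias_hat` to make the estimate accurate to within 0.05 with probability 0.99.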

Let \({x_1}, \ldots ,{x_n}\) be uncorrelated, each with variance \({\sigma ^2}\). Let \({y_1}, \ldots ,{y_n}\) be positively correlated, each with variance \({\sigma ^2}\). Prove that the variance of \(\overline x \) is smaller than the variance of \(\overline y \).
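The comparison rests on expanding the variance of a mean; a sketch of the key step:

```latex
\[
\operatorname{Var}(\overline{x})
  = \frac{1}{n^2}\sum_{j=1}^{n}\operatorname{Var}(x_j)
  = \frac{\sigma^2}{n},
\qquad
\operatorname{Var}(\overline{y})
  = \frac{\sigma^2}{n}
  + \frac{2}{n^2}\sum_{j<k}\operatorname{Cov}(y_j,y_k)
  > \frac{\sigma^2}{n},
\]
```

since every covariance term in the second sum is positive by assumption.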

Let \(U\) have the uniform distribution on the interval \((0,1)\). Show that the random variable \(W\) defined in Eq. (12.4.6) has the p.d.f. \(h\) defined in Eq. (12.4.5).

Use the data consisting of 30 lactic acid concentrations in cheese, 10 from Example 8.5.4 and 20 from Exercise 16 in Sec. 8.6. Fit the same model used in Example 8.6.2 with the same prior distribution, but this time use the Gibbs sampling algorithm in Example 12.5.1. Simulate 10,000 pairs of \(\left( {\mu ,\tau } \right)\) parameters. Estimate the posterior mean of \({\left( {\sqrt \tau  \mu } \right)^{ - 1}}\), and compute the standard simulation error of the estimator.
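A minimal sketch of a Gibbs sampler of this shape for a normal model with a normal-gamma prior, alternating draws from the two full conditionals. The data, seed, and hyperparameters below are placeholders, not the textbook's values: Example 8.6.2 specifies its own prior, and the 30 cheese concentrations are not reproduced here. The standard simulation error shown is the naive one, which ignores autocorrelation in the chain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the 30 lactic acid concentrations (data not reproduced).
x = rng.normal(loc=1.4, scale=0.3, size=30)
n, xbar = x.size, x.mean()

# Hypothetical normal-gamma prior hyperparameters (placeholders).
mu0, lam0, a0, b0 = 1.0, 1.0, 0.5, 0.5

# Gibbs sampler: alternate draws from the two full conditionals.
samples = np.empty((10_000, 2))
tau = 1.0
for t in range(samples.shape[0]):
    # mu | tau, x ~ Normal(posterior mean, variance 1 / ((lam0 + n) * tau))
    m = (lam0 * mu0 + n * xbar) / (lam0 + n)
    mu = rng.normal(m, 1.0 / np.sqrt((lam0 + n) * tau))
    # tau | mu, x ~ Gamma(a0 + (n + 1) / 2, rate b0 + (weighted sum of squares) / 2)
    rate = b0 + 0.5 * (np.sum((x - mu) ** 2) + lam0 * (mu - mu0) ** 2)
    tau = rng.gamma(a0 + (n + 1) / 2, 1.0 / rate)  # numpy uses scale = 1 / rate
    samples[t] = mu, tau

mu_s, tau_s = samples[:, 0], samples[:, 1]
g = 1.0 / (np.sqrt(tau_s) * mu_s)           # the quantity (sqrt(tau) * mu)^(-1)
est = g.mean()
sim_se = g.std(ddof=1) / np.sqrt(g.size)    # naive standard simulation error
```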

