
Let \(U\) have the uniform distribution on the interval \((0,1)\). Show that the random variable \(W\) defined in Eq. (12.4.6) has the p.d.f. \(h\) defined in Eq. (12.4.5).

Short Answer

Expert verified

This is true because the p.d.f. of a random variable with the uniform distribution on \((0,1)\) is equal to \(1\).

Step by step solution

01

Definition for importance sampling

  • Many integrals can be advantageously rewritten as expectations of functions of random variables.
  • If we can simulate a large number of random variables with the appropriate distributions, we can estimate integrals that would otherwise be impossible to compute in closed form.
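As a concrete illustration of this idea (a minimal sketch, not part of the textbook; the integrand \(e^x\) and the sample size are arbitrary choices), the integral \(\int_0^1 e^x \, dx = e - 1\) can be estimated as the sample mean of \(e^U\) over simulated uniform variables \(U\):

```python
# Illustrative Monte Carlo sketch: estimate the integral of exp(x) over (0,1)
# by rewriting it as E[exp(U)] with U ~ Uniform(0,1) and averaging simulations.
import math
import random

random.seed(0)  # fixed seed so the run is reproducible
n = 100_000
samples = [math.exp(random.random()) for _ in range(n)]
estimate = sum(samples) / n
print(estimate)  # close to e - 1 ≈ 1.71828
```

With \(n = 100{,}000\) draws the simulation standard error is roughly \(0.0016\), so the estimate typically agrees with \(e - 1\) to two decimal places.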
02

Determine the inverse function and its derivative 

Random variable \(W\) is defined as

\(W = \mu_2 + \sigma_2 \Phi^{-1}\left( U \Phi\left( \frac{c_2 - \mu_2}{\sigma_2} \right) \right),\)

and the function \(h\) is defined as

\(h\left( x_2 \right) = \frac{\left( 2\pi\sigma_2^2 \right)^{-1/2} \exp\left( -\left( x_2 - \mu_2 \right)^2 / \left( 2\sigma_2^2 \right) \right)}{\Phi\left( \left( c_2 - \mu_2 \right)/\sigma_2 \right)}, \;\;\; -\infty < x_2 \le c_2.\)

From

\(w = {\mu _2} + {\sigma _2}{\Phi ^{ - 1}}\left( {u\Phi \left( {\frac{{{c_2} - {\mu _2}}}{{{\sigma _2}}}} \right)} \right)\)

it follows that

\(u = \frac{\Phi\left( \left( w - \mu_2 \right)/\sigma_2 \right)}{\Phi\left( \left( c_2 - \mu_2 \right)/\sigma_2 \right)}\)

is the inverse transformation.

The derivative of \(\Phi\) is the p.d.f. of the standard normal distribution; hence, by the chain rule, the derivative of the inverse transformation is

\(\frac{\partial u}{\partial w} = \frac{1}{\Phi\left( \left( c_2 - \mu_2 \right)/\sigma_2 \right)} \cdot \left( 2\pi\sigma_2^2 \right)^{-1/2} \exp\left( -\frac{\left( w - \mu_2 \right)^2}{2\sigma_2^2} \right) = h\left( w \right),\)

which is the p.d.f. of \(W\): by the change-of-variables formula, the p.d.f. of \(W\) equals the p.d.f. of \(U\) evaluated at the inverse transformation times \(\left| \partial u / \partial w \right|\), and the p.d.f. of a random variable with the uniform distribution on \((0,1)\) is equal to \(1\).
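The derivation can also be checked numerically (an illustrative sketch, not from the textbook; the parameter values \(\mu_2 = 0\), \(\sigma_2 = 1\), \(c_2 = 1\) are arbitrary): simulating \(W = \mu_2 + \sigma_2 \Phi^{-1}\left( U \Phi\left( \left( c_2 - \mu_2 \right)/\sigma_2 \right) \right)\) should produce draws supported on \(\left( -\infty, c_2 \right]\) whose empirical c.d.f. matches the truncated-normal c.d.f. \(\Phi\left( \left( w - \mu_2 \right)/\sigma_2 \right) / \Phi\left( \left( c_2 - \mu_2 \right)/\sigma_2 \right)\).

```python
# Sketch: simulate W = mu2 + sigma2 * Phi^{-1}(U * Phi((c2 - mu2)/sigma2))
# and compare its empirical c.d.f. with the truncated-normal c.d.f. that h integrates to.
import random
from statistics import NormalDist

random.seed(1)
std = NormalDist()                    # standard normal: cdf = Phi, inv_cdf = Phi^{-1}
mu2, sigma2, c2 = 0.0, 1.0, 1.0       # arbitrary illustrative parameters
alpha = std.cdf((c2 - mu2) / sigma2)  # Phi((c2 - mu2)/sigma2)

n = 50_000
# guard with max(..., 1e-12): Phi^{-1}(0) is undefined
w = [mu2 + sigma2 * std.inv_cdf(max(random.random(), 1e-12) * alpha)
     for _ in range(n)]

print(max(w) <= c2)                   # W never exceeds c2

# empirical c.d.f. at w0 vs. the truncated-normal c.d.f. implied by h
w0 = 0.0
emp = sum(x <= w0 for x in w) / n
theo = std.cdf((w0 - mu2) / sigma2) / alpha
print(round(emp, 3), round(theo, 3))  # the two values should agree closely
```

`statistics.NormalDist` (Python 3.8+) supplies \(\Phi\) and \(\Phi^{-1}\) without external dependencies; with \(n = 50{,}000\) draws the empirical and theoretical c.d.f. values typically agree to about two decimal places.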


