
The method of antithetic variates is a technique for reducing the variance of simulation estimators. Antithetic variates are negatively correlated random variables that share the same mean and variance. The variance of the average of two antithetic variates is smaller than the variance of the average of two i.i.d. variables. In this exercise, we shall see how to use antithetic variates for importance sampling, but the method is very general. Suppose that we wish to compute \(\int g\left( x \right)\,dx\), and we wish to use the importance function f. Suppose that we generate pseudo-random variables with the p.d.f. f using the probability integral transformation. For \(i = 1,2,\ldots,\nu\), let \({X^{\left( i \right)}} = {F^{ - 1}}\left( {{U^{\left( i \right)}}} \right)\), where \({U^{\left( i \right)}}\) has the uniform distribution on the interval (0, 1) and F is the c.d.f. corresponding to the p.d.f. f. For each \(i = 1,2,\ldots,\nu\), define

\(\begin{aligned}{T^{\left( i \right)}} &= {F^{ - 1}}\left( {1 - {U^{\left( i \right)}}} \right),\\{W^{\left( i \right)}} &= \frac{{g\left( {{X^{\left( i \right)}}} \right)}}{{f\left( {{X^{\left( i \right)}}} \right)}},\\{V^{\left( i \right)}} &= \frac{{g\left( {{T^{\left( i \right)}}} \right)}}{{f\left( {{T^{\left( i \right)}}} \right)}},\\{Y^{\left( i \right)}} &= 0.5\left( {{W^{\left( i \right)}} + {V^{\left( i \right)}}} \right).\end{aligned}\)

Our estimator of \(\int g\left( x \right)\,dx\) is then \(Z = \frac{1}{\nu }\sum\nolimits_{i = 1}^\nu {{Y^{\left( i \right)}}}. \)

a. Prove that \({T^{\left( i \right)}}\) has the same distribution as \({X^{\left( i \right)}}\).

b. Prove that \(E\left( Z \right) = \int g\left( x \right)\,dx\).

c. If \(g\left( x \right)/f\left( x \right)\) is a monotone function, explain why we expect \({W^{\left( i \right)}}\) and \({V^{\left( i \right)}}\) to be negatively correlated.

d. If \({W^{\left( i \right)}}\) and \({V^{\left( i \right)}}\) are negatively correlated, show that Var(Z) is less than the variance one would get with \(2\nu \) simulations without antithetic variates.
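Though not part of the exercise, the whole scheme can be sketched numerically. For illustration (these choices are assumptions, not from the exercise), take the importance p.d.f. to be the exponential(1) density \(f(x) = e^{-x}\) on \((0, \infty)\), so \(F^{-1}(u) = -\log(1-u)\), and take \(g(x) = x e^{-x}\), whose integral over \((0, \infty)\) is exactly 1:

```python
import numpy as np

rng = np.random.default_rng(0)
nu = 100_000

# Illustrative choices (not from the exercise): importance p.d.f.
# f(x) = e^{-x} on (0, inf), so F^{-1}(u) = -log(1 - u), and
# g(x) = x * e^{-x}, whose integral over (0, inf) is exactly 1.
u = rng.uniform(size=nu)
x = -np.log(1 - u)          # X^(i) = F^{-1}(U^(i))
t = -np.log(u)              # T^(i) = F^{-1}(1 - U^(i))

w = x                       # W^(i) = g(X)/f(X) = X for these choices
v = t                       # V^(i) = g(T)/f(T) = T
y = 0.5 * (w + v)           # antithetic average Y^(i)
z = y.mean()                # estimator Z of the integral (true value 1)
print(z)
```

The estimate lands very close to 1, and, as parts (c) and (d) explain, with noticeably smaller variance than \(2\nu\) independent draws would give.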

Short Answer

  1. \({F^{ - 1}}\) is a monotone increasing function.
  2. Use the equality in distribution of \({X^{\left( i \right)}}\) and \({T^{\left( i \right)}}\):

\(E\left( Z \right)\mathop = \limits^{\left( 1 \right)} \frac{1}{2} \cdot 2E\left( {{W^{\left( i \right)}}} \right) = \int g\left( x \right)\,dx\)

  3. See how \({X^{\left( i \right)}}\) and \({T^{\left( i \right)}}\) behave together.
  4. Compare the two variances, using \( - 1 \le \rho \le 1\).

Step by step solution

01

(a) Uniform distribution

This follows directly from the fact that \({U^{\left( i \right)}}\) and \(1 - {U^{\left( i \right)}}\) both have the uniform distribution on (0, 1). Since the same function \({F^{ - 1}}\) is applied in both cases, \({T^{\left( i \right)}} = {F^{ - 1}}\left( {1 - {U^{\left( i \right)}}} \right)\) has the same distribution as \({X^{\left( i \right)}} = {F^{ - 1}}\left( {{U^{\left( i \right)}}} \right)\).

\({F^{ - 1}}\) is a monotone increasing function.
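This equality in distribution can be illustrated numerically (assuming, for concreteness, the exponential(1) c.d.f. with \(F^{-1}(u) = -\log(1-u)\)): the empirical quantiles of the two samples should nearly coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(size=500_000)

# Illustrative example: exponential(1), F^{-1}(u) = -log(1 - u).
x = -np.log(1 - u)   # X = F^{-1}(U)
t = -np.log(u)       # T = F^{-1}(1 - U)

# If X and T have the same distribution, their empirical deciles
# should agree up to simulation noise.
qs = np.linspace(0.1, 0.9, 9)
max_gap = np.max(np.abs(np.quantile(x, qs) - np.quantile(t, qs)))
print(max_gap)
```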

02

(b) Find the value of \(E\left( Z \right)\)

It follows from the importance sampling method that

\(\begin{aligned}E\left( Z \right) &= E\left( {\frac{1}{\nu }\sum\limits_{i = 1}^\nu {{Y^{\left( i \right)}}} } \right) = 0.5\frac{1}{\nu } \cdot \nu E\left( {{W^{\left( i \right)}}} \right) + 0.5\frac{1}{\nu } \cdot \nu E\left( {{V^{\left( i \right)}}} \right)\\ &\mathop = \limits^{\left( 1 \right)} \frac{1}{2} \cdot 2E\left( {{W^{\left( i \right)}}} \right) = \int g\left( x \right)\,dx\end{aligned}\)

(1): from (a) \({X^{\left( i \right)}}\) and \({T^{\left( i \right)}}\) have the same distribution, which implies that \({W^{\left( i \right)}}\) and \({V^{\left( i \right)}}\) have the same distribution.

03

(c) Definition of \({{\bf{X}}^{\left( {\bf{i}} \right)}}\) and \({{\bf{T}}^{\left( {\bf{i}} \right)}}\)

From the definitions of \({X^{\left( i \right)}}\) and \({T^{\left( i \right)}}\): because \({F^{ - 1}}\) is a monotone increasing function, and \(1 - {U^{\left( i \right)}}\) decreases whenever \({U^{\left( i \right)}}\) increases, \({T^{\left( i \right)}}\) decreases whenever \({X^{\left( i \right)}}\) increases, and vice versa.

By the assumption that \(g\left( x \right)/f\left( x \right)\) is a monotone function, the relation between \({W^{\left( i \right)}}\) and \({V^{\left( i \right)}}\) is the same as the relation between \({X^{\left( i \right)}}\) and \({T^{\left( i \right)}}\): when one increases, the other decreases, and vice versa. This indicates that the random variables should be negatively correlated.
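A quick simulation makes the negative correlation visible. With the illustrative choices \(g(x) = x e^{-x}\) and \(f(x) = e^{-x}\) (assumptions for this sketch), the ratio \(g/f = x\) is monotone, so \(W\) and \(V\) are simply the antithetic exponential draws themselves:

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.uniform(size=200_000)

# Illustrative: g(x) = x e^{-x}, f(x) = e^{-x}, so g/f = x is monotone
# and W, V reduce to the antithetic exponential draws.
w = -np.log(1 - u)   # W^(i) = g(X)/f(X) = X
v = -np.log(u)       # V^(i) = g(T)/f(T) = T
rho = np.corrcoef(w, v)[0, 1]
print(rho)           # clearly negative
```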

04

(d) Find the value of \(Var\left( Z \right)\)

Compare the following two variances. Write \(\rho \) for the correlation of \({W^{\left( i \right)}}\) and \({V^{\left( i \right)}}\); since they have the same distribution, their variances are equal, so \(Cov\left( {{W^{\left( i \right)}},{V^{\left( i \right)}}} \right) = \rho \,Var\left( {{W^{\left( i \right)}}} \right)\). The first variance is

\(\begin{aligned}Var\left( Z \right) &= Var\left( {\frac{1}{\nu }\sum\limits_{i = 1}^\nu {{Y^{\left( i \right)}}} } \right) = \frac{1}{{{\nu ^2}}} \cdot \nu \,{0.5^2}\,Var\left( {{W^{\left( i \right)}} + {V^{\left( i \right)}}} \right)\\ &= \frac{{0.25}}{\nu }\left( {Var\left( {{W^{\left( i \right)}}} \right) + Var\left( {{V^{\left( i \right)}}} \right) + 2\rho \,Var\left( {{W^{\left( i \right)}}} \right)} \right)\\ &= \frac{1}{\nu } \cdot 0.5\left( {1 + \rho } \right)Var\left( {{W^{\left( i \right)}}} \right)\end{aligned}\)

The second variance to compare, from \(2\nu \) i.i.d. simulations without antithetic variates, is

\(Var\left( {\frac{1}{{2\nu }}\sum\limits_{i = 1}^{2\nu } {{W^{\left( i \right)}}} } \right) = \frac{1}{{2\nu }}\,Var\left( {{W^{\left( i \right)}}} \right).\)

The following is true:

\(\frac{1}{\nu } \cdot 0.5\left( {1 + \rho } \right)Var\left( {{W^{\left( i \right)}}} \right) < \frac{1}{{2\nu }}\,Var\left( {{W^{\left( i \right)}}} \right)\)

if and only if

\(1 + \rho < 1,\)

that is, if and only if \(\rho < 0\), which holds by the assumption that \({W^{\left( i \right)}}\) and \({V^{\left( i \right)}}\) are negatively correlated.
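This variance reduction can also be checked by simulation. Under the same illustrative assumptions as before (\(g(x) = x e^{-x}\), \(f(x) = e^{-x}\), true integral 1), we repeat both estimators many times and compare their empirical variances:

```python
import numpy as np

rng = np.random.default_rng(3)
nu, reps = 1_000, 2_000

# Illustrative setup: g(x) = x e^{-x}, f(x) = e^{-x}, so W = X.
def antithetic(nu):
    # nu antithetic pairs: Y = 0.5 * (W + V)
    u = rng.uniform(size=nu)
    return np.mean(0.5 * (-np.log(1 - u) - np.log(u)))

def plain(nu):
    # 2*nu i.i.d. draws, no antithetic variates
    u = rng.uniform(size=2 * nu)
    return np.mean(-np.log(1 - u))

var_anti = np.var([antithetic(nu) for _ in range(reps)])
var_plain = np.var([plain(nu) for _ in range(reps)])
print(var_anti < var_plain)   # antithetic variance is smaller
```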



