
Let \(x_1, \ldots, x_n\) be the observed values of a random sample \(X = \left( X_1, \ldots, X_n \right)\). Let \(F_n\) be the sample c.d.f. Let \(j_1, \ldots, j_n\) be a random sample drawn with replacement from the numbers \(\left\{ 1, \ldots, n \right\}\). Define \(X_i^* = x_{j_i}\) for \(i = 1, \ldots, n\). Show that \(X^* = \left( X_1^*, \ldots, X_n^* \right)\) is an i.i.d. sample from the distribution \(F_n\).

Short Answer

Expert verified

Because the indices \(j_1, \ldots, j_n\) are drawn independently and uniformly from \(\left\{ 1, \ldots, n \right\}\), the joint probability factorizes:

\(\Pr\left( X_1^* = x_{i_1}, X_2^* = x_{i_2}, \ldots, X_n^* = x_{i_n} \right) = \prod_{j=1}^{n} \Pr\left( X_j^* = x_{i_j} \right)\)

so the coordinates of \(X^*\) are independent, and each \(X_j^*\) has distribution \(F_n\).

Step by step solution

01

Definition of a random variable

A random variable is a variable with an unknown value, or a function that assigns a value to each outcome of an experiment.

To prove that the sample \(X^*\) is i.i.d. from \(F_n\), we show that

\(\Pr\left( X_1^* = x_{i_1}, X_2^* = x_{i_2}, \ldots, X_n^* = x_{i_n} \right) = \prod_{j=1}^{n} \Pr\left( X_j^* = x_{i_j} \right)\)

for all \(i_1, i_2, \ldots, i_n \in \left\{ 1, 2, \ldots, n \right\}\), where \(x_i\), \(i = 1, 2, \ldots, n\), are the observed values of the sample with c.d.f. \(F_n\). The random variable \(X_j^*\) takes the value \(x_{i_j}\) exactly when the index \(j_j\) takes the value \(i_j\) (treating the observed values as distinct), and \(j_1, j_2, \ldots, j_n\) is a random sample drawn with replacement, so the indices are independent and uniformly distributed on \(\left\{ 1, \ldots, n \right\}\).
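Before checking independence, it is worth confirming that each coordinate individually has distribution \(F_n\): since \(j_i\) is uniform on \(\left\{ 1, \ldots, n \right\}\), for any real \(t\),

\(\Pr\left( X_i^* \le t \right) = \sum_{k=1}^{n} \Pr\left( j_i = k \right)\mathbf{1}\left[ x_k \le t \right] = \frac{1}{n}\sum_{k=1}^{n} \mathbf{1}\left[ x_k \le t \right] = F_n(t),\)

which is exactly the sample c.d.f. evaluated at \(t\).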

02

The index random variables are independent

Because the indices \(j_1, \ldots, j_n\) are independent, it follows directly that

\(\Pr\left( X_1^* = x_{i_1}, X_2^* = x_{i_2}, \ldots, X_n^* = x_{i_n} \right)\)

\( = \Pr\left( j_1 = i_1, j_2 = i_2, \ldots, j_n = i_n \right)\)

\( = \Pr\left( j_1 = i_1 \right)\Pr\left( j_2 = i_2 \right)\cdots\Pr\left( j_n = i_n \right)\)

\( = \prod_{j=1}^{n} \Pr\left( X_j^* = x_{i_j} \right)\)

Hence,

\(\Pr\left( X_1^* = x_{i_1}, X_2^* = x_{i_2}, \ldots, X_n^* = x_{i_n} \right) = \prod_{j=1}^{n} \Pr\left( X_j^* = x_{i_j} \right),\)

so \(X_1^*, \ldots, X_n^*\) are independent. Since \(\Pr\left( X_j^* = x_{i_j} \right) = \Pr\left( j_j = i_j \right) = 1/n\) for every \(i_j\), each \(X_j^*\) has distribution \(F_n\), and \(X^*\) is an i.i.d. sample from \(F_n\).
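The resampling construction above can be checked numerically. The sketch below is illustrative only (the data values in `x` are hypothetical): it draws the indices \(j_1, \ldots, j_n\) uniformly with replacement using NumPy, forms the bootstrap sample \(X_i^* = x_{j_i}\), and verifies that a single bootstrap draw follows the sample c.d.f. \(F_n\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed sample x_1, ..., x_n (hypothetical data for illustration).
x = np.array([2.1, 3.5, 3.5, 4.0, 5.2])
n = len(x)

# Draw j_1, ..., j_n i.i.d. uniform on {0, ..., n-1}, i.e. with replacement.
j = rng.integers(0, n, size=n)

# Bootstrap sample: X*_i = x_{j_i}.
x_star = x[j]

# Every bootstrap value is one of the original observations,
# so x* can only take values in the support of F_n.
assert set(x_star) <= set(x)

# The sample c.d.f. F_n(t) = (1/n) * #{k : x_k <= t}.
def F_n(t):
    return np.mean(x <= t)

# Since P(j_i = k) = 1/n, we have P(X*_i <= t) = F_n(t).
# Check this empirically with many draws of a single X*.
draws = x[rng.integers(0, n, size=100_000)]
t = 3.5
assert abs(np.mean(draws <= t) - F_n(t)) < 0.01
```

The same idea underlies the bootstrap in practice: sampling rows of a data set with replacement is exactly sampling i.i.d. from the empirical distribution \(F_n\).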


