
Suppose that X and Y are independent random variables, X has the gamma distribution with parameters α1 and β, and Y has the gamma distribution with parameters α2 and β. Let U = X/ (X + Y) and V = X + Y. Show that (a) U has the beta distribution with parameters α1 and α2, and (b) U and V are independent.

Short Answer

Expert verified

(a) U has the beta distribution with parameters α1 and α2, and

(b) U and V are independent

Step by step solution

01

Given information

X and Y are independent random variables, X has the gamma distribution with parameters α1 and β, and Y has the gamma distribution with parameters α2 and β.

Let U = X/(X + Y) and V = X + Y. We need to show that

(a) U has the beta distribution with parameters \({\alpha _1}\) and \({\alpha _2}\), and

(b) U and V are independent.

02

Proof of part (a)

The marginal probability density functions of the random variables X and Y are

\(f\left( x \right) = \frac{{{\beta ^{{\alpha _1}}}}}{{\Gamma \left( {{\alpha _1}} \right)}}{x^{{\alpha _1} - 1}}{e^{ - \beta x}}\) and \(f\left( y \right) = \frac{{{\beta ^{{\alpha _2}}}}}{{\Gamma \left( {{\alpha _2}} \right)}}{y^{{\alpha _2} - 1}}{e^{ - \beta y}}\)

For x > 0 and y > 0, the joint probability density function is

\(\begin{aligned}f\left( {x,y} \right) &= \frac{{{\beta ^{{\alpha _1}}}}}{{\Gamma \left( {{\alpha _1}} \right)}}{x^{{\alpha _1} - 1}}{e^{ - \beta x}} \times \frac{{{\beta ^{{\alpha _2}}}}}{{\Gamma \left( {{\alpha _2}} \right)}}{y^{{\alpha _2} - 1}}{e^{ - \beta y}}\\ &= \frac{{{\beta ^{{\alpha _1} + {\alpha _2}}}}}{{\Gamma \left( {{\alpha _1}} \right)\Gamma \left( {{\alpha _2}} \right)}}{x^{{\alpha _1} - 1}}{y^{{\alpha _2} - 1}}{e^{ - \beta \left( {x + y} \right)}}\end{aligned}\)

Consider the transformation

\(\begin{aligned}V &= X + Y\\U &= \frac{X}{{X + Y}} = \frac{X}{V}\\ \Rightarrow X &= UV\\ \Rightarrow Y &= V - X = V - UV = \left( {1 - U} \right)V\end{aligned}\)

The Jacobian of the transformation is

\(\left| J \right| = \left| {\det \left( {\begin{array}{*{20}{c}}{\frac{{\partial x}}{{\partial u}}}&{\frac{{\partial x}}{{\partial v}}}\\{\frac{{\partial y}}{{\partial u}}}&{\frac{{\partial y}}{{\partial v}}}\end{array}} \right)} \right| = \left| {\det \left( {\begin{array}{*{20}{c}}v&u\\{ - v}&{1 - u}\end{array}} \right)} \right| = v\left( {1 - u} \right) + uv = v\)
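The determinant computation can be sanity-checked numerically. The sketch below (pure Python; the evaluation points are arbitrary choices for illustration) approximates the four partial derivatives of the map (u, v) → (x, y) = (uv, (1 − u)v) by central differences and confirms that the determinant equals v:

```python
# Finite-difference check that |J| = v for the map
# (u, v) -> (x, y) = (uv, (1 - u)v).
def x_of(u, v):
    return u * v

def y_of(u, v):
    return (1.0 - u) * v

def jacobian_det(u, v, h=1e-6):
    # Central differences for the four partial derivatives.
    dx_du = (x_of(u + h, v) - x_of(u - h, v)) / (2 * h)
    dx_dv = (x_of(u, v + h) - x_of(u, v - h)) / (2 * h)
    dy_du = (y_of(u + h, v) - y_of(u - h, v)) / (2 * h)
    dy_dv = (y_of(u, v + h) - y_of(u, v - h)) / (2 * h)
    return dx_du * dy_dv - dx_dv * dy_du

print(jacobian_det(0.3, 2.0))  # close to v = 2.0
print(jacobian_det(0.7, 5.0))  # close to v = 5.0
```

Because the map is bilinear in (u, v), the central differences are exact up to rounding, so the agreement with v is essentially to machine precision.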

Therefore, the joint pdf of U and V is

\(\begin{aligned}f\left( {u,v} \right) &= f\left[ {uv,\left( {1 - u} \right)v} \right]\left| J \right|\\ &= \frac{{{\beta ^{{\alpha _1} + {\alpha _2}}}}}{{\Gamma \left( {{\alpha _1}} \right)\Gamma \left( {{\alpha _2}} \right)}}{\left( {uv} \right)^{{\alpha _1} - 1}}{\left( {\left( {1 - u} \right)v} \right)^{{\alpha _2} - 1}}{e^{ - \beta \left( {uv + \left( {1 - u} \right)v} \right)}}v\\ &= \frac{{\Gamma \left( {{\alpha _1} + {\alpha _2}} \right)}}{{\Gamma \left( {{\alpha _1}} \right)\Gamma \left( {{\alpha _2}} \right)}}{u^{{\alpha _1} - 1}}{\left( {1 - u} \right)^{{\alpha _2} - 1}} \cdot \frac{{{\beta ^{{\alpha _1} + {\alpha _2}}}}}{{\Gamma \left( {{\alpha _1} + {\alpha _2}} \right)}}{v^{{\alpha _1} + {\alpha _2} - 1}}{e^{ - \beta v}}\end{aligned}\)

This joint density is the product of the beta pdf with parameters \({\alpha _1}\) and \({\alpha _2}\) (a function of u alone) and the gamma pdf with parameters \({\alpha _1} + {\alpha _2}\) and \(\beta\) (a function of v alone).

Thus, U has the beta distribution with parameters \({\alpha _1}\) and \({\alpha _2}\).
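The conclusion of part (a) can also be checked by simulation. The sketch below (standard library only; the parameter values are arbitrary choices) draws gamma variates and compares the sample moments of U = X/(X + Y) against the Beta(α1, α2) moments. Note that `random.gammavariate` takes a shape and a *scale* argument, so the rate β enters as 1/β:

```python
import random

# Monte Carlo check that U = X/(X+Y) ~ Beta(a1, a2) when
# X ~ Gamma(a1, rate=beta) and Y ~ Gamma(a2, rate=beta).
random.seed(0)
a1, a2, beta = 2.0, 3.0, 1.5
n = 200_000
u = []
for _ in range(n):
    x = random.gammavariate(a1, 1.0 / beta)  # scale = 1/rate
    y = random.gammavariate(a2, 1.0 / beta)
    u.append(x / (x + y))

mean_u = sum(u) / n
var_u = sum((t - mean_u) ** 2 for t in u) / n

# Beta(a1, a2): mean a1/(a1+a2), variance a1*a2/((a1+a2)^2 (a1+a2+1)).
print(mean_u)  # close to 2/5 = 0.4
print(var_u)   # close to 6/(25*6) = 0.04
```

Since U is a scale-free ratio, the value of β cannot affect its distribution, which is consistent with Beta(α1, α2) not involving β.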

03

Proof of part (b)

From part (a), we get

\(f\left( {u,v} \right) = \frac{{\Gamma \left( {{\alpha _1} + {\alpha _2}} \right)}}{{\Gamma \left( {{\alpha _1}} \right)\Gamma \left( {{\alpha _2}} \right)}}{u^{{\alpha _1} - 1}}{\left( {1 - u} \right)^{{\alpha _2} - 1}} \cdot \frac{{{\beta ^{{\alpha _1} + {\alpha _2}}}}}{{\Gamma \left( {{\alpha _1} + {\alpha _2}} \right)}}{v^{{\alpha _1} + {\alpha _2} - 1}}{e^{ - \beta v}} = f\left( u \right)f\left( v \right)\)

Where,

\(\begin{aligned}f\left( u \right) &= \frac{{\Gamma \left( {{\alpha _1} + {\alpha _2}} \right)}}{{\Gamma \left( {{\alpha _1}} \right)\Gamma \left( {{\alpha _2}} \right)}}{u^{{\alpha _1} - 1}}{\left( {1 - u} \right)^{{\alpha _2} - 1}}\quad {\rm{and}}\\f\left( v \right) &= \frac{{{\beta ^{{\alpha _1} + {\alpha _2}}}}}{{\Gamma \left( {{\alpha _1} + {\alpha _2}} \right)}}{v^{{\alpha _1} + {\alpha _2} - 1}}{e^{ - \beta v}}\end{aligned}\)

The joint probability density function factors as the product of the pdf of U and the pdf of V for 0 < u < 1 and v > 0. Thus, U and V are independent.
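The factorization can be verified pointwise with `math.gamma`. The sketch below (parameter values and the evaluation point are arbitrary choices for illustration) evaluates the joint gamma density of (X, Y) at (uv, (1 − u)v), multiplies by the Jacobian v, and compares the result to the product of the beta pdf of U and the gamma pdf of V:

```python
import math

# Pointwise check of the factorization f(u, v) = f(u) * f(v).
a1, a2, beta = 2.5, 1.5, 0.8

def gamma_pdf(t, shape, rate):
    return rate**shape / math.gamma(shape) * t**(shape - 1) * math.exp(-rate * t)

def beta_pdf(t, a, b):
    return (math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
            * t**(a - 1) * (1 - t)**(b - 1))

u, v = 0.4, 3.0
x, y = u * v, (1 - u) * v
joint = gamma_pdf(x, a1, beta) * gamma_pdf(y, a2, beta) * v   # f(x, y) |J|
product = beta_pdf(u, a1, a2) * gamma_pdf(v, a1 + a2, beta)   # f(u) f(v)
print(joint, product)  # the two values agree
```

The two quantities agree to machine precision at any interior point, mirroring the algebraic identity derived above.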

