Suppose that \(X\) has the beta distribution with parameters \(\alpha\) and \(\beta\), and let \(r\) and \(s\) be given positive integers. Determine the value of \(E\left[X^r(1-X)^s\right]\).

Short Answer

Expert verified

The value of \(E\left[X^r(1-X)^s\right]\) is \(\frac{\left[\alpha(\alpha+1)\cdots(\alpha+r-1)\right]\left[\beta(\beta+1)\cdots(\beta+s-1)\right]}{(\alpha+\beta)(\alpha+\beta+1)\cdots(\alpha+\beta+r+s-1)}\).

Step by step solution

01

Given information

Let \(X\) be a random variable that follows the beta distribution with parameters \(\alpha\) and \(\beta\), and let \(r\) and \(s\) be given positive integers.

02

Computing the expectation

The p.d.f. of the beta distribution with parameters \(\alpha, \beta > 0\) is:

\(f\left(x \mid \alpha, \beta\right) = \begin{cases}\dfrac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}\,x^{\alpha-1}(1-x)^{\beta-1} & \text{for } 0 < x < 1\\[6pt] 0 & \text{otherwise}\end{cases}\)
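As a quick sanity check of the density above, it should integrate to 1 over \((0, 1)\). A minimal sketch using only the standard library (the parameter values \(\alpha = 2\), \(\beta = 3\) are illustrative choices, not from the text):

```python
import math

def beta_pdf(x, a, b):
    # Beta(a, b) density: Gamma(a+b)/(Gamma(a)*Gamma(b)) * x^(a-1) * (1-x)^(b-1)
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * x**(a - 1) * (1 - x)**(b - 1)

def simpson(f, lo, hi, n=1000):
    # Composite Simpson's rule on [lo, hi]; n must be even.
    h = (hi - lo) / n
    total = f(lo) + f(hi)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(lo + i * h)
    return total * h / 3

a, b = 2.0, 3.0  # illustrative parameters with a, b >= 1 so the density is finite at 0 and 1
print(simpson(lambda x: beta_pdf(x, a, b), 0.0, 1.0))  # ≈ 1.0
```

For \(\alpha = 2\), \(\beta = 3\) the density is the cubic \(12x(1-x)^2\), which Simpson's rule integrates exactly, so the quadrature recovers 1 to floating-point precision.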

Therefore

\(E\left[X^r(1-X)^s\right] = \int_0^1 x^r(1-x)^s\,\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}\,x^{\alpha-1}(1-x)^{\beta-1}\,dx\)

The remaining integral is the normalizing constant of the beta distribution with parameters \(\alpha + r\) and \(\beta + s\), so

\(\begin{aligned}E\left[X^r(1-X)^s\right] &= \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}\int_0^1 x^{\alpha+r-1}(1-x)^{\beta+s-1}\,dx\\ &= \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)} \times \frac{\Gamma(\alpha+r)\,\Gamma(\beta+s)}{\Gamma(\alpha+\beta+r+s)}\end{aligned}\)

Therefore

\(E\left[X^r(1-X)^s\right] = \frac{\Gamma(\alpha+r)}{\Gamma(\alpha)} \times \frac{\Gamma(\beta+s)}{\Gamma(\beta)} \times \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha+\beta+r+s)}\)

Since \(\Gamma(z+k) = z(z+1)\cdots(z+k-1)\,\Gamma(z)\) for any positive integer \(k\), each ratio of gamma functions reduces to a finite product:

\(E\left[X^r(1-X)^s\right] = \frac{\left[\alpha(\alpha+1)\cdots(\alpha+r-1)\right]\left[\beta(\beta+1)\cdots(\beta+s-1)\right]}{(\alpha+\beta)(\alpha+\beta+1)\cdots(\alpha+\beta+r+s-1)}\)

The value of \(E\left[X^r(1-X)^s\right]\) is \(\frac{\left[\alpha(\alpha+1)\cdots(\alpha+r-1)\right]\left[\beta(\beta+1)\cdots(\beta+s-1)\right]}{(\alpha+\beta)(\alpha+\beta+1)\cdots(\alpha+\beta+r+s-1)}\).
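The closed form can be checked numerically. The sketch below (parameter values are illustrative assumptions, not from the text) compares the gamma-ratio form from the derivation with the final product form; the two must agree:

```python
import math

def beta_moment_gamma(alpha, beta, r, s):
    # Gamma-ratio form:
    # Gamma(a+r)/Gamma(a) * Gamma(b+s)/Gamma(b) * Gamma(a+b)/Gamma(a+b+r+s)
    return (math.gamma(alpha + r) / math.gamma(alpha)
            * math.gamma(beta + s) / math.gamma(beta)
            * math.gamma(alpha + beta) / math.gamma(alpha + beta + r + s))

def beta_moment_product(alpha, beta, r, s):
    # Rising-factorial (product) form from the final line of the solution.
    num = 1.0
    for k in range(r):
        num *= alpha + k          # alpha (alpha+1) ... (alpha+r-1)
    for k in range(s):
        num *= beta + k           # beta (beta+1) ... (beta+s-1)
    den = 1.0
    for k in range(r + s):
        den *= alpha + beta + k   # (alpha+beta) ... (alpha+beta+r+s-1)
    return num / den

a, b, r, s = 2.0, 3.0, 2, 1  # illustrative parameters
print(beta_moment_gamma(a, b, r, s))    # ≈ 0.0857  (= 18/210)
print(beta_moment_product(a, b, r, s))  # same value
```

With \(\alpha = 2\), \(\beta = 3\), \(r = 2\), \(s = 1\), the product form gives \(\frac{(2\cdot 3)(3)}{5\cdot 6\cdot 7} = \frac{18}{210}\), matching the gamma-ratio evaluation.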


Most popular questions from this chapter

Let \(X_1, \ldots, X_n\) be i.i.d. random variables having the normal distribution with mean \(\mu\) and variance \(\sigma^2\). Define \(\overline{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i\), the sample mean. In this problem, we shall find the conditional distribution of each \(X_i\) given \(\overline{X}_n\).

a. Show that \(X_i\) and \(\overline{X}_n\) have the bivariate normal distribution with both means \(\mu\), variances \(\sigma^2\) and \(\frac{\sigma^2}{n}\), and correlation \(\frac{1}{\sqrt{n}}\).

Hint: Let \(Y = \sum_{j \neq i} X_j\).

Now show that \(Y\) and \(X_i\) are independent normal random variables, and that \(\overline{X}_n\) and \(X_i\) are linear combinations of \(Y\) and \(X_i\).

b. Show that the conditional distribution of \(X_i\) given \(\overline{X}_n = \overline{x}_n\) is normal with mean \(\overline{x}_n\) and variance \(\sigma^2\left(1 - \frac{1}{n}\right)\).

Consider again the electronic system described in Exercise 10, but suppose now that the system will continue to operate until two components have failed. Determine the mean and the variance of the length of time until the system fails.

Suppose that \(X\) and \(Y\) are independent random variables, \(X\) has the gamma distribution with parameters \(\alpha_1\) and \(\beta\), and \(Y\) has the gamma distribution with parameters \(\alpha_2\) and \(\beta\). Let \(U = X/(X + Y)\) and \(V = X + Y\). Show that (a) \(U\) has the beta distribution with parameters \(\alpha_1\) and \(\alpha_2\), and (b) \(U\) and \(V\) are independent.

Review the derivation of the Black-Scholes formula (5.6.18). For this exercise, assume that our stock price at time u in the future is

\(S_0 e^{\mu u + W_u}\), where \(W_u\) has the gamma distribution with parameters \(\alpha u\) and \(\beta\) with \(\beta > 1\). Let \(r\) be the risk-free interest rate.

a. Prove that \(e^{-ru}E\left(S_u\right) = S_0\) if and only if \(\mu = r - \alpha\log\left(\frac{\beta}{\beta-1}\right)\).

b. Assume that \(\mu = r - \alpha\log\left(\frac{\beta}{\beta-1}\right)\). Let \(R\) be 1 minus the c.d.f. of the gamma distribution with parameters \(\alpha u\) and 1. Prove that the risk-neutral price for the option to buy one share of the stock for the price \(q\) at time \(u\) is \(S_0 R\left(c[\beta-1]\right) - qe^{-ru}R\left(c\beta\right)\), where \(c = \log\left(\frac{q}{S_0}\right) + \alpha u \log\left(\frac{\beta}{\beta-1}\right) - ru\).

c. Find the price for the option being considered when \(u = 1\), \(q = S_0\), \(r = 0.06\), \(\alpha = 1\), and \(\beta = 10\).

Prove Corollary 5.9.2.
