
Question: Suppose that a random variable X has a normal distribution for which the mean μ is unknown (−∞ < μ < ∞) and the variance σ² is known. Let \(f\left( {x\left| \mu \right.} \right)\) denote the p.d.f. of X, and let \(f'\left( {x\left| \mu \right.} \right)\) and \(f''\left( {x\left| \mu \right.} \right)\) denote its first and second partial derivatives with respect to μ. Show that

\(\int_{ - \infty }^\infty {f'\left( {x\left| \mu \right.} \right)} dx = 0\,\,{\rm{and}}\,\,\int_{ - \infty }^\infty {f''\left( {x\left| \mu \right.} \right)} dx = 0.\)

Short Answer


Proved.

Step by step solution

01

Given information

X has a normal distribution with unknown mean \(\mu \) and known variance \({\sigma ^2}\).

\(f\left( {x\left| \mu \right.} \right)\) denotes the p.d.f. of X.

\(f'\left( {x\left| \mu \right.} \right)\) and \(f''\left( {x\left| \mu \right.} \right)\) denote the first- and second-order partial derivatives of \(f\left( {x\left| \mu \right.} \right)\) with respect to \(\mu \).

02

Proof

\(f\left( {x\left| \mu \right.} \right) = \frac{1}{{\sqrt {2\pi } \sigma }}\exp \left\{ { - \frac{1}{{2{\sigma ^2}}}{{\left( {x - \mu } \right)}^2}} \right\}\)

The first-order partial derivative with respect to \(\mu \) is

\(\begin{array}{c}\frac{\partial }{{\partial \mu }}f\left( {x\left| \mu \right.} \right) = \frac{\partial }{{\partial \mu }}\left( {\frac{1}{{\sqrt {2\pi } \sigma }}\exp \left\{ { - \frac{1}{{2{\sigma ^2}}}{{\left( {x - \mu } \right)}^2}} \right\}} \right)\\ = \frac{1}{{\sqrt {2\pi } \sigma }}\frac{{\left( {x - \mu } \right)}}{{{\sigma ^2}}}\exp \left\{ { - \frac{1}{{2{\sigma ^2}}}{{\left( {x - \mu } \right)}^2}} \right\}\\ = \frac{{\left( {x - \mu } \right)}}{{{\sigma ^2}}}f\left( {x\left| \mu \right.} \right)\end{array}\)

The second-order partial derivative is

\(\begin{array}{c}\frac{\partial }{{\partial \mu }}f'\left( {x\left| \mu \right.} \right) = \frac{\partial }{{\partial \mu }}\left( {\frac{{\left( {x - \mu } \right)}}{{{\sigma ^2}}}f\left( {x\left| \mu \right.} \right)} \right)\\ = \left[ {\frac{{{{\left( {x - \mu } \right)}^2}}}{{{\sigma ^4}}} - \frac{1}{{{\sigma ^2}}}} \right]f\left( {x\left| \mu \right.} \right)\end{array}\)
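Both derivative formulas can be verified symbolically; a minimal sketch using SymPy (not part of the textbook solution):

```python
import sympy as sp

x, mu = sp.symbols('x mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# Normal p.d.f. f(x | mu) with known variance sigma^2
f = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sp.sqrt(2 * sp.pi) * sigma)

# First partial derivative with respect to mu:
# should equal (x - mu) / sigma^2 * f(x | mu)
f1 = sp.diff(f, mu)
assert sp.simplify(f1 - (x - mu) / sigma**2 * f) == 0

# Second partial derivative with respect to mu:
# should equal ((x - mu)^2 / sigma^4 - 1 / sigma^2) * f(x | mu)
f2 = sp.diff(f, mu, 2)
assert sp.simplify(f2 - ((x - mu)**2 / sigma**4 - 1 / sigma**2) * f) == 0
```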

Therefore,

\(\begin{array}{c}\int_{ - \infty }^\infty {f'\left( {x\left| \mu \right.} \right)} dx = \frac{1}{{{\sigma ^2}}}\int_{ - \infty }^\infty {\left( {x - \mu } \right)} f\left( {x\left| \mu \right.} \right)dx\\ = \frac{1}{{{\sigma ^2}}}E\left( {X - \mu } \right)\\ = \frac{1}{{{\sigma ^2}}}\left( {\mu  - \mu } \right)\\ = 0\end{array}\)

Hence, proved.
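The first identity can also be checked numerically for concrete parameter values; a small sketch with NumPy, using the hypothetical values μ = 1.5 and σ = 2:

```python
import numpy as np

mu, sigma = 1.5, 2.0  # hypothetical example values

# Grid wide enough that the Gaussian tails are negligible
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200001)
dx = x[1] - x[0]
f = np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# f'(x | mu) = (x - mu) / sigma^2 * f(x | mu)
f_prime = (x - mu) / sigma**2 * f

# Riemann-sum approximation of the integral over the real line
integral = np.sum(f_prime) * dx
assert abs(integral) < 1e-8  # the integrand is odd about mu, so it cancels
```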

Again,

\(\begin{array}{c}\int_{ - \infty }^\infty {f''\left( {x\left| \mu \right.} \right)} dx = \frac{1}{{{\sigma ^4}}}\int_{ - \infty }^\infty {{{\left( {x - \mu } \right)}^2}f\left( {x\left| \mu \right.} \right)dx}  - \frac{1}{{{\sigma ^2}}}\int_{ - \infty }^\infty {f\left( {x\left| \mu \right.} \right)dx} \\ = \frac{1}{{{\sigma ^4}}}E\left[ {{{\left( {X - \mu } \right)}^2}} \right] - \frac{1}{{{\sigma ^2}}}\\ = \frac{{{\sigma ^2}}}{{{\sigma ^4}}} - \frac{1}{{{\sigma ^2}}}\\ = \frac{1}{{{\sigma ^2}}} - \frac{1}{{{\sigma ^2}}}\\ = 0\end{array}\)

Hence, proved.
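The second identity can be checked the same way: both terms of the second derivative integrate to \(1/\sigma^2\), so their difference vanishes. A sketch with the same hypothetical values μ = 1.5 and σ = 2:

```python
import numpy as np

mu, sigma = 1.5, 2.0  # hypothetical example values

x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200001)
dx = x[1] - x[0]
f = np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# f''(x | mu) = ((x - mu)^2 / sigma^4 - 1 / sigma^2) * f(x | mu)
f_double_prime = ((x - mu)**2 / sigma**4 - 1 / sigma**2) * f

# (1/sigma^4) E[(X - mu)^2] = sigma^2 / sigma^4 = 1 / sigma^2, which is
# exactly cancelled by the (1/sigma^2) * integral-of-f term
integral = np.sum(f_double_prime) * dx
assert abs(integral) < 1e-6
```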


