
Let \(X_1, X_2, \ldots, X_n\) be a random sample from a pdf \(f(x)\) that is symmetric about \(\mu\), so that \(\tilde X\) (the sample median) is an unbiased estimator of \(\mu\). If \(n\) is large, it can be shown that \(V(\tilde X) \approx 1/\left(4n(f(\mu))^2\right)\).

a. Compare \(V(\tilde X)\) to \(V(\bar X)\) when the underlying distribution is normal.

b. When the underlying pdf is Cauchy (see Example 6.7), \(V(\bar X) = \infty\), so \(\bar X\) is a terrible estimator. What is \(V(\tilde X)\) in this case when \(n\) is large?

Short Answer


a) \(V(\tilde X) > V(\bar X)\): for a normal distribution, the sample median has the larger variance.

b) The variance is \(V(\tilde X) \approx \frac{\pi^2\beta^2}{4n}\).

Step by step solution

01

Introduction

An estimator is a rule for computing an estimate of a quantity of interest from observed data. The rule itself (the estimator), the quantity being estimated (the estimand), and the value the rule produces (the estimate) are all distinct.

02

Explanation

The pdf \(f(x)\) of a normally distributed random variable with parameters \(\mu\) and \(\sigma\) is

\(f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\}, \quad x \in \mathbb{R}.\)

As given in the exercise, it can be shown that

\(V(\tilde X) \approx \frac{1}{4n(f(\mu))^2},\)

where \(f(\mu)\) is

\(\begin{aligned} f(\mu) &= \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left\{-\frac{(\mu-\mu)^2}{2\sigma^2}\right\}\\ &= \frac{1}{\sqrt{2\pi}\,\sigma}.\end{aligned}\)

Then, the variance is

\(V(\tilde X) \approx \frac{1}{4n(f(\mu))^2} = \frac{2\pi\sigma^2}{4n} = \frac{\pi}{2}\cdot\frac{\sigma^2}{n}.\)

Also, the variance of the sample mean \(\bar X\) is

\(\begin{aligned} V(\bar X) &= V\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)\\ &= \frac{1}{n^2}\sum_{i=1}^{n} V(X_i) \qquad (1)\\ &= \frac{1}{n^2}\cdot n\cdot V(X_1)\\ &= \frac{\sigma^2}{n}.\end{aligned}\)

(1): the \(X_i\) are independent, so the variance of the sum is the sum of the variances; they are also identically distributed, which gives the next step.

It is obvious that

\(V(\tilde X) \approx \frac{\pi}{2}\cdot\frac{\sigma^2}{n} > \frac{\sigma^2}{n} = V(\bar X)\)

because

\(\frac{{\rm{\pi }}}{{\rm{2}}}{\rm{ > 1}}\)

Therefore, \(V(\tilde X) > V(\bar X)\): when the underlying distribution is normal, the sample mean is the better (lower-variance) estimator.
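
As a quick sanity check (an illustration, not part of the original solution), the \(\pi/2\) variance ratio can be verified by simulation. The sketch below assumes NumPy is available; the sample size, replication count, and \(\sigma\) are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps, sigma = 100, 20_000, 1.0

    # reps independent normal samples of size n, each with mean 0 and standard deviation sigma
    samples = rng.normal(loc=0.0, scale=sigma, size=(reps, n))

    var_mean = samples.mean(axis=1).var()          # should be near sigma**2 / n
    var_median = np.median(samples, axis=1).var()  # should be near pi * sigma**2 / (2 * n)

    print(var_mean, sigma**2 / n)
    print(var_median, np.pi * sigma**2 / (2 * n))
    print(var_median / var_mean)                   # ratio should be close to pi / 2, about 1.57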

03

Explanation

The pdf \(f(x)\) of a Cauchy distributed random variable with location \(\mu\) and scale parameter \(\beta\) is

\(f(x) = \frac{1}{\pi\beta\left(1 + ((x-\mu)/\beta)^2\right)}, \quad x \in \mathbb{R}.\)

As given in the exercise, it can be shown that

\(V(\tilde X) \approx \frac{1}{4n(f(\mu))^2},\)

where \(f(\mu)\) is

\(\begin{aligned} f(\mu) &= \frac{1}{\pi\beta\left(1 + ((\mu-\mu)/\beta)^2\right)}\\ &= \frac{1}{\pi\beta}.\end{aligned}\)

Therefore, for large \(n\) the variance is \(V(\tilde X) \approx \frac{1}{4n(f(\mu))^2} = \frac{\pi^2\beta^2}{4n},\) which is finite, so the sample median remains a sensible estimator of \(\mu\) even though \(\bar X\) is not.
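
A similar simulation sketch (again only an illustration, not part of the original solution) suggests the same conclusion for Cauchy data: the sample median's variance is close to \(\pi^2\beta^2/(4n)\), while the sample mean does not stabilize. NumPy is assumed; \(n\), the replication count, and \(\beta\) are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)
    n, reps, beta = 100, 20_000, 1.0

    # reps independent Cauchy samples of size n, centered at mu = 0 with scale beta
    samples = beta * rng.standard_cauchy(size=(reps, n))

    var_median = np.median(samples, axis=1).var()
    print(var_median, np.pi**2 * beta**2 / (4 * n))  # empirical vs. asymptotic pi^2 * beta^2 / (4n)

    # the sample mean of Cauchy data has no finite variance, so this value is erratic
    print(samples.mean(axis=1).var())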


Most popular questions from this chapter

The accompanying data on flexural strength (MPa) for concrete beams of a certain type was introduced in Example 1.2.

\(\begin{array}{*{20}{r}}{{\rm{5}}{\rm{.9}}}&{{\rm{7}}{\rm{.2}}}&{{\rm{7}}{\rm{.3}}}&{{\rm{6}}{\rm{.3}}}&{{\rm{8}}{\rm{.1}}}&{{\rm{6}}{\rm{.8}}}&{{\rm{7}}{\rm{.0}}}\\{{\rm{7}}{\rm{.6}}}&{{\rm{6}}{\rm{.8}}}&{{\rm{6}}{\rm{.5}}}&{{\rm{7}}{\rm{.0}}}&{{\rm{6}}{\rm{.3}}}&{{\rm{7}}{\rm{.9}}}&{{\rm{9}}{\rm{.0}}}\\{{\rm{3}}{\rm{.2}}}&{{\rm{8}}{\rm{.7}}}&{{\rm{7}}{\rm{.8}}}&{{\rm{9}}{\rm{.7}}}&{{\rm{7}}{\rm{.4}}}&{{\rm{7}}{\rm{.7}}}&{{\rm{9}}{\rm{.7}}}\\{{\rm{7}}{\rm{.3}}}&{{\rm{7}}{\rm{.7}}}&{{\rm{11}}{\rm{.6}}}&{{\rm{11}}{\rm{.3}}}&{{\rm{11}}{\rm{.8}}}&{{\rm{10}}{\rm{.7}}}&{}\end{array}\)

a. Calculate a point estimate of the mean value of strength for the conceptual population of all beams manufactured in this fashion, and state which estimator you used. \({\rm{(Hint:\Sigma }}{{\rm{x}}_{\rm{i}}}{\rm{ = 219}}{\rm{.8}}{\rm{.)}}\)

b. Calculate a point estimate of the strength value that separates the weakest 50% of all such beams from the strongest 50%, and state which estimator you used.

c. Calculate and interpret a point estimate of the population standard deviation\({\rm{\sigma }}\). Which estimator did you use?\({\rm{(Hint:}}\left. {{\rm{\Sigma x}}_{\rm{i}}^{\rm{2}}{\rm{ = 1860}}{\rm{.94}}{\rm{.}}} \right)\)

d. Calculate a point estimate of the proportion of all such beams whose flexural strength exceeds\({\rm{10MPa}}\). (Hint: Think of an observation as a "success" if it exceeds 10.)

e. Calculate a point estimate of the population coefficient of variation\({\rm{\sigma /\mu }}\), and state which estimator you used.

Let\({\rm{X}}\)have a Weibull distribution with parameters\({\rm{\alpha }}\)and\({\rm{\beta }}\), so

\(\begin{aligned} E(X) &= \beta\,\Gamma(1 + 1/\alpha)\\ V(X) &= \beta^2\left\{\Gamma(1 + 2/\alpha) - (\Gamma(1 + 1/\alpha))^2\right\}\end{aligned}\)

a. Based on a random sample\({{\rm{X}}_{\rm{1}}}{\rm{, \ldots ,}}{{\rm{X}}_{\rm{n}}}\), write equations for the method of moments estimators of\({\rm{\beta }}\)and\({\rm{\alpha }}\). Show that, once the estimate of\({\rm{\alpha }}\)has been obtained, the estimate of\({\rm{\beta }}\)can be found from a table of the gamma function and that the estimate of\({\rm{\alpha }}\)is the solution to a complicated equation involving the gamma function.

b. If\({\rm{n = 20,\bar x = 28}}{\rm{.0}}\), and\({\rm{\Sigma x}}_{\rm{i}}^{\rm{2}}{\rm{ = 16,500}}\), compute the estimates. (Hint:\(\left. {{{{\rm{(\Gamma (1}}{\rm{.2))}}}^{\rm{2}}}{\rm{/\Gamma (1}}{\rm{.4) = }}{\rm{.95}}{\rm{.}}} \right)\)

When the sample standard deviation S is based on a random sample from a normal population distribution, it can be shown that \({\rm{E(S) = }}\sqrt {{\rm{2/(n - 1)}}} {\rm{\Gamma (n/2)\sigma /\Gamma ((n - 1)/2)}}\)

Use this to obtain an unbiased estimator for \({\rm{\sigma }}\) of the form \({\rm{cS}}\). What is \({\rm{c}}\) when \({\rm{n = 20}}\)?
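
As a side note (not part of the original question), once the stated expression for \({\rm{E(S)}}\) is accepted, the constant \({\rm{c}}\) for any \({\rm{n}}\) can be evaluated numerically; the helper below is a hypothetical illustration using only the Python standard library.

    from math import gamma, sqrt

    def unbiasing_constant(n: int) -> float:
        # E(S) = sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2) * sigma,
        # so E(c*S) = sigma requires c to be the reciprocal of that coefficient.
        return gamma((n - 1) / 2) / gamma(n / 2) * sqrt((n - 1) / 2)

    print(unbiasing_constant(20))  # approximately 1.013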

Let \({{\rm{X}}_{\rm{1}}}{\rm{, \ldots ,}}{{\rm{X}}_{\rm{n}}}\) be a random sample from a pdf that is symmetric about \({\rm{\mu }}\). An estimator for \({\rm{\mu }}\) that has been found to perform well for a variety of underlying distributions is the Hodges–Lehmann estimator. To define it, first compute for each \({\rm{i}} \le {\rm{j}}\) and each \({\rm{j = 1,2, \ldots ,n}}\) the pairwise average \({{\rm{\bar X}}_{{\rm{i,j}}}}{\rm{ = }}\left( {{{\rm{X}}_{\rm{i}}}{\rm{ + }}{{\rm{X}}_{\rm{j}}}} \right){\rm{/2}}\). Then the estimator is \({\rm{\hat \mu = }}\) the median of the \({{\rm{\bar X}}_{{\rm{i,j}}}}{\rm{'s}}\). Compute the value of this estimate using the data of Exercise \({\rm{44}}\) of Chapter \({\rm{1}}\). (Hint: Construct a square table with the \({{\rm{x}}_{\rm{i}}}{\rm{'s}}\) listed on the left margin and on top. Then compute averages on and above the diagonal.)

Each of 150 newly manufactured items is examined and the number of scratches per item is recorded (the items are supposed to be free of scratches), yielding the following data:

Assume that X represents the number of scratches on a randomly selected item and that X has a Poisson distribution with parameter \({\rm{\mu }}\).

a. Find an unbiased estimator of \({\rm{\mu }}\) and compute the estimate for the data. (Hint: for X Poisson, \({\rm{E(X) = \mu }}\), so \({\rm{E(\bar X) = ?}}\))

c. What is the standard deviation (standard error) of your estimator? Compute the estimated standard error. (Hint: \({\rm{\sigma }}_{\rm{X}}^{\rm{2}}{\rm{ = \mu }}\).)
