
Problem 1

Let \(R\) be binomial with probability \(\pi\) and denominator \(m\), and consider estimators of \(\pi\) of the form \(T=(R+a)/(m+b)\), for \(a, b \geq 0\). Find a condition under which \(T\) has lower mean squared error than the maximum likelihood estimator \(R/m\), and discuss which is preferable when \(m=5\) and \(m=10\).
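The comparison can be checked numerically from the closed-form mean squared errors. The following sketch uses the illustrative choice \(a=1\), \(b=2\) (so \(T=(R+1)/(m+2)\)); these constants are not part of the problem, just one example.

```python
# MSE of T = (R + a)/(m + b) versus the MLE R/m for R ~ Binomial(m, pi).
# E(T) = (m*pi + a)/(m + b) and var(T) = m*pi*(1 - pi)/(m + b)^2, so
# MSE(T) = [m*pi*(1 - pi) + (a - b*pi)^2] / (m + b)^2.

def mse_shrunk(pi, m, a, b):
    return (m * pi * (1 - pi) + (a - b * pi) ** 2) / (m + b) ** 2

def mse_mle(pi, m):
    return pi * (1 - pi) / m

# Illustrative choice a = 1, b = 2: near pi = 1/2 the shrunken estimator
# wins; near the endpoints the MLE wins.
m = 5
better_at_half = mse_shrunk(0.5, m, 1, 2) < mse_mle(0.5, m)
worse_at_edge = mse_shrunk(0.99, m, 1, 2) > mse_mle(0.99, m)
```

Sweeping \(\pi\) over \((0,1)\) for \(m=5\) and \(m=10\) shows the region where \(T\) beats the MLE shrinking as \(m\) grows.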

Problem 2

Let \(T=a \sum\left(Y_{j}-\bar{Y}\right)^{2}\) be an estimator of \(\sigma^{2}\) based on a normal random sample. Find values of \(a\) that minimize the bias and mean squared error of \(T\).
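A numeric sketch of the answer: with \(S=\sum(Y_j-\bar Y)^2 \sim \sigma^2\chi^2_{n-1}\), unbiasedness requires \(a=1/(n-1)\), and minimizing the mean squared error gives \(a=1/(n+1)\); the grid search below checks the latter against the closed form.

```python
# With S = sum (Y_j - Ybar)^2 ~ sigma^2 * chi^2_{n-1}:
#   E(aS)   = a (n-1) sigma^2          -> unbiased iff a = 1/(n-1)
#   MSE(aS) = sigma^4 [2(n-1)a^2 + ((n-1)a - 1)^2]
# Grid-minimize the MSE (take sigma = 1) and compare with 1/(n+1).

def mse(a, n):
    return 2 * (n - 1) * a ** 2 + ((n - 1) * a - 1) ** 2

n = 10
grid = [k * 1e-5 for k in range(1, 20000)]
a_hat = min(grid, key=lambda a: mse(a, n))
```

At the unbiased choice \(a=1/(n-1)\) the MSE reduces to the familiar variance \(2\sigma^4/(n-1)\), strictly larger than the minimum at \(a=1/(n+1)\).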

Problem 2

Let \(Y_{1}, \ldots, Y_{n}\) be a random sample from an unknown density \(f\). Let \(I_{j}\) indicate whether or not \(Y_{j}\) lies in the interval \(\left(a-\frac{1}{2} h, a+\frac{1}{2} h\right]\), and consider \(R=\sum I_{j}\). Show that \(R\) has a binomial distribution with denominator \(n\) and probability $$ \int_{a-\frac{1}{2} h}^{a+\frac{1}{2} h} f(y)\, d y. $$ Hence show that \(R /(n h)\) has approximate mean and variance \(f(a)+\frac{1}{24} h^{2} f^{\prime \prime}(a)\) and \(f(a) /(n h)\), where \(f^{\prime \prime}\) is the second derivative of \(f\). What are the implications of these results for using the histogram to estimate \(f(a)\)?
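The approximate mean can be checked exactly for a specific density. The sketch below takes \(f\) standard normal and \(a=0\) (an illustrative choice): Taylor-expanding \(f\) about the bin centre gives \(p/h \approx f(a) + h^2 f''(a)/24\), and for the \(N(0,1)\) density \(f''(0)=-f(0)\).

```python
# Exact check of E{R/(nh)} = p/h for f = standard normal at a = 0, where
# p = Phi(h/2) - Phi(-h/2).  Taylor expansion about the bin centre gives
# p/h ~ f(0) + h^2 f''(0)/24, with f''(0) = -f(0) for the N(0,1) density.
import math

def Phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

f0 = 1.0 / math.sqrt(2.0 * math.pi)   # f(0) for N(0,1)
h = 0.1
exact_mean = (Phi(h / 2) - Phi(-h / 2)) / h
approx_mean = f0 + h ** 2 * (-f0) / 24.0
```

The remaining error is \(O(h^4)\), so for \(h=0.1\) the two agree to several decimal places; note also that `exact_mean < f0`: at a mode (\(f''<0\)) the histogram systematically underestimates the density.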

Problem 3

If \(U \sim U(0,1)\), show that \(\min (U, 1-U) \sim U\left(0, \frac{1}{2}\right)\). Hence justify the computation of a two-sided significance level as \(2 \min \left(P^{-}, P^{+}\right)\).
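The CDF argument is short: for \(0 \le t \le \frac12\), \(\Pr\{\min(U,1-U)\le t\} = \Pr(U\le t)+\Pr(U\ge 1-t) = 2t\). A Monte Carlo sketch confirming this:

```python
# P{min(U, 1-U) <= t} = P(U <= t) + P(U >= 1 - t) = 2t for 0 <= t <= 1/2,
# i.e. min(U, 1-U) ~ U(0, 1/2).  Check the empirical CDF at a few points.
import random

random.seed(1)
n = 200_000
draws = [min(u, 1 - u) for u in (random.random() for _ in range(n))]

def ecdf(t):
    return sum(d <= t for d in draws) / n
```

Since \(2\min(P^-,P^+)\) is exactly \(2\min(U,1-U)\) with \(U\) the one-sided p-value, it is uniform on \((0,1)\) under the null, which justifies its use as a two-sided significance level.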

Problem 3

Suppose that the random variables \(Y_{1}, \ldots, Y_{n}\) are such that $$ \mathrm{E}\left(Y_{j}\right)=\mu, \quad \operatorname{var}\left(Y_{j}\right)=\sigma_{j}^{2}, \quad \operatorname{cov}\left(Y_{j}, Y_{k}\right)=0, \quad j \neq k, $$ where \(\mu\) is unknown and the \(\sigma_{j}^{2}\) are known. Show that the linear combination of the \(Y_{j}\)'s giving an unbiased estimator of \(\mu\) with minimum variance is $$ \sum_{j=1}^{n} \sigma_{j}^{-2} Y_{j} / \sum_{j=1}^{n} \sigma_{j}^{-2}. $$ Suppose now that \(Y_{j}\) is normally distributed with mean \(\beta x_{j}\) and unit variance, and that the \(Y_{j}\) are independent, with \(\beta\) an unknown parameter and the \(x_{j}\) known constants. Which of the estimators $$ T_{1}=n^{-1} \sum_{j=1}^{n} Y_{j} / x_{j}, \quad T_{2}=\sum_{j=1}^{n} Y_{j} x_{j} / \sum_{j=1}^{n} x_{j}^{2} $$ is preferable, and why?
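Both estimators are unbiased, so the comparison comes down to variance, and both variances are available in closed form: \(\operatorname{var}(T_1)=n^{-2}\sum x_j^{-2}\) and \(\operatorname{var}(T_2)=1/\sum x_j^2\). A sketch with arbitrary illustrative design points:

```python
# When Y_j ~ N(beta * x_j, 1) independently, T1 and T2 are both unbiased with
#   var(T1) = n^{-2} * sum(1/x_j^2),   var(T2) = 1 / sum(x_j^2).
# Cauchy-Schwarz gives var(T2) <= var(T1), equality iff all |x_j| are equal.
x = [0.5, 1.0, 2.0, 4.0]   # arbitrary illustrative design points
n = len(x)
var_T1 = sum(1.0 / xj ** 2 for xj in x) / n ** 2
var_T2 = 1.0 / sum(xj ** 2 for xj in x)
```

\(T_2\) coincides with the minimum-variance linear unbiased estimator from the first part (weights proportional to the inverse variances of the \(Y_j/x_j\)), which is why it is preferable.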

Problem 4

In \(n\) independent food samples the bacterial counts \(Y_{1}, \ldots, Y_{n}\) are presumed to be Poisson random variables with mean \(\theta\). It is required to estimate the probability that a given sample would be uncontaminated, \(\pi=\operatorname{Pr}\left(Y_{j}=0\right)\). Show that \(U=n^{-1} \sum I\left(Y_{j}=0\right)\), the proportion of the samples uncontaminated, is unbiased for \(\pi\), and find its variance. Using the Rao–Blackwell theorem or otherwise, show that an unbiased estimator of \(\pi\) having smaller variance than \(U\) is \(V=\{(n-1)/n\}^{n \bar{Y}}\), where \(\bar{Y}=n^{-1} \sum Y_{j}\). Is this a minimum variance unbiased estimator of \(\pi\)? Find \(\operatorname{var}(V)\) and hence give the asymptotic efficiency of \(U\) relative to \(V\).
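A Monte Carlo sketch of both claims, with illustrative values \(\theta=1\), \(n=5\): note that \(V=\{(n-1)/n\}^{S}\) with \(S=n\bar Y \sim \text{Poisson}(n\theta)\), so \(\mathrm{E}(V)=e^{-\theta}\) exactly, and Rao-Blackwellization onto the sufficient statistic \(S\) reduces the variance.

```python
# U = proportion of zero counts; V = ((n-1)/n)^(n*Ybar) = ((n-1)/n)^S with
# S = sum(Y_j) ~ Poisson(n*theta).  Both are unbiased for pi = exp(-theta);
# V, a function of the sufficient statistic, has smaller variance.
import math, random

random.seed(2)

def rpois(mu):
    # Knuth's multiplicative method; fine for small mu.
    L, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

theta, n, reps = 1.0, 5, 50_000
u_vals, v_vals = [], []
for _ in range(reps):
    y = [rpois(theta) for _ in range(n)]
    u_vals.append(sum(yj == 0 for yj in y) / n)
    v_vals.append(((n - 1) / n) ** sum(y))

mean_u = sum(u_vals) / reps
mean_v = sum(v_vals) / reps
var_u = sum((u - mean_u) ** 2 for u in u_vals) / reps
var_v = sum((v - mean_v) ** 2 for v in v_vals) / reps
```

The exact values here are \(\operatorname{var}(U)=\pi(1-\pi)/n\) and \(\operatorname{var}(V)=e^{-2\theta}(e^{\theta/n}-1)\), which the simulated variances track.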

Problem 5

Let \(Y_{1}, \ldots, Y_{n}\) be independent Poisson variables with means \(x_{1} \beta, \ldots, x_{n} \beta\), where \(\beta>0\) is an unknown scalar and the \(x_{j}>0\) are known scalars. Show that \(T=\sum Y_{j} x_{j} / \sum x_{j}^{2}\) is an unbiased estimator of \(\beta\) and find its variance. Find a minimal sufficient statistic \(S\) for \(\beta\), and show that the conditional distribution of \(Y_{1}, \ldots, Y_{n}\) given that \(S=s\) is multinomial, so that \(Y_{j}\) has conditional mean \(s x_{j} / \sum_{i} x_{i}\). Hence find the minimum variance unbiased estimator of \(\beta\). Is it unique?
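The variance comparison can be sketched in closed form: \(\operatorname{var}(T)=\beta\sum x_j^3/(\sum x_j^2)^2\), while the estimator \(S/\sum x_j\) based on \(S=\sum Y_j \sim \text{Poisson}(\beta\sum x_j)\) has variance \(\beta/\sum x_j\). The values of \(\beta\) and \(x\) below are illustrative.

```python
# var(T) = beta * sum(x^3) / (sum(x^2))^2, since var(Y_j) = beta * x_j, while
# the estimator S / sum(x) based on S = sum(Y_j) ~ Poisson(beta * sum(x))
# has variance beta / sum(x).  Cauchy-Schwarz, (sum x^2)^2 <= sum(x)*sum(x^3),
# shows the sufficient-statistic estimator is never worse.
beta = 2.0                       # illustrative parameter value
x = [0.5, 1.0, 1.5, 3.0]         # illustrative design points
var_T = beta * sum(xj ** 3 for xj in x) / sum(xj ** 2 for xj in x) ** 2
var_S = beta / sum(x)
```

Equality holds only when all the \(x_j\) are equal, so for unequal design points the estimator built on \(S\) strictly improves on \(T\).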

Problem 6

Consider testing the hypothesis that a binomial random variable has probability \(\pi=1 / 2\) against the alternative that \(\pi>1 / 2\). For what values of \(\alpha\) does a uniformly most powerful test exist when the denominator is \(m=5\) ?
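By monotone likelihood ratio the UMP test rejects for large values of \(R\), so without randomization a UMP test exists only at the attainable sizes \(\Pr(R \ge k \mid \pi=1/2)\). These can be enumerated directly:

```python
# Under H0: R ~ Binomial(5, 1/2).  The UMP test rejects for R >= k, so the
# attainable sizes are the upper-tail probabilities P(R >= k), k = 0, ..., 6.
from math import comb

m = 5
pmf = [comb(m, r) / 2 ** m for r in range(m + 1)]
alphas = [sum(pmf[r:]) for r in range(m + 1)] + [0.0]
# alphas = [1, 31/32, 26/32, 16/32, 6/32, 1/32, 0]
```

So a (non-randomized) UMP test exists exactly for \(\alpha \in \{0, 1/32, 6/32, 16/32, 26/32, 31/32, 1\}\); other levels require randomization.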

Problem 6

Given that there is a \(1-1\) mapping between \(x_{1}<\cdots\)

Problem 7

Find minimum variance unbiased estimators of \(\lambda^{2}, e^{\lambda}\), and \(e^{-n \lambda}\) based on a random sample \(Y_{1}, \ldots, Y_{n}\) from a Poisson density with mean \(\lambda\). Show that no unbiased estimator of \(\log \lambda\) exists.
