Given \(f(x ; \theta)=1 / \theta, 0<x<\theta\), zero elsewhere, with \(\theta>0\), formally compute the reciprocal of $$ n E\left\{\left[\frac{\partial \log f(X ; \theta)}{\partial \theta}\right]^{2}\right\} $$ Compare this with the variance of \((n+1) Y_{n} / n\), where \(Y_{n}\) is the largest observation of a random sample of size \(n\) from this distribution. Comment.

Short Answer

The reciprocal of \(n E\left\{[\partial \log f(X;\theta)/\partial \theta]^{2}\right\}\) is \(\theta^2/n\). The variance of \((n+1)Y_n/n\), where \(Y_n\) is the largest observation in a random sample of size \(n\), is \(\theta^2/[n(n+2)]\). The variance is smaller than \(\theta^2/n\); this does not contradict the Rao-Cramér inequality, because the support of \(f\) depends on \(\theta\) and so the regularity conditions fail.

Step by step solution

01

Computing derivative

The first step is to compute the derivative of the logarithm of \(f(x;\theta)\) with respect to \(\theta\). The function \(f(x;\theta) = 1/\theta\) has support \(0 < x < \theta\), and its logarithm is \(\log f(x;\theta) = \log(1/\theta) = -\log \theta\). The derivative of this with respect to \(\theta\) is \(-1/\theta\), and its square is \(1/\theta^2\).
02

Computing expectation

The next step is to compute the expectation \(E\left\{ \left[ -1/\theta \right]^2 \right\} = E\left\{ 1/\theta^2 \right\}\). Since \(1/\theta^2\) does not depend on \(x\), the expectation of this constant is the constant itself: \(\int_0^\theta \frac{1}{\theta^2}\cdot\frac{1}{\theta}\,dx = \frac{1}{\theta^2}\). Multiplying by \(n\) gives \(n/\theta^2\), whose reciprocal is \(\theta^2/n\). This is the formal Rao-Cramér lower bound.
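The arithmetic above can be sanity-checked in a few lines (a minimal sketch; the values of \(n\) and \(\theta\) are illustrative choices, not from the problem):

```python
# Check that the reciprocal of n * E[(d/d(theta) log f(X; theta))^2]
# equals theta^2 / n for the uniform density f(x; theta) = 1/theta.
n, theta = 5, 2.0            # illustrative values, not from the problem

score = -1.0 / theta         # d/d(theta) of log(1/theta) = -1/theta, for every x
information = n * score**2   # the score is constant in x, so E[score^2] = 1/theta^2
reciprocal = 1.0 / information

print(reciprocal, theta**2 / n)   # both equal 0.8
```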
03

Find the variance

The final step is to compute the variance of \((n+1)Y_n/n\), where \(Y_n\) is the largest observation in a random sample of size \(n\). The pdf of \(Y_n\) is \(g(y) = n y^{n-1}/\theta^n\) for \(0 < y < \theta\), so \(E(Y_n) = n\theta/(n+1)\) and \(E(Y_n^2) = n\theta^2/(n+2)\). Hence $$ \operatorname{Var}(Y_n) = \frac{n\theta^2}{n+2} - \frac{n^2\theta^2}{(n+1)^2} = \frac{n\theta^2}{(n+1)^2(n+2)}, $$ and therefore $$ \operatorname{Var}\left(\frac{(n+1)Y_n}{n}\right) = \frac{(n+1)^2}{n^2}\operatorname{Var}(Y_n) = \frac{\theta^2}{n(n+2)}. $$ Since \(E[(n+1)Y_n/n] = \theta\), this estimator is unbiased, yet its variance \(\theta^2/[n(n+2)]\) is smaller than the formal bound \(\theta^2/n\). There is no contradiction: the support of \(f(x;\theta)\) depends on \(\theta\), so the regularity conditions underlying the Rao-Cramér inequality do not hold.
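The comparison can also be checked numerically with a small Monte Carlo simulation (a sketch; \(\theta = 2\), \(n = 5\), the trial count, and the seed are illustrative choices, not from the text):

```python
import random

def variance_of_estimator(theta, n, trials=200_000, seed=42):
    """Monte Carlo estimate of Var[(n+1) * Y_n / n], where Y_n is the
    maximum of n iid Uniform(0, theta) observations."""
    rng = random.Random(seed)
    values = []
    for _ in range(trials):
        y_n = max(rng.uniform(0, theta) for _ in range(n))
        values.append((n + 1) * y_n / n)
    mean = sum(values) / trials
    return sum((v - mean) ** 2 for v in values) / (trials - 1)

theta, n = 2.0, 5                      # illustrative values
empirical = variance_of_estimator(theta, n)
exact = theta**2 / (n * (n + 2))       # variance derived in Step 3
bound = theta**2 / n                   # reciprocal computed in Step 2
print(empirical, exact, bound)         # empirical ~ exact, and both lie below bound
```

The simulated variance agrees with \(\theta^2/[n(n+2)]\) and sits well below \(\theta^2/n\), illustrating the comment on the failed regularity conditions.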

Most popular questions from this chapter

A survey is taken of the citizens in a city as to whether or not they support the zoning plan that the city council is considering. The responses are: Yes, No, Indifferent, and Otherwise. Let \(p_{1}, p_{2}, p_{3}\), and \(p_{4}\) denote the respective true probabilities of these responses. The results of the survey are: $$ \begin{array}{|c|c|c|c|} \hline \text { Yes } & \text { No } & \text { Indifferent } & \text { Otherwise } \\ \hline 60 & 45 & 70 & 25 \\ \hline \end{array} $$ (a) Obtain the mles of \(p_{i}, i=1, \ldots, 4\). (b) Obtain \(95 \%\) confidence intervals, \((4.2 .7)\), for \(p_{i}, i=1, \ldots, 4\).

Rao (1973, page 368) considers a problem in the estimation of linkages in genetics. McLachlan and Krishnan (1997) also discuss this problem and we present their model. For our purposes, it can be described as a multinomial model with the four categories \(C_{1}, C_{2}, C_{3}\), and \(C_{4}\). For a sample of size \(n\), let \(\mathbf{X}=\left(X_{1}, X_{2}, X_{3}, X_{4}\right)^{\prime}\) denote the observed frequencies of the four categories. Hence, \(n=\sum_{i=1}^{4} X_{i}\). The probability model is $$ \begin{array}{|c|c|c|c|} \hline C_{1} & C_{2} & C_{3} & C_{4} \\ \hline \frac{1}{2}+\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4} \theta \\ \hline \end{array} $$ where the parameter \(\theta\) satisfies \(0 \leq \theta \leq 1\). In this exercise, we obtain the mle of \(\theta\). (a) Show that the likelihood function is given by $$ L(\theta \mid \mathbf{x})=\frac{n !}{x_{1} ! x_{2} ! x_{3} ! x_{4} !}\left[\frac{1}{2}+\frac{1}{4} \theta\right]^{x_{1}}\left[\frac{1}{4}-\frac{1}{4} \theta\right]^{x_{2}+x_{3}}\left[\frac{1}{4} \theta\right]^{x_{4}} $$ (b) Show that the log of the likelihood function can be expressed as a constant (not involving parameters) plus the term $$ x_{1} \log [2+\theta]+\left[x_{2}+x_{3}\right] \log [1-\theta]+x_{4} \log \theta $$ (c) Obtain the partial derivative with respect to \(\theta\) of the last expression, set the result to 0, and solve for the mle. (This will result in a quadratic equation that has one positive and one negative root.)

Suppose \(X_{1}, \ldots, X_{n}\) are iid with pdf \(f(x ; \theta)=2 x / \theta^{2}, \quad 0<x \leq \theta\), zero elsewhere.

Consider two Bernoulli distributions with unknown parameters \(p_{1}\) and \(p_{2}\). If \(Y\) and \(Z\) equal the numbers of successes in two independent random samples, each of size \(n\), from the respective distributions, determine the mles of \(p_{1}\) and \(p_{2}\) if we know that \(0 \leq p_{1} \leq p_{2} \leq 1\)

Let \(Y_{1}<Y_{2}<\cdots<Y_{n}\) be the order statistics of a random sample from a distribution with pdf \(f(x ; \theta)=1/\theta\), \(0<x<\theta\), zero elsewhere, where \(\theta>0\). (a) Show that \(\Lambda\) for testing \(H_{0}: \theta=\theta_{0}\) against \(H_{1}: \theta \neq \theta_{0}\) is \(\Lambda=\left(Y_{n} / \theta_{0}\right)^{n}\), \(Y_{n} \leq \theta_{0}\), and \(\Lambda=0\) if \(Y_{n}>\theta_{0}\) (b) When \(H_{0}\) is true, show that \(-2 \log \Lambda\) has an exact \(\chi^{2}(2)\) distribution, not \(\chi^{2}(1) .\) Note that the regularity conditions are not satisfied.
