
Rao (1973, page 368) considers a problem in the estimation of linkages in genetics. McLachlan and Krishnan (1997) also discuss this problem, and we present their model. For our purposes, it can be described as a multinomial model with the four categories \(C_{1}, C_{2}, C_{3}\), and \(C_{4}\). For a sample of size \(n\), let \(\mathbf{X}=\left(X_{1}, X_{2}, X_{3}, X_{4}\right)^{\prime}\) denote the observed frequencies of the four categories. Hence, \(n=\sum_{i=1}^{4} X_{i}\). The probability model is $$ \begin{array}{|c|c|c|c|} \hline C_{1} & C_{2} & C_{3} & C_{4} \\ \hline \frac{1}{2}+\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4} \theta \\ \hline \end{array} $$ where the parameter \(\theta\) satisfies \(0 \leq \theta \leq 1\). In this exercise, we obtain the mle of \(\theta\).

(a) Show that the likelihood function is given by $$ L(\theta \mid \mathbf{x})=\frac{n !}{x_{1} ! x_{2} ! x_{3} ! x_{4} !}\left[\frac{1}{2}+\frac{1}{4} \theta\right]^{x_{1}}\left[\frac{1}{4}-\frac{1}{4} \theta\right]^{x_{2}+x_{3}}\left[\frac{1}{4} \theta\right]^{x_{4}} $$

(b) Show that the log of the likelihood function can be expressed as a constant (not involving parameters) plus the term $$ x_{1} \log [2+\theta]+\left[x_{2}+x_{3}\right] \log [1-\theta]+x_{4} \log \theta $$

(c) Obtain the partial derivative with respect to \(\theta\) of the last expression, set the result to 0, and solve for the mle. (This will result in a quadratic equation that has one positive and one negative root.)

Short Answer
The exercise involves three steps: constructing the likelihood function, expressing its logarithm as a constant plus \(x_{1} \log [2+\theta]+\left[x_{2}+x_{3}\right] \log [1-\theta]+x_{4} \log \theta\), and maximizing. Differentiating this log-likelihood with respect to \(\theta\) and equating to 0 gives a quadratic equation in \(\theta\); the mle \(\hat{\theta}\) is its unique positive root.

Step by step solution

Step 1: Compute the Likelihood Function

Under the multinomial model, the likelihood is the probability of observing the frequency vector \(\mathbf{x}=(x_1, x_2, x_3, x_4)'\) given the category probabilities \(p_i\) from the table. Since \(C_2\) and \(C_3\) share the same probability \(\frac{1}{4}-\frac{1}{4}\theta\), their exponents combine, giving: \[ L(\theta \mid \mathbf{x})=\frac{n !}{x_{1} ! x_{2} ! x_{3} ! x_{4} !}\left[\frac{1}{2}+\frac{1}{4} \theta\right]^{x_{1}}\left[\frac{1}{4}-\frac{1}{4} \theta\right]^{x_{2}+x_{3}}\left[\frac{1}{4} \theta\right]^{x_{4}} \]
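As a quick numerical check, here is a minimal Python sketch of this likelihood. The counts \((125, 18, 20, 34)\) are hypothetical, used only for illustration; any nonnegative frequencies summing to \(n\) work.

```python
# A minimal sketch of the likelihood in part (a). The counts below are
# hypothetical; any nonnegative frequencies x = (x1, x2, x3, x4) work.
from math import factorial

def likelihood(theta, x):
    """Multinomial likelihood L(theta | x) for the linkage model."""
    p = [1/2 + theta/4, 1/4 - theta/4, 1/4 - theta/4, theta/4]
    n = sum(x)
    coef = factorial(n)
    for xi in x:
        coef //= factorial(xi)  # multinomial coefficient n!/(x1! x2! x3! x4!)
    out = float(coef)
    for pi, xi in zip(p, x):
        out *= pi ** xi
    return out

print(likelihood(0.5, (125, 18, 20, 34)))  # illustrative counts
```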
Step 2: Express the Natural Logarithm of the Likelihood

The log-likelihood is used because it is easier to differentiate. Write each probability over a common denominator: \(\frac{1}{2}+\frac{1}{4}\theta = \frac{2+\theta}{4}\), \(\frac{1}{4}-\frac{1}{4}\theta = \frac{1-\theta}{4}\), and \(\frac{1}{4}\theta = \frac{\theta}{4}\). Taking logs, the multinomial coefficient and the factor \((1/4)^{n}\) collect into a constant not involving \(\theta\), leaving \[ x_{1} \log [2+\theta]+\left[x_{2}+x_{3}\right] \log [1-\theta]+x_{4} \log \theta \]
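The sketch below, using the same hypothetical counts as above, confirms that the full log-likelihood and this kernel differ only by a constant in \(\theta\), namely \(\log \frac{n!}{x_1! x_2! x_3! x_4!} - n \log 4\).

```python
# Sketch verifying part (b): full log-likelihood minus the kernel is
# constant in theta. Counts are hypothetical, as in the previous sketch.
from math import log, lgamma

def full_log_lik(theta, x):
    n = sum(x)
    const = lgamma(n + 1) - sum(lgamma(xi + 1) for xi in x)  # log of n!/(x1!...x4!)
    p = [1/2 + theta/4, 1/4 - theta/4, 1/4 - theta/4, theta/4]
    return const + sum(xi * log(pi) for xi, pi in zip(x, p))

def kernel(theta, x):
    x1, x2, x3, x4 = x
    return x1 * log(2 + theta) + (x2 + x3) * log(1 - theta) + x4 * log(theta)

x = (125, 18, 20, 34)
# Both lines print the same value: the constant log n!/(x1!...x4!) - n*log(4).
print(full_log_lik(0.3, x) - kernel(0.3, x))
print(full_log_lik(0.6, x) - kernel(0.6, x))
```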
Step 3: Compute the Maximum Likelihood Estimate of \(\theta\)

The maximum likelihood estimate is obtained by differentiating the log-likelihood with respect to \(\theta\) and setting the result equal to 0: \[ \frac{x_{1}}{2+\theta}-\frac{x_{2}+x_{3}}{1-\theta}+\frac{x_{4}}{\theta}=0 . \] Multiplying through by \(\theta(1-\theta)(2+\theta)\) and collecting terms yields the quadratic equation \[ n \theta^{2}-\left[x_{1}-2\left(x_{2}+x_{3}\right)-x_{4}\right] \theta-2 x_{4}=0 . \] The product of the roots is \(-2 x_{4}/n \leq 0\), so (for \(x_{4}>0\)) the equation has one positive and one negative root; the mle is the positive root, \[ \hat{\theta}=\frac{\left[x_{1}-2\left(x_{2}+x_{3}\right)-x_{4}\right]+\sqrt{\left[x_{1}-2\left(x_{2}+x_{3}\right)-x_{4}\right]^{2}+8 n x_{4}}}{2 n} . \]
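A short sketch of this closed form, again with the hypothetical counts used above; for \((125, 18, 20, 34)\) it returns \(\hat{\theta} \approx 0.6268\).

```python
# Sketch of part (c): the mle is the positive root of
#   n*theta^2 - [x1 - 2*(x2 + x3) - x4]*theta - 2*x4 = 0.
from math import sqrt

def theta_mle(x):
    x1, x2, x3, x4 = x
    n = x1 + x2 + x3 + x4
    b = x1 - 2 * (x2 + x3) - x4
    # Quadratic formula; the "+" branch is the positive root, since the
    # product of the roots, -2*x4/n, is nonpositive.
    return (b + sqrt(b * b + 8 * n * x4)) / (2 * n)

print(theta_mle((125, 18, 20, 34)))  # ~0.6268 for these hypothetical counts
```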


