Chapter 9: Problem 4
Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.
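Sketch of the argument: write \(T = Z/\sqrt{V/r}\) with \(Z \sim N(\delta, 1)\) and \(V \sim \chi^{2}(r)\) independent. Then \(T^{2} = Z^{2}/(V/r)\), where \(Z^{2}\) is noncentral chi-square with \(1\) degree of freedom and noncentrality \(\delta^{2}\), so \(T^{2}\) is noncentral \(F(1, r)\) with noncentrality \(\delta^{2}\). The claim can be checked numerically through the cdfs (the parameter values below are illustrative):

```python
import math
from scipy import stats

# If T is noncentral t with r degrees of freedom and noncentrality delta, then
# P(T^2 <= x) = P(-sqrt(x) <= T <= sqrt(x)) should equal the cdf at x of a
# noncentral F(1, r) variable with noncentrality delta^2.
r, delta = 10, 1.5          # illustrative values
T = stats.nct(df=r, nc=delta)
F = stats.ncf(dfn=1, dfd=r, nc=delta**2)

for x in (0.5, 2.0, 7.0):
    lhs = T.cdf(math.sqrt(x)) - T.cdf(-math.sqrt(x))
    rhs = F.cdf(x)
    print(f"x={x}: P(T^2<=x)={lhs:.6f}  ncF cdf={rhs:.6f}")
```

Note that the two-sided probability \(P(-\sqrt{x} \le T \le \sqrt{x})\) is essential here: \(T\) itself can be negative, so only the cdf of \(T^{2}\), not of \(T\), matches the noncentral \(F\).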
Let \(Q=X_{1} X_{2}-X_{3} X_{4}\), where \(X_{1}, X_{2}, X_{3}, X_{4}\) is a random sample of size 4 from a distribution that is \(N\left(0, \sigma^{2}\right)\). Show that \(Q / \sigma^{2}\) does not have a chi-square distribution. Find the mgf of \(Q / \sigma^{2}\).
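For intuition: for iid standard normals \(U, V\) one can show \(E\left[e^{tUV}\right] = (1-t^{2})^{-1/2}\) for \(|t|<1\), and multiplying the two independent factors gives \(M_{Q/\sigma^{2}}(t) = (1-t^{2})^{-1}\), which is not of the chi-square form \((1-2t)^{-r/2}\). A Monte Carlo sanity check of this mgf (illustrative seed, sample size, and value of \(t\)):

```python
import numpy as np

# Monte Carlo check that the mgf of Q/sigma^2 = (X1*X2 - X3*X4)/sigma^2
# is (1 - t^2)^(-1) for |t| < 1 -- not of the chi-square form (1 - 2t)^(-r/2).
rng = np.random.default_rng(0)
n, sigma = 1_000_000, 2.0
X = rng.normal(0.0, sigma, size=(4, n))
Q = X[0] * X[1] - X[2] * X[3]

t = 0.3                                   # any |t| < 1/2 keeps the MC variance finite
empirical = np.exp(t * Q / sigma**2).mean()
exact = 1.0 / (1.0 - t**2)
print(empirical, exact)                   # should agree to 2-3 decimals
```

Keeping \(|t| < 1/2\) matters for the simulation: for larger \(t\) the mgf still exists, but \(e^{tQ/\sigma^{2}}\) has infinite variance and the sample mean converges very slowly.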
Let the independent random variables \(Y_{1}, Y_{2}, \ldots, Y_{n}\) have, respectively, the probability density functions \(N\left(\beta x_{i}, \gamma^{2} x_{i}^{2}\right), i=1,2, \ldots, n\), where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal and no one is zero. Find the maximum likelihood estimators of \(\beta\) and \(\gamma^{2}\).
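Since the ratios \(Y_{i}/x_{i}\) are iid \(N(\beta, \gamma^{2})\), one can show the MLEs reduce to the sample mean and the (biased) sample variance of those ratios: \(\hat{\beta} = n^{-1}\sum Y_{i}/x_{i}\) and \(\hat{\gamma}^{2} = n^{-1}\sum (Y_{i}/x_{i} - \hat{\beta})^{2}\). A numerical check with illustrative data, comparing the closed form against a direct maximization of the likelihood:

```python
import numpy as np
from scipy.optimize import minimize

# Check the closed-form MLEs for the model Y_i ~ N(beta * x_i, gamma^2 * x_i^2):
#   beta_hat   = (1/n) * sum(Y_i / x_i)
#   gamma2_hat = (1/n) * sum((Y_i / x_i - beta_hat)^2)
rng = np.random.default_rng(1)
x = np.array([1.0, 2.0, -1.5, 3.0, 0.5, 4.0])   # not all equal, none zero
beta_true, gamma_true = 2.0, 0.7                # illustrative parameters
y = beta_true * x + gamma_true * x * rng.standard_normal(x.size)

n = x.size
beta_hat = np.mean(y / x)
gamma2_hat = np.mean((y / x - beta_hat) ** 2)

def negloglik(theta):
    # constants (log|x_i|, log 2*pi) dropped; gamma^2 = exp(log_g2) stays positive
    b, log_g2 = theta
    g2 = np.exp(log_g2)
    return 0.5 * n * log_g2 + np.sum((y - b * x) ** 2 / (2.0 * g2 * x**2))

res = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
print(beta_hat, gamma2_hat)            # closed form
print(res.x[0], np.exp(res.x[1]))      # numerical maximizer, should agree
```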
Using the background of the two-way classification with one observation per cell, show that the maximum likelihood estimator of \(\alpha_{i}, \beta_{j}\), and \(\mu\) are \(\hat{\alpha}_{i}=\bar{X}_{i .}-\bar{X}_{. .}\) \(\hat{\beta}_{j}=\bar{X}_{. j}-\bar{X}_{. .}\), and \(\hat{\mu}=\bar{X}_{. .}\), respectively. Show that these are unbiased estimators of their respective parameters and compute \(\operatorname{var}\left(\hat{\alpha}_{i}\right), \operatorname{var}\left(\hat{\beta}_{j}\right)\), and \(\operatorname{var}(\hat{\mu})\).
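The estimators are linear in the \(X_{ij}\), so the cell means drop out of the variances, and one can show \(\operatorname{var}(\hat{\mu}) = \sigma^{2}/(ab)\), \(\operatorname{var}(\hat{\alpha}_{i}) = (a-1)\sigma^{2}/(ab)\), and \(\operatorname{var}(\hat{\beta}_{j}) = (b-1)\sigma^{2}/(ab)\). A Monte Carlo sketch of these formulas (illustrative dimensions and seed):

```python
import numpy as np

# Monte Carlo check of the variance formulas for the two-way classification
# with one observation per cell, X_ij ~ N(mu + alpha_i + beta_j, sigma^2):
#   var(mu_hat)      = sigma^2 / (a*b)
#   var(alpha_hat_i) = (a-1) * sigma^2 / (a*b)
#   var(beta_hat_j)  = (b-1) * sigma^2 / (a*b)
rng = np.random.default_rng(2)
a, b, sigma, reps = 4, 5, 1.0, 100_000

# The means play no role in the variances, so simulate pure noise.
X = rng.normal(0.0, sigma, size=(reps, a, b))
mu_hat = X.mean(axis=(1, 2))
alpha1_hat = X[:, 0, :].mean(axis=1) - mu_hat   # alpha_hat for row i=1
beta1_hat = X[:, :, 0].mean(axis=1) - mu_hat    # beta_hat for column j=1

print(mu_hat.var(), sigma**2 / (a * b))                # ~0.05
print(alpha1_hat.var(), (a - 1) * sigma**2 / (a * b))  # ~0.15
print(beta1_hat.var(), (b - 1) * sigma**2 / (a * b))   # ~0.20
```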
Let \(X_{1 j}, X_{2 j}, \ldots, X_{a_{j} j}\) represent independent random samples of sizes \(a_{j}\) from a normal distribution with means \(\mu_{j}\) and variances \(\sigma^{2}, j=1,2, \ldots, b\). Show that $$ \sum_{j=1}^{b} \sum_{i=1}^{a_{j}}\left(X_{i j}-\bar{X}_{. .}\right)^{2}=\sum_{j=1}^{b} \sum_{i=1}^{a_{j}}\left(X_{i j}-\bar{X}_{. j}\right)^{2}+\sum_{j=1}^{b} a_{j}\left(\bar{X}_{. j}-\bar{X}_{. .}\right)^{2} $$ or \(Q^{\prime}=Q_{3}^{\prime}+Q_{4}^{\prime}\). Here \(\bar{X}_{. .}=\sum_{j=1}^{b} \sum_{i=1}^{a_{j}} X_{i j} / \sum_{j=1}^{b} a_{j}\) and \(\bar{X}_{. j}=\sum_{i=1}^{a_{j}} X_{i j} / a_{j}\). If \(\mu_{1}=\mu_{2}=\cdots=\mu_{b}\), show that \(Q^{\prime} / \sigma^{2}\) and \(Q_{3}^{\prime} / \sigma^{2}\) have chi-square distributions. Prove that \(Q_{3}^{\prime}\) and \(Q_{4}^{\prime}\) are independent, and hence \(Q_{4}^{\prime} / \sigma^{2}\) also has a chi-square distribution. If the likelihood ratio \(\Lambda\) is used to test \(H_{0}: \mu_{1}=\mu_{2}=\cdots=\mu_{b}=\mu, \mu\) unspecified and \(\sigma^{2}\) unknown against all possible alternatives, show that \(\Lambda \leq \lambda_{0}\) is equivalent to the computed \(F \geq c\), where $$ F=\frac{\left(\sum_{j=1}^{b} a_{j}-b\right) Q_{4}^{\prime}}{(b-1) Q_{3}^{\prime}} $$ What is the distribution of \(F\) when \(H_{0}\) is true?
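A quick numerical check of the decomposition \(Q^{\prime} = Q_{3}^{\prime} + Q_{4}^{\prime}\) with unequal group sizes \(a_{j}\) (illustrative data; under \(H_{0}\) the statistic \(F\) above should have an \(F\)-distribution with \(b-1\) and \(\sum a_{j}-b\) degrees of freedom):

```python
import numpy as np

# Verify Q' = Q3' + Q4' numerically for unequal sample sizes a_j.
rng = np.random.default_rng(3)
sizes = [3, 5, 4, 7]                         # a_j, j = 1..b (illustrative)
groups = [rng.normal(0.0, 1.0, size=a) for a in sizes]

grand = np.concatenate(groups).mean()        # X-bar..
Q = sum(((g - grand) ** 2).sum() for g in groups)
Q3 = sum(((g - g.mean()) ** 2).sum() for g in groups)
Q4 = sum(a * (g.mean() - grand) ** 2 for a, g in zip(sizes, groups))
print(Q, Q3 + Q4)                            # equal up to rounding error
```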
Show that \(\sum_{j=1}^{b} \sum_{i=1}^{a}\left(X_{i j}-\bar{X}_{i .}\right)^{2}=\sum_{j=1}^{b} \sum_{i=1}^{a}\left(X_{i j}-\bar{X}_{i .}-\bar{X}_{. j}+\bar{X}_{. .}\right)^{2}+a \sum_{j=1}^{b}\left(\bar{X}_{. j}-\bar{X}_{. .}\right)^{2} .\)
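The identity follows by writing \(X_{ij}-\bar{X}_{i .} = (X_{ij}-\bar{X}_{i .}-\bar{X}_{. j}+\bar{X}_{. .}) + (\bar{X}_{. j}-\bar{X}_{. .})\) and checking that the cross term sums to zero over \(i\) for each fixed \(j\). A numerical spot-check on a random \(a \times b\) array (illustrative dimensions):

```python
import numpy as np

# Verify the two-way sum-of-squares identity numerically.
rng = np.random.default_rng(4)
a, b = 4, 6
X = rng.normal(size=(a, b))

row = X.mean(axis=1, keepdims=True)    # X-bar_i.
col = X.mean(axis=0, keepdims=True)    # X-bar_.j
grand = X.mean()                       # X-bar_..

lhs = ((X - row) ** 2).sum()
rhs = ((X - row - col + grand) ** 2).sum() + a * ((col - grand) ** 2).sum()
print(lhs, rhs)                        # equal up to rounding error
```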