
(Bonferroni Multiple Comparison Procedure). In the notation of this section, let \(\left(k_{i 1}, k_{i 2}, \ldots, k_{i b}\right), i=1,2, \ldots, m\), represent a finite number of \(b\)-tuples. The problem is to find simultaneous confidence intervals for \(\sum_{j=1}^{b} k_{i j} \mu_{j}, i=1,2, \ldots, m\), by a method different from that of Scheffé. Define the random variable \(T_{i}\) by $$ \left(\sum_{j=1}^{b} k_{i j} \bar{X}_{. j}-\sum_{j=1}^{b} k_{i j} \mu_{j}\right) / \sqrt{\left(\sum_{j=1}^{b} k_{i j}^{2}\right) V / a}, \quad i=1,2, \ldots, m. $$ (a) Let the event \(A_{i}^{c}\) be given by \(-c_{i} \leq T_{i} \leq c_{i}, i=1,2, \ldots, m\). Find the random variables \(U_{i}\) and \(W_{i}\) such that \(U_{i} \leq \sum_{j=1}^{b} k_{i j} \mu_{j} \leq W_{i}\) is equivalent to \(A_{i}^{c}\). (b) Select \(c_{i}\) such that \(P\left(A_{i}^{c}\right)=1-\alpha / m\); that is, \(P\left(A_{i}\right)=\alpha / m\). Use Exercise 9.4.1 to determine a lower bound on the probability that simultaneously the random intervals \(\left(U_{1}, W_{1}\right), \ldots,\left(U_{m}, W_{m}\right)\) include \(\sum_{j=1}^{b} k_{1 j} \mu_{j}, \ldots, \sum_{j=1}^{b} k_{m j} \mu_{j}\), respectively. (c) Let \(a=3, b=6\), and \(\alpha=0.05\). Consider the linear functions \(\mu_{1}-\mu_{2}, \mu_{2}-\mu_{3}\), \(\mu_{3}-\mu_{4}, \mu_{4}-\left(\mu_{5}+\mu_{6}\right) / 2\), and \(\left(\mu_{1}+\mu_{2}+\cdots+\mu_{6}\right) / 6\). Here \(m=5\). Show that the lengths of the confidence intervals given by the results of Part (b) are shorter than the corresponding ones given by the method of Scheffé as described in the text. If \(m\) becomes sufficiently large, however, this is not the case.

Short Answer

In this Bonferroni procedure, the random variables \(U_i\) and \(W_i\) are first obtained by inverting the inequality that defines the event \(A_i^c\). Choosing \(c_i\) so that \(P(A_i^c) = 1 - \alpha/m\), a lower bound on the probability that all \(m\) intervals \((U_i, W_i)\) simultaneously cover their targets follows from Exercise 9.4.1. Finally, the resulting interval lengths are compared with those given by the Scheffé method.

Step by step solution

Step 01: Finding the Random Variables \(U_i\) and \(W_i\)

To find \(U_i\) and \(W_i\), start from the event \(A_i^c\), which states \(-c_i \leq T_i \leq c_i\). Substituting the definition of \(T_i\) and solving the two inequalities for \(\sum_{j=1}^{b} k_{ij} \mu_j\) yields the equivalent condition \(U_i \leq \sum_{j=1}^{b} k_{ij} \mu_j \leq W_i\), which identifies \(U_i\) and \(W_i\) in terms of the given statistics.
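The inversion can be carried out explicitly. Writing \(d_i = \sqrt{\left(\sum_{j=1}^{b} k_{ij}^{2}\right) V / a}\) for the denominator of \(T_i\), the event \(-c_i \leq T_i \leq c_i\) becomes $$ -c_i d_i \leq \sum_{j=1}^{b} k_{ij} \bar{X}_{.j} - \sum_{j=1}^{b} k_{ij} \mu_j \leq c_i d_i, $$ and solving for \(\sum_{j=1}^{b} k_{ij} \mu_j\) gives $$ U_i = \sum_{j=1}^{b} k_{ij} \bar{X}_{.j} - c_i d_i, \qquad W_i = \sum_{j=1}^{b} k_{ij} \bar{X}_{.j} + c_i d_i. $$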
Step 02: Selecting \(c_i\) and Applying Exercise 9.4.1

In this part, \(c_i\) is selected such that \(P(A_i^c) = 1 - \alpha / m\), so that \(P(A_i) = \alpha / m\). Exercise 9.4.1 (Boole's inequality) then gives $$ P\left(A_1^c \cap A_2^c \cap \cdots \cap A_m^c\right) \geq 1 - \sum_{i=1}^{m} P(A_i) = 1 - m\left(\frac{\alpha}{m}\right) = 1 - \alpha, $$ so the random intervals \((U_1, W_1), \ldots, (U_m, W_m)\) simultaneously include \(\sum_{j=1}^{b} k_{1j} \mu_j, \ldots, \sum_{j=1}^{b} k_{mj} \mu_j\), respectively, with probability at least \(1 - \alpha\).
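A minimal numeric sketch of the bound in this step. The independence assumed below is purely illustrative (it lets us compute an exact joint probability to compare against the bound); Exercise 9.4.1 requires no independence.

```python
# Bonferroni lower bound on simultaneous coverage, with m = 5 and alpha = 0.05.
# Assumption (illustration only): the events A_i are independent.
m = 5
alpha = 0.05

p_each = 1 - alpha / m              # P(A_i^c) = 1 - alpha/m = 0.99
lower_bound = 1 - m * (alpha / m)   # 1 - sum P(A_i) = 1 - alpha
joint_if_independent = p_each ** m  # P(A_1^c and ... and A_m^c) under independence

# Prints values near 0.95 and 0.951: the joint coverage exceeds the bound.
print(lower_bound, joint_if_independent)
```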
Step 03: Comparing with Scheffé's Method

With \(a = 3\), \(b = 6\), \(\alpha = 0.05\), and \(m = 5\), compute the Bonferroni intervals of Part (b) for the five listed linear functions. Since both methods produce intervals of the form estimate \(\pm\) constant \(\times \sqrt{\left(\sum_j k_{ij}^2\right) V / a}\), it suffices to compare the two constants. For these values the Bonferroni constant is the smaller, so its intervals are shorter than the corresponding Scheffé intervals; for sufficiently large \(m\), however, the Bonferroni constant eventually exceeds Scheffé's.
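A rough Monte Carlo sketch of the comparison in Part (c), using only the standard library. Assumptions not stated in the original: the variance estimate \(V\) carries \(b(a-1) = 12\) degrees of freedom, the Bonferroni constant is the \(t\)-quantile with \(\alpha/(2m)\) in each tail, and the Scheffé constant is \(\sqrt{b\,F_{\alpha}(b, 12)}\) (dimension \(b\), since one of the five functions is not a contrast). Comparing the constants compares the interval lengths:

```python
import random

random.seed(1)

def chi2(df):
    """One chi-square draw with df degrees of freedom (sum of squared normals)."""
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))

N = 100_000
a, b, m, alpha = 3, 6, 5, 0.05
df_err = b * (a - 1)  # assumed error degrees of freedom: 12

# Monte Carlo samples of t(12) and F(6, 12).
t_samples = sorted(random.gauss(0.0, 1.0) / (chi2(df_err) / df_err) ** 0.5
                   for _ in range(N))
f_samples = sorted((chi2(b) / b) / (chi2(df_err) / df_err) for _ in range(N))

c_bonf = t_samples[int((1 - alpha / (2 * m)) * N)]        # approx. t quantile, 0.005 tail
c_scheffe = (b * f_samples[int((1 - alpha) * N)]) ** 0.5  # approx. sqrt(b * F_0.05(6, 12))

# Bonferroni's constant (near 3.05) is smaller than Scheffe's (near 4.2),
# so its five intervals are shorter for this m.
print(round(c_bonf, 2), round(c_scheffe, 2))
```

Exact quantiles (e.g. from tables or `scipy.stats`) would sharpen the comparison; the Monte Carlo version keeps the sketch dependency-free.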


Most popular questions from this chapter

If \(A_{1}, A_{2}, \ldots, A_{k}\) are events, prove, by induction, Boole's inequality $$ P\left(A_{1} \cup A_{2} \cup \cdots \cup A_{k}\right) \leq \sum_{1}^{k} P\left(A_{i}\right) $$ Then show that $$ P\left(A_{1}^{c} \cap A_{2}^{c} \cap \cdots \cap A_{k}^{c}\right) \geq 1-\sum_{1}^{k} P\left(A_{i}\right) $$

Let the independent normal random variables \(Y_{1}, Y_{2}, \ldots, Y_{n}\) have, respectively, the probability density functions \(N\left(\mu, \gamma^{2} x_{i}^{2}\right), i=1,2, \ldots, n\), where the given \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal and no one of which is zero. Discuss the test of the hypothesis \(H_{0}: \gamma=1, \mu\) unspecified, against all alternatives \(H_{1}: \gamma \neq 1, \mu\) unspecified.

With the background of the two-way classification with \(c>1\) observations per cell, show that the maximum likelihood estimators of the parameters are $$ \begin{aligned} \hat{\alpha}_{i} &=\bar{X}_{i . .}-\bar{X}_{\ldots} \\ \hat{\beta}_{j} &=\bar{X}_{. j .}-\bar{X}_{\ldots} \\ \hat{\gamma}_{i j} &=\bar{X}_{i j .}-\bar{X}_{i . .}-\bar{X}_{. j .}+\bar{X}_{\ldots} \\ \hat{\mu} &=\bar{X}_{\ldots} \end{aligned} $$ Show that these are unbiased estimators of the respective parameters. Compute the variance of each estimator.

Let \(X_{1 j}, X_{2 j}, \ldots, X_{a_{j} j}\) represent independent random samples of sizes \(a_{j}\) from a normal distribution with means \(\mu_{j}\) and variances \(\sigma^{2}, j=1,2, \ldots, b\). Show that $$ \sum_{j=1}^{b} \sum_{i=1}^{a_{j}}\left(X_{i j}-\bar{X}_{. .}\right)^{2}=\sum_{j=1}^{b} \sum_{i=1}^{a_{j}}\left(X_{i j}-\bar{X}_{. j}\right)^{2}+\sum_{j=1}^{b} a_{j}\left(\bar{X}_{. j}-\bar{X}_{. .}\right)^{2} $$ or \(Q^{\prime}=Q_{3}^{\prime}+Q_{4}^{\prime} .\) Here \(\bar{X}_{. .}=\sum_{j=1}^{b} \sum_{i=1}^{a_{j}} X_{i j} / \sum_{j=1}^{b} a_{j}\) and \(\bar{X}_{. j}=\sum_{i=1}^{a_{j}} X_{i j} / a_{j} .\) If \(\mu_{1}=\mu_{2}=\) \(\cdots=\mu_{b}\), show that \(Q^{\prime} / \sigma^{2}\) and \(Q_{3}^{\prime} / \sigma^{2}\) have chi-square distributions. Prove that \(Q_{3}^{\prime}\) and \(Q_{4}^{\prime}\) are independent, and hence \(Q_{4}^{\prime} / \sigma^{2}\) also has a chi-square distribution. If the likelihood ratio \(\Lambda\) is used to test \(H_{0}: \mu_{1}=\mu_{2}=\cdots=\mu_{b}=\mu, \mu\) unspecified and \(\sigma^{2}\) unknown against all possible alternatives, show that \(\Lambda \leq \lambda_{0}\) is equivalent to the computed \(F \geq c\), where $$ F=\frac{\left(\sum_{j=1}^{b} a_{j}-b\right) Q_{4}^{\prime}}{(b-1) Q_{3}^{\prime}} $$ What is the distribution of \(F\) when \(H_{0}\) is true?

Let \(Q=X_{1} X_{2}-X_{3} X_{4}\), where \(X_{1}, X_{2}, X_{3}, X_{4}\) is a random sample of size 4 from a distribution which is \(N\left(0, \sigma^{2}\right) .\) Show that \(Q / \sigma^{2}\) does not have a chi-square distribution. Find the mgf of \(Q / \sigma^{2}\).
