(Bonferroni Multiple Comparison Procedure). In the notation of this section, let \(\left(k_{i 1}, k_{i 2}, \ldots, k_{i b}\right), i=1,2, \ldots, m\), represent a finite number of \(b\)-tuples. The problem is to find simultaneous confidence intervals for \(\sum_{j=1}^{b} k_{i j} \mu_{j}, i=1,2, \ldots, m\), by a method different from that of Scheffé. Define the random variable \(T_{i}\) by $$ T_{i}=\left(\sum_{j=1}^{b} k_{i j} \bar{X}_{. j}-\sum_{j=1}^{b} k_{i j} \mu_{j}\right) \bigg/ \sqrt{\left(\sum_{j=1}^{b} k_{i j}^{2}\right) V / a}, \quad i=1,2, \ldots, m. $$ (a) Let the event \(A_{i}^{c}\) be given by \(-c_{i} \leq T_{i} \leq c_{i}, i=1,2, \ldots, m\). Find the random variables \(U_{i}\) and \(W_{i}\) such that \(U_{i} \leq \sum_{j=1}^{b} k_{i j} \mu_{j} \leq W_{i}\) is equivalent to \(A_{i}^{c}\). (b) Select \(c_{i}\) such that \(P\left(A_{i}^{c}\right)=1-\alpha / m\); that is, \(P\left(A_{i}\right)=\alpha / m\). Use Exercise 9.4.1 to determine a lower bound on the probability that the random intervals \(\left(U_{1}, W_{1}\right), \ldots,\left(U_{m}, W_{m}\right)\) simultaneously include \(\sum_{j=1}^{b} k_{1 j} \mu_{j}, \ldots, \sum_{j=1}^{b} k_{m j} \mu_{j}\), respectively. (c) Let \(a=3, b=6\), and \(\alpha=0.05\). Consider the linear functions \(\mu_{1}-\mu_{2}, \mu_{2}-\mu_{3}\), \(\mu_{3}-\mu_{4}, \mu_{4}-\left(\mu_{5}+\mu_{6}\right) / 2\), and \(\left(\mu_{1}+\mu_{2}+\cdots+\mu_{6}\right) / 6\). Here \(m=5\). Show that the lengths of the confidence intervals given by the results of part (b) are shorter than the corresponding ones given by the method of Scheffé as described in the text. If \(m\) becomes sufficiently large, however, this is not the case.

Short Answer

In this Bonferroni procedure, the random variables \(U_i\) and \(W_i\) are first derived from the condition defining the event \(A_i^c\). With \(c_i\) chosen so that \(P(A_i^c) = 1 - \alpha/m\), Exercise 9.4.1 gives \(1 - \alpha\) as a lower bound on the probability that all \(m\) intervals \((U_i, W_i)\) cover their linear functions simultaneously. Finally, comparing critical values with the Scheffé method shows that the Bonferroni intervals are shorter for \(m = 5\), though not for sufficiently large \(m\).

Step by step solution

01

Finding Random Variables \(U_i\) and \(W_i\)

To find \(U_i\) and \(W_i\), we use the given event \(A_i^c\), which states \(-c_i \leq T_i \leq c_i\). Substituting the definition of \(T_i\), multiplying through by \(\sqrt{\left(\sum_{j=1}^{b} k_{ij}^{2}\right) V / a}\), and isolating \(\sum_{j=1}^{b} k_{ij} \mu_j\) turns this into the equivalent condition \(U_i \leq \sum_{j=1}^{b} k_{ij} \mu_j \leq W_i\), with \(U_i\) and \(W_i\) expressed in terms of the observable quantities, as shown below.
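Carrying out that rearrangement explicitly (nothing beyond the algebra of the displayed inequality is assumed), the endpoints are
$$ U_{i}=\sum_{j=1}^{b} k_{i j} \bar{X}_{. j}-c_{i} \sqrt{\left(\sum_{j=1}^{b} k_{i j}^{2}\right) \frac{V}{a}}, \qquad W_{i}=\sum_{j=1}^{b} k_{i j} \bar{X}_{. j}+c_{i} \sqrt{\left(\sum_{j=1}^{b} k_{i j}^{2}\right) \frac{V}{a}}. $$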
02

Selecting \(c_i\) and Applying Exercise 9.4.1

In this part, \(c_i\) is selected so that \(P(A_i^c) = 1 - \alpha / m\), which is equivalent to \(P(A_i) = \alpha / m\); since every \(T_i\) has the same Student \(t\)-distribution, the same critical value \(c_i = c\) works for each \(i\). With these choices, Exercise 9.4.1 gives a lower bound on the probability that the random intervals \((U_1, W_1), \ldots, (U_m, W_m)\) simultaneously include \(\sum_{j=1}^{b} k_{1j} \mu_j, \ldots, \sum_{j=1}^{b} k_{mj} \mu_j\), respectively; see the derivation below.
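A minimal derivation of that bound, assuming (as its use here suggests) that Exercise 9.4.1 is the Boole/Bonferroni inequality \(P\left(\bigcup_{i=1}^{m} A_i\right) \leq \sum_{i=1}^{m} P(A_i)\):
$$ P\left(\bigcap_{i=1}^{m} A_{i}^{c}\right)=1-P\left(\bigcup_{i=1}^{m} A_{i}\right) \geq 1-\sum_{i=1}^{m} P\left(A_{i}\right)=1-m \cdot \frac{\alpha}{m}=1-\alpha. $$
That is, all \(m\) random intervals cover their respective linear functions simultaneously with probability at least \(1 - \alpha\).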
03

Comparing with Scheffé's Method

Here \(a = 3\), \(b = 6\), \(\alpha = 0.05\), and \(m = 5\), so \(V\) carries \(b(a-1) = 12\) degrees of freedom. The Bonferroni interval for each of the five linear functions has half-width \(c\sqrt{\left(\sum_{j} k_{ij}^2\right)V/a}\) with \(c = t_{\alpha/(2m)}(12) = t_{0.005}(12) \approx 3.05\), whereas the Scheffé intervals of the text use the multiplier \(\sqrt{b\,F_{\alpha}(b, b(a-1))} = \sqrt{6\,F_{0.05}(6,12)} \approx \sqrt{6(3.00)} \approx 4.24\) in its place. Since \(3.05 < 4.24\), every Bonferroni interval is shorter than the corresponding Scheffé interval. As \(m\) grows, however, \(t_{\alpha/(2m)}(12)\) increases without bound while the Scheffé multiplier does not depend on \(m\), so for sufficiently large \(m\) the Scheffé intervals become the shorter ones.
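As a quick numerical check (a sketch under the assumptions above: \(V\) has \(b(a-1) = 12\) degrees of freedom and the Scheffé constant is \(\sqrt{b\,F_{\alpha}(b, b(a-1))}\)), the two multipliers can be computed with SciPy:

```python
# Compare Bonferroni and Scheffé interval multipliers for
# a = 3, b = 6, m = 5, alpha = 0.05, so V has b(a-1) = 12 df.
from scipy.stats import t, f

a, b, m, alpha = 3, 6, 5, 0.05
df = b * (a - 1)  # degrees of freedom of V: 12

# Bonferroni: c = t_{alpha/(2m)}(12), the upper alpha/(2m) t critical value
bonferroni = t.ppf(1 - alpha / (2 * m), df)      # approx. 3.05

# Scheffé: sqrt(b * F_alpha(b, b(a-1)))
scheffe = (b * f.ppf(1 - alpha, b, df)) ** 0.5   # approx. sqrt(6 * 3.00) = 4.24

print(f"Bonferroni multiplier: {bonferroni:.3f}")
print(f"Scheffé multiplier:    {scheffe:.3f}")
# Each half-width is multiplier * sqrt((sum of k_ij^2) * V / a), so the
# Bonferroni intervals are shorter exactly when their multiplier is smaller.
```

Raising \(m\) while holding everything else fixed increases only the Bonferroni multiplier, which illustrates why the comparison eventually reverses.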

