
Show that the power function of the sign test is nonincreasing for the hypotheses $$ H_{0}: \theta=\theta_{0} \text { versus } H_{1}: \theta<\theta_{0} $$

Short Answer

Expert verified
The power function of the sign test is nonincreasing for these hypotheses because it equals the binomial cumulative distribution function, evaluated at the fixed cutoff \(\kappa\), with success probability \(p(\theta)=P_{\theta}(X>\theta_{0})\). Since \(p(\theta)\) is nondecreasing in \(\theta\) and the binomial CDF at a fixed \(\kappa\) is nonincreasing in the success probability, the power function is nonincreasing in \(\theta\); equivalently, moving \(\theta\) below \(\theta_{0}\) can only increase the power.

Step by step solution

01

Understanding the Problem

The problem requires an understanding of the sign test and its power function. The power function of a statistical test is the probability that the test rejects \(H_{0}\), viewed as a function of the true parameter value. Here the hypotheses are \(H_{0}: \theta=\theta_{0}\) versus \(H_{1}: \theta<\theta_{0}\). The task is to show that the power function is nonincreasing in \(\theta\): larger values of \(\theta\) never yield larger power, so the power is greatest at alternatives farthest below \(\theta_{0}\) and smallest, equal to the size of the test, at \(\theta_{0}\) itself. The sign test is a nonparametric test based only on the signs of the deviations \(X_{i}-\theta_{0}\) (or, for paired data, of the paired differences).
02

Setting up the Power Function of the Sign Test

To set up the power function of the sign test, consider the signs of the differences \(X_{i}-\theta_{0}\) among the \(n\) observations. The signs are independent Bernoulli trials; let \(p=p(\theta)=P_{\theta}(X_{i}>\theta_{0})\) denote the probability of a positive sign, and let \(S\) count the positive signs, so that \(S\) has a binomial\((n, p(\theta))\) distribution. Under \(H_{0}\), \(p(\theta_{0})=1/2\); under \(H_{1}: \theta<\theta_{0}\), positive signs become less likely, so the sign test rejects \(H_{0}\) when \(S \le \kappa\), where the cutoff \(\kappa\) is chosen to give the desired significance level. The power function is the probability of this rejection as a function of \(\theta\): $$ \gamma(\theta)=P_{\theta}(S \le \kappa)=\sum_{j=0}^{\kappa}\binom{n}{j} p(\theta)^{j}\left(1-p(\theta)\right)^{n-j}, $$ i.e., the binomial cumulative distribution function under \(H_{1}\), evaluated at \(\kappa\).
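
As a rough numerical sketch of this setup (the sample size \(n=25\) and cutoff \(\kappa=7\) are illustrative choices, not values from the text; the rule assumed is "reject when the count of positive signs \(S\) is at most \(\kappa\)"), the power function is just the binomial CDF evaluated at \(\kappa\):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(S <= k) for S ~ binomial(n, p)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

# Illustrative values: n observations, reject H0 when S <= kappa,
# where S counts positive signs and p = P(X_i > theta_0).
n, kappa = 25, 7
size = binom_cdf(kappa, n, 0.5)    # significance level: under H0, p = 1/2
power = binom_cdf(kappa, n, 0.3)   # power when theta < theta_0 pushes p down to 0.3
```

At \(p=1/2\) this gives the size of the test (about 0.02 here); pushing \(p\) below \(1/2\), as the alternative does, raises the rejection probability.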
03

Proving Nonincreasing Nature

To prove that the power function is nonincreasing in \(\theta\), combine two monotonicity facts. First, the success probability \(p(\theta)=P_{\theta}(X>\theta_{0})\) is nondecreasing in \(\theta\): shifting the location of the distribution upward can only make a positive sign more likely. Second, for fixed \(n\) and \(\kappa\), the binomial CDF is nonincreasing in the success probability; differentiating term by term and telescoping gives $$ \frac{d}{dp}\sum_{j=0}^{\kappa}\binom{n}{j}p^{j}(1-p)^{n-j} = -n\binom{n-1}{\kappa}p^{\kappa}(1-p)^{n-1-\kappa} \le 0. $$ It is important to note that \(\kappa\) remains fixed throughout. Since \(\gamma(\theta)\) is a nonincreasing function of \(p\) composed with a nondecreasing function \(p(\theta)\), the power function \(\gamma(\theta)\) is nonincreasing in \(\theta\). In particular, \(\gamma(\theta_{0})=\alpha\) is the size of the test, and the power can only grow as \(\theta\) moves below \(\theta_{0}\).
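
This monotonicity can be checked numerically. The sketch below assumes, purely for illustration, standard normal errors (so \(p(\theta)=1-\Phi(\theta_{0}-\theta)\)) and the same hypothetical \(n=25\), \(\kappa=7\):

```python
from math import comb, erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def binom_cdf(k, n, p):
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

def power(theta, theta0=0.0, n=25, kappa=7):
    # p(theta) = P(X > theta_0) under N(theta, 1) errors -- an illustrative model
    p = 1 - norm_cdf(theta0 - theta)
    return binom_cdf(kappa, n, p)

thetas = [-1.0, -0.5, 0.0, 0.5, 1.0]
powers = [power(t) for t in thetas]
# powers decreases along thetas: the farther theta sits below theta_0,
# the larger the probability of rejecting H_0
```

The computed powers fall as \(\theta\) rises, matching the claim; at \(\theta=\theta_{0}\) the power equals the size of the test.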


Most popular questions from this chapter

Let the scores \(a(i)\) be generated by \(a_{\varphi}(i)=\varphi[i /(n+1)]\), for \(i=1, \ldots, n\), where \(\int_{0}^{1} \varphi(u)\, d u=0\) and \(\int_{0}^{1} \varphi^{2}(u)\, d u=1\). Using Riemann sums, with subintervals of equal length, of the integrals \(\int_{0}^{1} \varphi(u)\, d u\) and \(\int_{0}^{1} \varphi^{2}(u)\, d u\), show that \(\sum_{i=1}^{n} a(i) \approx 0\) and \(\sum_{i=1}^{n} a^{2}(i) \approx n\).
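
A quick numerical check of these approximations, using the Wilcoxon score \(\varphi(u)=\sqrt{12}\,(u-1/2)\), which satisfies both integral constraints (this particular \(\varphi\) is a choice made here for illustration, not one specified by the exercise):

```python
from math import sqrt

def scores(n):
    # a(i) = phi(i/(n+1)) with phi(u) = sqrt(12) * (u - 1/2)
    return [sqrt(12) * (i / (n + 1) - 0.5) for i in range(1, n + 1)]

n = 200
a = scores(n)
sum_a = sum(a)                   # exactly 0 by symmetry of phi about u = 1/2
sum_a2 = sum(x * x for x in a)   # equals n(n-1)/(n+1) for this score, close to n
```

For this score the second sum can be evaluated in closed form as \(n(n-1)/(n+1)\), which differs from \(n\) by less than 1% at \(n=200\).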

Obtain the sensitivity curves for the sample mean, the sample median, and the Hodges-Lehmann estimator for the following data set. Evaluate the curves at the values \(-300\) to \(300\) in increments of 10 and graph the curves on the same plot. Compare the sensitivity curves. $$ \begin{array}{rrrrrrrr} -9 & 58 & 12 & -1 & -37 & 0 & 11 & 21 \\ 18 & -24 & -4 & -53 & -9 & 9 & 8 & \end{array} $$ Note that the R command wilcox.test(x, conf.int = TRUE)\$est computes the Hodges-Lehmann estimate for the R vector x.
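
The R call above can be mirrored in plain Python. The sketch below uses the author's own helper names and one common definition of the sensitivity curve, \(SC_{n}(x)=(n+1)\left[T(x_{1},\ldots,x_{n},x)-T(x_{1},\ldots,x_{n})\right]\); the Hodges-Lehmann estimate is computed as the median of the Walsh averages:

```python
from statistics import mean, median

data = [-9, 58, 12, -1, -37, 0, 11, 21, 18, -24, -4, -53, -9, 9, 8]

def hodges_lehmann(xs):
    # Median of the Walsh averages (x_i + x_j)/2, i <= j.
    walsh = [(xs[i] + xs[j]) / 2 for i in range(len(xs)) for j in range(i, len(xs))]
    return median(walsh)

def sensitivity_curve(estimator, xs, x):
    # SC_n(x) = (n+1) * (T(x_1,...,x_n,x) - T(x_1,...,x_n))
    n = len(xs)
    return (n + 1) * (estimator(xs + [x]) - estimator(xs))

grid = range(-300, 301, 10)
sc_mean = [sensitivity_curve(mean, data, x) for x in grid]
sc_median = [sensitivity_curve(median, data, x) for x in grid]
sc_hl = [sensitivity_curve(hodges_lehmann, data, x) for x in grid]
# The mean's curve is unbounded (linear in x); the median's and
# Hodges-Lehmann's curves flatten out, reflecting their robustness.
```

Plotting the three lists against the grid reproduces the requested comparison: the mean's curve grows without bound while the other two are bounded.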

Let \(X\) be a continuous random variable with cdf \(F(x)\). Suppose \(Y=X+\Delta\), where \(\Delta>0\). Show that \(Y\) is stochastically larger than \(X\).
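
The key step is one line: \(P(Y \le y)=P(X+\Delta \le y)=F(y-\Delta) \le F(y)\), since \(F\) is nondecreasing and \(\Delta>0\). A numeric spot check, where the standard normal \(X\) and \(\Delta=0.7\) are illustrative assumptions only:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

delta = 0.7  # any positive shift works; 0.7 is arbitrary
ts = [t / 10 for t in range(-30, 31)]
# P(Y <= t) = F(t - delta) never exceeds P(X <= t) = F(t)
dominated = all(norm_cdf(t - delta) <= norm_cdf(t) for t in ts)
```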

Consider the location Model (10.3.35). Assume that the pdf of the random errors, \(f(x)\), is symmetric about \(0\). Let \(\widehat{\theta}\) be a location estimator of \(\theta\). Assume that \(E(\widehat{\theta}^{4})\) exists. (a) Show that \(\widehat{\theta}\) is an unbiased estimator of \(\theta\). Hint: Assume without loss of generality that \(\theta=0\); start with \(E(\widehat{\theta})=E[\widehat{\theta}(X_{1}, \ldots, X_{n})]\); and use the fact that \(X_{i}\) is symmetrically distributed about \(0\). (b) As in Section 10.3.4, suppose we generate \(n_{s}\) independent samples of size \(n\) from the pdf \(f(x)\), which is symmetric about \(0\). For the \(i\)th sample, let \(\widehat{\theta}_{i}\) be the estimate of \(\theta\). Show that \(n_{s}^{-1} \sum_{i=1}^{n_{s}} \widehat{\theta}_{i}^{2} \rightarrow V(\widehat{\theta})\), in probability.
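
Both parts can be eyeballed by simulation. In the sketch below the sample median stands in for \(\widehat{\theta}\) and standard normal errors for \(f\) (both choices made here for illustration only), with \(\theta=0\):

```python
import random
from statistics import median

random.seed(1)  # reproducible illustration
n, n_s = 15, 2000
# n_s independent samples of size n from a pdf symmetric about 0
estimates = [median(random.gauss(0, 1) for _ in range(n)) for _ in range(n_s)]
avg_est = sum(estimates) / n_s                 # near 0: part (a), unbiasedness
var_est = sum(e * e for e in estimates) / n_s  # near V(theta_hat): part (b)
```

With \(\theta=0\), the raw second moment \(n_{s}^{-1}\sum \widehat{\theta}_{i}^{2}\) estimates the variance directly, which is the point of part (b).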

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample that follows the location model (10.2.1). In this exercise we want to compare the sign test and the \(t\)-test of the hypotheses (10.2.2); so we assume the random errors \(\varepsilon_{i}\) are symmetrically distributed about \(0\). Let \(\sigma^{2}=\operatorname{Var}(\varepsilon_{i})\). Hence the mean and the median are the same for this location model. Assume, also, that \(\theta_{0}=0\). Consider the large sample version of the \(t\)-test, which rejects \(H_{0}\) in favor of \(H_{1}\) if \(\bar{X} /(\sigma / \sqrt{n})>z_{\alpha}\). (a) Obtain the power function, \(\gamma_{t}(\theta)\), of the large sample version of the \(t\)-test. (b) Show that \(\gamma_{t}(\theta)\) is nondecreasing in \(\theta\). (c) Show that \(\gamma_{t}(\theta_{n}) \rightarrow 1-\Phi(z_{\alpha}-\theta^{*}/\sigma)\), under the sequence of local alternatives (10.2.13). (d) Based on part (c), obtain the sample size determination for the \(t\)-test to detect \(\theta^{*}\) with approximate power \(\gamma^{*}\). (e) Derive \(\operatorname{ARE}(S, t)\) given in (10.2.27).
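
For parts (a) and (b), the large-sample power function works out to \(\gamma_{t}(\theta)=1-\Phi(z_{\alpha}-\sqrt{n}\,\theta/\sigma)\), which is nondecreasing because \(\Phi\) is. A numeric sketch, where \(\alpha=0.05\), \(n=30\), and \(\sigma=1\) are illustrative values chosen here:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

z_alpha = 1.6449  # approximate upper 0.05 standard normal quantile

def gamma_t(theta, sigma=1.0, n=30):
    # gamma_t(theta) = P(Xbar/(sigma/sqrt(n)) > z_alpha)
    #               = 1 - Phi(z_alpha - sqrt(n) * theta / sigma)
    return 1 - norm_cdf(z_alpha - sqrt(n) * theta / sigma)

vals = [gamma_t(t / 10) for t in range(0, 11)]  # theta = 0.0, 0.1, ..., 1.0
# vals is nondecreasing, and vals[0] equals the size alpha
```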
