Chapter 11: Problem 5
Let \(Y_{n}\) be the \(n\)th order statistic of a random sample of size \(n\) from
a distribution with pdf \(f(x \mid \theta)=1 / \theta, 0<x<\theta\), zero elsewhere.
Let \(Y_{4}\) be the largest order statistic of a sample of size \(n=4\) from a
distribution with uniform pdf \(f(x ; \theta)=1 / \theta, 0<x<\theta\), zero elsewhere.
Let \(Y\) have a binomial distribution in which \(n=20\) and \(p=\theta\). The prior probabilities on \(\theta\) are \(P(\theta=0.3)=2 / 3\) and \(P(\theta=0.5)=1 / 3\). If \(y=9\), what are the posterior probabilities for \(\theta=0.3\) and \(\theta=0.5\) ?
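The posterior here follows directly from Bayes' theorem over the two-point prior; the binomial coefficient cancels in the normalized ratio, but the sketch below (function name and structure are my own, not from the text) keeps it for clarity:

```python
import math

def posterior_two_point(y, n, priors):
    """Posterior over a discrete prior on theta for Y ~ binomial(n, theta).

    `priors` maps each candidate theta to its prior probability.
    """
    # Likelihood of the observed y under each candidate theta.
    lik = {th: math.comb(n, y) * th**y * (1 - th)**(n - y) for th in priors}
    # Bayes' theorem: posterior is proportional to prior * likelihood.
    joint = {th: priors[th] * lik[th] for th in priors}
    total = sum(joint.values())
    return {th: joint[th] / total for th in joint}

post = posterior_two_point(y=9, n=20, priors={0.3: 2 / 3, 0.5: 1 / 3})
```

By my calculation the posterior probabilities are roughly \(0.449\) for \(\theta=0.3\) and \(0.551\) for \(\theta=0.5\): observing \(y=9\) out of \(20\) favors \(\theta=0.5\) strongly enough to overcome its smaller prior weight.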
Let \(f(x \mid \theta), \theta \in \Omega\), be a pdf with Fisher information, \((6.2 .4), I(\theta)\). Consider the Bayes model $$ \begin{aligned} X \mid \theta & \sim f(x \mid \theta), \quad \theta \in \Omega \\ \Theta & \sim h(\theta) \propto \sqrt{I(\theta)} \end{aligned} $$ (a) Suppose we are interested in a parameter \(\tau=u(\theta)\). Use the chain rule to prove that $$ \sqrt{I(\tau)}=\sqrt{I(\theta)}\left|\frac{\partial \theta}{\partial \tau}\right| . $$ (b) Show that for the Bayes model (11.2.2), the prior pdf for \(\tau\) is proportional to \(\sqrt{I(\tau)}\) The class of priors given by expression (11.2.2) is often called the class of Jeffreys' priors; see Jeffreys (1961). This exercise shows that Jeffreys' priors exhibit an invariance in that the prior of a parameter \(\tau\), which is a function of \(\theta\), is also proportional to the square root of the information for \(\tau\).
Example 11.4.1 dealt with a hierarchical Bayes model for a conjugate family of normal distributions. Express that model as $$ \begin{aligned} &\bar{X} \mid \Theta \sim N\left(\theta, \frac{\sigma^{2}}{n}\right), \sigma^{2} \text { is known } \\ &\Theta \mid \tau^{2} \quad \sim N\left(0, \tau^{2}\right) \end{aligned} $$ Obtain the empirical Bayes estimator of \(\theta\).
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with mean \(\theta, 0<\theta<\infty\). Let \(Y=\sum_{1}^{n} X_{i} .\) Use the loss function \(\mathcal{L}[\theta, \delta(y)]=\) \([\theta-\delta(y)]^{2}\). Let \(\theta\) be an observed value of the random variable \(\Theta\). If \(\Theta\) has the prior \(\operatorname{pdf} h(\theta)=\theta^{\alpha-1} e^{-\theta / \beta} / \Gamma(\alpha) \beta^{\alpha}\), for \(0<\theta<\infty\), zero elsewhere, where \(\alpha>0, \beta>0\) are known numbers, find the Bayes solution \(\delta(y)\) for a point estimate for \(\theta\).
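Since the gamma prior is conjugate to the Poisson likelihood, the posterior of \(\Theta\) given \(Y=y\) is gamma with shape \(\alpha+y\) and rate \(n+1/\beta\), and under squared-error loss the Bayes solution is the posterior mean \(\delta(y)=(\alpha+y)/(n+1/\beta)\). A small sketch (names and the numerical check are my own, not from the text):

```python
import math

def bayes_estimate(y, n, alpha, beta):
    # Posterior is Gamma(shape=alpha + y, rate=n + 1/beta); squared-error
    # loss makes the Bayes solution the posterior mean.
    return (alpha + y) / (n + 1.0 / beta)

def numeric_posterior_mean(y, n, alpha, beta, grid=100_000, upper=50.0):
    """Riemann-sum check of the posterior mean against the closed form."""
    num = den = 0.0
    dt = upper / grid
    for i in range(1, grid + 1):
        t = i * dt
        # Unnormalized posterior density: theta^(alpha+y-1) exp(-theta(n+1/beta)).
        w = t ** (alpha + y - 1) * math.exp(-t * (n + 1.0 / beta))
        num += t * w
        den += w
    return num / den

est = bayes_estimate(y=10, n=5, alpha=2, beta=1)  # (2 + 10) / (5 + 1) = 2.0
num = numeric_posterior_mean(y=10, n=5, alpha=2, beta=1)
```

The numerical integration agrees with the closed form, which is a useful sanity check on the conjugate update.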