Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with mean \(\theta\), \(0<\theta<\infty\). Let \(Y=\sum_{i=1}^{n} X_{i}\). Use the loss function \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). Let \(\theta\) be an observed value of the random variable \(\Theta\). If \(\Theta\) has the prior pdf \(h(\theta)=\theta^{\alpha-1} e^{-\theta / \beta} /\left[\Gamma(\alpha) \beta^{\alpha}\right]\), for \(0<\theta<\infty\), zero elsewhere, where \(\alpha>0\), \(\beta>0\) are known numbers, find the Bayes solution \(\delta(y)\) for a point estimate of \(\theta\).

Short Answer

The Bayes solution for a point estimate of \(\theta\) is \(\delta(y) = \dfrac{y + \alpha}{n + 1/\beta}\).

Step by step solution

01

Identify the Prior and Likelihood

We observe that the prior \(h(\theta)\) is a gamma pdf with shape parameter \(\alpha\) and scale parameter \(\beta\). The likelihood of the sample is \[L(\theta; x_1,\ldots,x_n)=\prod_{i=1}^{n}\frac{\theta^{x_i}e^{-\theta}}{x_i!}=\frac{e^{-n\theta}\,\theta^{\sum_{i=1}^{n} x_i}}{\prod_{i=1}^{n} x_i!},\] the joint pmf of the Poisson sample. Note that \(\sum_{i=1}^{n} x_i = y\), the observed value of \(Y\).
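As a quick numerical sanity check (not part of the textbook solution), the sketch below, assuming NumPy/SciPy and a made-up sample, confirms that the joint Poisson pmf is proportional to the kernel \(e^{-n\theta}\theta^{y}\), with the constant \(1/\prod_i x_i!\) not involving \(\theta\):

```python
import numpy as np
from scipy.stats import poisson

x = np.array([2, 0, 3, 1, 4])       # hypothetical observed sample
n, y = len(x), x.sum()              # here n = 5 and y = sum x_i = 10
thetas = np.linspace(0.1, 5.0, 50)  # grid of candidate theta values

# Exact joint pmf of the sample at each theta
full = np.array([poisson.pmf(x, t).prod() for t in thetas])
# Kernel in theta from the solution: e^{-n*theta} * theta^y
kernel = np.exp(-n * thetas) * thetas**y

# Their ratio is the constant 1 / prod(x_i!), independent of theta
print(np.allclose(full / kernel, full[0] / kernel[0]))  # True
```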
02

Compute the Posterior Distribution

Multiplying the prior by the likelihood and dropping multiplicative constants not involving \(\theta\), we obtain the posterior distribution: \[\pi(\theta \mid y) \propto L(\theta; y)\,h(\theta) \propto \theta^{y} e^{-n\theta}\,\theta^{\alpha-1}e^{-\theta/\beta}=\theta^{y+\alpha-1}e^{-\theta(n+1/\beta)}.\] This is the kernel of a gamma distribution with shape parameter \(y+\alpha\) and rate parameter \(n+1/\beta\) (equivalently, scale parameter \(\beta/(n\beta+1)\)).
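The conjugacy claim can be double-checked numerically. The sketch below (assuming SciPy and arbitrary hypothetical values of \(\alpha\), \(\beta\), \(n\), and \(y\)) normalizes the kernel above and compares it to SciPy's gamma density; note that `scipy.stats.gamma` is parameterized by shape and *scale*, so scale = 1/rate:

```python
import numpy as np
from scipy.stats import gamma
from scipy.integrate import quad

alpha, beta, n, y = 2.0, 3.0, 10, 17   # hypothetical values
rate = n + 1.0 / beta                  # posterior rate parameter

# Unnormalized posterior kernel from the solution
kernel = lambda t: t**(y + alpha - 1) * np.exp(-t * rate)
const, _ = quad(kernel, 0, np.inf)     # normalizing constant

t0 = 1.7                               # arbitrary test point
print(np.isclose(kernel(t0) / const,
                 gamma.pdf(t0, a=y + alpha, scale=1.0 / rate)))  # True
```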
03

Calculate the Bayes Estimator

The Bayes estimator minimizes the posterior expected loss. Under squared-error loss, the Bayes estimator of \(\theta\) is the posterior mean. Since the mean of a gamma distribution with shape \(k\) and rate \(r\) is \(k/r\), we get \[\delta(y) = E[\Theta \mid Y=y]=\frac{y+\alpha}{n+1/\beta}=\frac{\beta(y+\alpha)}{n\beta+1}.\]
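As a final check (again a sketch with hypothetical parameter values, not part of the textbook solution), the closed-form estimate can be compared against the Monte Carlo mean of posterior draws:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, n, y = 2.0, 3.0, 10, 17   # hypothetical values
rate = n + 1.0 / beta

delta = (y + alpha) / rate             # closed-form Bayes estimate
# Draws from the posterior Gamma(shape = y + alpha, rate); NumPy uses scale = 1/rate
draws = rng.gamma(shape=y + alpha, scale=1.0 / rate, size=1_000_000)
print(delta, draws.mean())             # the two agree to ~3 decimal places
```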
