Problem 1
Suppose that \(Y_{1}, \ldots, Y_{n}\) are independent Poisson variables with means \(\psi \pi_{j}\), where \(0<\pi_{j}<1\) and \(\sum \pi_{j}=1\). Find a marginal likelihood for \(\psi\) based on \(Y_{1}, \ldots, Y_{n}\), and show that no information about \(\psi\) is lost by using the marginal likelihood rather than the full likelihood.
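A minimal numerical sketch of the key fact behind this problem: with the \(\pi_j\) known, the full log-likelihood and the marginal log-likelihood based on \(S=\sum Y_j \sim \operatorname{Poisson}(\psi)\) differ only by a constant in \(\psi\). The particular values of \(\pi\), \(\psi\), and the seed below are illustrative assumptions, not part of the problem.

```python
import numpy as np
from scipy.stats import poisson

# Check that the full and marginal log-likelihoods differ by a constant in psi,
# so the marginal likelihood based on S = sum(Y_j) loses no information about psi.
rng = np.random.default_rng(0)
pi = np.array([0.2, 0.3, 0.5])          # known cell probabilities, summing to 1
psi_true = 4.0
y = rng.poisson(psi_true * pi)          # one observed vector
s = y.sum()

psi_grid = np.linspace(1.0, 10.0, 50)
loglik_full = np.array([poisson.logpmf(y, p * pi).sum() for p in psi_grid])
loglik_marg = poisson.logpmf(s, psi_grid)

diff = loglik_full - loglik_marg
print(np.allclose(diff, diff[0]))       # True: difference does not depend on psi
```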
Problem 2
Let \(Y\) and \(X\) be independent exponential variables with means \(1/(\lambda+\psi)\) and \(1/\lambda\). Find the distribution of \(Y\) given \(X+Y=s\), and show that when \(\psi=0\) it has mean \(s/2\) and variance \(s^{2}/12\). Construct an exact conditional test of the hypothesis \(\mathrm{E}(Y)=\mathrm{E}(X)\).
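A quick simulation sketch of the claimed conditional distribution: when \(\psi=0\), \(Y\mid X+Y=s\) is uniform on \((0,s)\), which is equivalent to \(Y/(X+Y)\) being Uniform\((0,1)\) with mean \(1/2\) and variance \(1/12\). The rate and sample size are assumed values chosen only for illustration.

```python
import numpy as np
from scipy import stats

# With psi = 0, X and Y are iid exponential, so U = Y/(X+Y) should be Uniform(0,1);
# equivalently Y | X+Y = s is uniform on (0, s) with mean s/2 and variance s^2/12.
rng = np.random.default_rng(1)
lam = 2.0
n = 200_000
x = rng.exponential(1 / lam, n)
y = rng.exponential(1 / lam, n)           # psi = 0: same rate as x
u = y / (x + y)

print(u.mean(), u.var())                  # close to 1/2 and 1/12
print(stats.kstest(u, "uniform").pvalue)  # no systematic evidence against uniformity
```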
Problem 3
Let \(Y_{1}, \ldots, Y_{n} \stackrel{\text{iid}}{\sim} N\left(\mu, c^{2} \mu^{2}\right)\), with \(c\) known. Show that \(\bar{Y}/S\) is ancillary for \(\mu\).
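A Monte Carlo sketch of the ancillarity claim, assuming \(S\) denotes the sample standard deviation: the distribution of \(\bar{Y}/S\) should not depend on \(\mu\). The values of \(c\), \(n\), and the two trial means are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Compare the distribution of Ybar/S at two different values of mu;
# under the model N(mu, (c*mu)^2) with c known it should be the same.
rng = np.random.default_rng(2)
c, n, reps = 0.3, 10, 50_000

def ybar_over_s(mu):
    y = rng.normal(mu, c * mu, size=(reps, n))
    return y.mean(axis=1) / y.std(axis=1, ddof=1)

# Two-sample KS test: no systematic evidence of a difference between the two mu values.
print(stats.ks_2samp(ybar_over_s(1.0), ybar_over_s(7.5)).pvalue)
```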
Problem 4
Let \(Y_{1}, \ldots, Y_{n} \stackrel{\mathrm{iid}}{\sim} N\left(\mu, \sigma^{2}\right)\) with \(\sigma^{2}\) known. Show that \(\left(Y_{1}-\bar{Y}, \ldots, Y_{n}-\bar{Y}\right)\) is distribution constant, and deduce that \(\bar{Y}\) and \(\sum\left(Y_{j}-\bar{Y}\right)^{2}\) are independent.
Problem 6
A Poisson variable \(Y\) has mean \(\mu\), which is itself a gamma random variable with mean \(\theta\) and shape parameter \(\nu\). Find the marginal density of \(Y\), and show that \(\operatorname{var}(Y)=\theta+\theta^{2}/\nu\) and that \(\nu\) and \(\theta\) are orthogonal. Hence show that \(\nu\) is orthogonal to \(\beta\) for any model in which \(\theta=\theta(x^{\mathrm{T}} \beta)\), where \(x\) is a covariate vector. Is the same true for the model in which \(\nu=\theta/\kappa\), so that \(\operatorname{var}(Y)=(1+\kappa)\theta\)? Discuss the implications for inference on \(\beta\) when the variance function is unknown.
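A short simulation sketch of the moment identities for the gamma-Poisson (negative binomial) mixture; the values of \(\theta\) and \(\nu\) are assumptions chosen for illustration.

```python
import numpy as np

# Simulate Y | mu ~ Poisson(mu) with mu ~ Gamma(mean theta, shape nu) and check
# E(Y) = theta and var(Y) = theta + theta^2 / nu.
rng = np.random.default_rng(4)
theta, nu, reps = 3.0, 2.5, 1_000_000

mu = rng.gamma(shape=nu, scale=theta / nu, size=reps)   # gamma with mean theta, shape nu
y = rng.poisson(mu)

print(y.mean(), theta)                     # close
print(y.var(), theta + theta**2 / nu)      # close
```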
Problem 7
Let \(X\) and \(Y\) be independent exponential variables with means \(\gamma^{-1}\) and \((\gamma \psi)^{-1}\). Show that the parameter \(\lambda(\gamma, \psi)\) orthogonal to \(\psi\) is the solution to the equation \(\partial \gamma/\partial \psi=-\gamma/(2 \psi)\), and verify that taking \(\lambda=\gamma \psi^{1/2}\) yields an orthogonal parametrization. Investigate how this solution changes when \(X\) and \(Y\) are subject to Type I censoring at \(c\).
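A Monte Carlo sketch of the orthogonality of \(\lambda=\gamma\psi^{1/2}\) and \(\psi\): writing \(\gamma=\lambda\psi^{-1/2}\), the log-likelihood for one pair is \(2\log\lambda-\lambda\psi^{-1/2}X-\lambda\psi^{1/2}Y\), so the mixed derivative \(\partial^{2}\ell/\partial\lambda\,\partial\psi = X/(2\psi^{3/2})-Y/(2\psi^{1/2})\) should have expectation zero. The particular \(\gamma\), \(\psi\), and sample size are assumptions for illustration.

```python
import numpy as np

# Estimate E[d^2 l / (d lambda d psi)] by simulation; orthogonality means it is zero.
rng = np.random.default_rng(5)
gamma_, psi = 1.7, 0.6
reps = 1_000_000

x = rng.exponential(1 / gamma_, reps)            # mean 1/gamma
y = rng.exponential(1 / (gamma_ * psi), reps)    # mean 1/(gamma*psi)

cross = x / (2 * psi**1.5) - y / (2 * np.sqrt(psi))
print(cross.mean())                              # close to 0
```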
Problem 11
Independent pairs of binary observations \(\left(R_{01}, R_{11}\right), \ldots,\left(R_{0 n}, R_{1 n}\right)\) have success probabilities \(\left(e^{\lambda_{j}}/(1+e^{\lambda_{j}}),\ e^{\psi+\lambda_{j}}/(1+e^{\psi+\lambda_{j}})\right)\), for \(j=1, \ldots, n\).
(a) Show that the maximum likelihood estimator of \(\psi\) based on the conditional likelihood is \(\widehat{\psi}_{\mathrm{c}}=\log\left(R^{01}/R^{10}\right)\), where \(R^{01}\) and \(R^{10}\) are respectively the numbers of \((0,1)\) and \((1,0)\) pairs. Does \(\widehat{\psi}_{\mathrm{c}}\) tend to \(\psi\) as \(n \rightarrow \infty\)?
(b) Write down the unconditional likelihood for \(\psi\) and \(\lambda\), and show that the likelihood equations are equivalent to
$$
\begin{aligned}
&r_{0 j}+r_{1 j}=\frac{e^{\widehat{\lambda}_{j}}}{1+e^{\widehat{\lambda}_{j}}}+\frac{e^{\widehat{\lambda}_{j}+\widehat{\psi}}}{1+e^{\widehat{\lambda}_{j}+\widehat{\psi}}}, \quad j=1, \ldots, n, \\
&\sum_{j=1}^{n} r_{1 j}=\sum_{j=1}^{n} \frac{e^{\widehat{\lambda}_{j}+\widehat{\psi}}}{1+e^{\widehat{\lambda}_{j}+\widehat{\psi}}}.
\end{aligned}
$$
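A simulation sketch bearing on the question in (a): with pair-specific nuisance parameters \(\lambda_j\), the conditional estimator \(\widehat{\psi}_{\mathrm{c}}=\log(R^{01}/R^{10})\) stays close to \(\psi\) for large \(n\). The distribution used for the \(\lambda_j\), the value of \(\psi\), and the sample size are illustrative assumptions.

```python
import numpy as np

# Simulate matched binary pairs and compute the conditional estimator
# psi_hat_c = log(R01 / R10), which should be close to psi for large n.
rng = np.random.default_rng(6)
n, psi = 200_000, 1.0
lam = rng.normal(0.0, 1.0, n)                     # arbitrary pair-specific nuisance parameters

p0 = 1 / (1 + np.exp(-lam))                       # success probability for R_{0j}
p1 = 1 / (1 + np.exp(-(lam + psi)))               # success probability for R_{1j}
r0 = rng.binomial(1, p0)
r1 = rng.binomial(1, p1)

r01 = np.sum((r0 == 0) & (r1 == 1))               # number of (0,1) pairs
r10 = np.sum((r0 == 1) & (r1 == 0))               # number of (1,0) pairs
print(np.log(r01 / r10))                          # close to psi
```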