Chapter 5: Problem 10
(Constant additive regression effects (Lin \& Ying, 1994).) Consider the multivariate counting process \(N=\left(N_{1}, \ldots, N_{n}\right)^{T}\) such that \(N_{i}(t)\) has intensity $$ \lambda_{i}(t)=Y_{i}(t)\left(\alpha_{0}(t)+\beta^{T} Z_{i}\right), $$ where the covariates \(Z_{i}, i=1, \ldots, n\) (\(p\)-vectors) are contained in the given filtration, \(Y_{i}(t)\) is the usual at-risk indicator, \(\alpha_{0}(t)\) is a locally integrable baseline intensity, and \(\beta\) denotes a \(p\)-vector of regression parameters. Let \(M_{i}(t)=N_{i}(t)-\Lambda_{i}(t), i=1, \ldots, n\), where \(\Lambda_{i}(t)=\int_{0}^{t} \lambda_{i}(s)\, d s\). Define also \(N_{\cdot}(t)=\sum_{i=1}^{n} N_{i}(t)\), \(Y_{\cdot}(t)=\sum_{i=1}^{n} Y_{i}(t)\) and \(A_{0}(t)=\int_{0}^{t} \alpha_{0}(s)\, d s\). The processes are observed on the interval \([0, \tau]\) with \(\tau<\infty\).

(a) Suppose first that \(\beta\) is known. Explain why a natural estimator of \(A_{0}(t)\) is $$ \hat{A}_{0}(t, \beta)=\int_{0}^{t} \frac{1}{Y_{\cdot}(s)} d N_{\cdot}(s)-\sum_{i=1}^{n} \int_{0}^{t} \frac{Y_{i}(s)}{Y_{\cdot}(s)} \beta^{T} Z_{i}(s)\, d s, $$ where \(Y_{i}(t) / Y_{\cdot}(t)\) is defined as 0 if \(Y_{\cdot}(t)=0\). Mimicking the Cox partial score function yields the following estimating function for \(\beta\), now again supposed unknown: $$ U(\beta)=\sum_{i=1}^{n} \int_{0}^{\tau} Z_{i}(t)\left(d N_{i}(t)-Y_{i}(t)\, d \hat{A}_{0}(t, \beta)-Y_{i}(t) \beta^{T} Z_{i}(t)\, d t\right). $$

(b) Show that the above estimating function can be written $$ U(\beta)=\sum_{i=1}^{n} \int_{0}^{\tau}\left(Z_{i}(t)-\bar{Z}(t)\right)\left(d N_{i}(t)-Y_{i}(t) \beta^{T} Z_{i}(t)\, d t\right), $$ where $$ \bar{Z}(t)=\sum_{i=1}^{n} Y_{i}(t) Z_{i}(t) \Big/ \sum_{i=1}^{n} Y_{i}(t). $$

(c) The value of \(\beta\) that satisfies \(U(\beta)=0\) is denoted \(\hat{\beta}\).
Show that $$ \hat{\beta}=\left(\sum_{i=1}^{n} \int_{0}^{\tau} Y_{i}(t)\left(Z_{i}(t)-\bar{Z}(t)\right)^{\otimes 2} d t\right)^{-1}\left(\sum_{i=1}^{n} \int_{0}^{\tau}\left(Z_{i}(t)-\bar{Z}(t)\right) d N_{i}(t)\right), $$ assuming that \(\sum_{i=1}^{n} \int_{0}^{\tau} Y_{i}(t)\left(Z_{i}(t)-\bar{Z}(t)\right)^{\otimes 2} d t\) is regular (invertible).

(d) Show that \(U(\beta)\) can be written as \(U(\beta)=\tilde{M}(\tau)\), where $$ \tilde{M}(t)=\sum_{i=1}^{n} \int_{0}^{t}\left(Z_{i}(s)-\bar{Z}(s)\right) d M_{i}(s), $$ that \(\tilde{M}(t)\) is a (local) square integrable martingale, and find its predictable variation process.

(e) Show, under suitable conditions, that \(n^{-\frac{1}{2}} U(\beta)\) converges in distribution to a \(p\)-dimensional normal distribution as \(n \rightarrow \infty\), and identify the mean and variance. Show that the variance is estimated consistently by $$ B=\frac{1}{n} \sum_{i=1}^{n} \int_{0}^{\tau}\left(Z_{i}(t)-\bar{Z}(t)\right)^{\otimes 2} d N_{i}(t). $$
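Although not part of the exercise, the algebraic identity in (b) can be sanity-checked numerically by discretizing the integrals on a grid. The sketch below assumes time-constant covariates, an equally spaced grid on \([0,\tau]\), and randomly generated at-risk and jump indicators; all names and data-generating choices are illustrative, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, K = 8, 2, 50                       # subjects, covariate dimension, grid points
dt = 0.02                                # grid spacing on [0, tau]
beta = np.array([0.5, -0.3])             # a fixed "known" beta, as in part (a)

Z = rng.normal(size=(n, p))                                  # time-constant covariates
Y = (rng.random((n, K)) < 0.9).astype(float)                 # at-risk indicators Y_i(t)
dN = ((rng.random((n, K)) < 0.05) & (Y > 0)).astype(float)   # increments dN_i(t)

Ydot = Y.sum(axis=0)                                         # Y.(t)
inv_Y = np.divide(1.0, Ydot, out=np.zeros(K), where=Ydot > 0)  # 1/Y.(t), 0 if Y.(t)=0
bZ = Z @ beta                                                # beta^T Z_i per subject

# part (a): increments of the baseline estimator A-hat_0(t, beta)
dA0 = dN.sum(axis=0) * inv_Y - (Y * bZ[:, None]).sum(axis=0) * inv_Y * dt

# U(beta) in the form of part (a)
resid_a = dN - Y * dA0[None, :] - Y * bZ[:, None] * dt
U_a = np.einsum('ia,ik->a', Z, resid_a)

# U(beta) in the centered form of part (b)
Zbar = (Y.T @ Z) * inv_Y[:, None]                            # Zbar(t), shape (K, p)
Zc = Z[:, None, :] - Zbar[None, :, :]                        # Z_i(t) - Zbar(t)
resid_b = dN - Y * bZ[:, None] * dt
U_b = np.einsum('ik,ika->a', resid_b, Zc)

print(np.allclose(U_a, U_b))                                 # the two forms agree
```

The agreement is exact up to floating point: per time point, \(\sum_i Y_i(t)(Z_i(t)-\bar{Z}(t))=0\), which is precisely the cancellation used in part (b).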
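The closed-form estimator from (c) and the variance estimator \(B\) from (e) can be checked on the same kind of discretized data. The sketch below is again illustrative (simulated indicators, time-constant covariates); by construction the estimating equation \(U(\hat{\beta})=0\) should hold exactly, which the final check confirms.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, K = 200, 2, 40                     # subjects, covariate dimension, grid points
dt = 0.025                               # grid spacing on [0, tau]

Z = rng.normal(size=(n, p))                                  # time-constant covariates
Y = (rng.random((n, K)) < 0.95).astype(float)                # at-risk indicators Y_i(t)
dN = ((rng.random((n, K)) < 0.04) & (Y > 0)).astype(float)   # increments dN_i(t)

Ydot = Y.sum(axis=0)
inv_Y = np.divide(1.0, Ydot, out=np.zeros(K), where=Ydot > 0)
Zbar = (Y.T @ Z) * inv_Y[:, None]                            # Zbar(t), shape (K, p)
Zc = Z[:, None, :] - Zbar[None, :, :]                        # Z_i(t) - Zbar(t), (n, K, p)

# A = sum_i int Y_i (Z_i - Zbar)^{otimes 2} dt   (p x p matrix)
# b = sum_i int (Z_i - Zbar) dN_i                (p vector)
A = np.einsum('ik,ika,ikb->ab', Y, Zc, Zc) * dt
b = np.einsum('ik,ika->a', dN, Zc)
beta_hat = np.linalg.solve(A, b)                             # closed form from (c)

# check the estimating equation: U(beta_hat) = 0
resid = dN - Y * (Z @ beta_hat)[:, None] * dt
U = np.einsum('ik,ika->a', resid, Zc)
print(np.allclose(U, 0.0))

# variance estimator B from (e): optional-variation analogue of <M~>(tau)/n
B = np.einsum('ik,ika,ikb->ab', dN, Zc, Zc) / n
```

Note that `np.linalg.solve` is used rather than an explicit inverse, which presupposes the regularity (invertibility) condition stated in (c).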