
(Continuation of Example 3.1.4) (a) Show that the independent censoring condition of \(\mathrm{ABGK}\) (3.5) reduces to (3.8) in the case of right-censored survival data as described in the example. (b) Assume that \(T^{*}\) and \(U\) are conditionally independent given an explanatory variable \(X\), and that the distribution of \(T^{*}\) and \(U\) depends on \(X\). Show that the right-censoring induced by \(U\) is independent. (c) Assume that \(T^{*}\) and \(U\) are conditionally independent given \(X\), but that we never observe \(X\). So \(N^{*}(t)\) has intensity \(\lambda^{*}(t)\) with respect to \(\mathcal{F}_{t}^{*}=\mathcal{F}_{t}^{N^{*}} .\) Is the filtering of \(N^{*}(t)\) generated by \(U\) independent?

Short Answer

Expert verified
Answer: No. When \(X\) is unobserved, knowing that a subject is still uncensored changes the conditional distribution of \(X\), and hence the hazard of \(T^{*}\). The filtering of \(N^{*}(t)\) generated by \(U\) is therefore in general not independent; additional information (such as observing \(X\)) is required to establish independence.

Step by step solution

01

Part (a)

First, recall the independent censoring condition of \(\mathrm{ABGK}\) (3.5): enlarging the filtration with the censoring information must not change the intensity of the uncensored counting process, $$E\left[d N^{*}(t) \mid \mathcal{G}_{t-}^{*}\right]=E\left[d N^{*}(t) \mid \mathcal{F}_{t-}^{*}\right],$$ where \(\mathcal{G}_{t}^{*}\) is the enlarged filtration (3.4) carrying the censoring process \(C(t)=I(t \leq U)\). For right-censored survival data as in Example 3.1.4 we have \(N^{*}(t)=I\left(T^{*} \leq t\right)\), the observed time \(T=T^{*} \wedge U\), and \(\Delta=I\left(T^{*} \leq U\right)\); with respect to \(\mathcal{F}_{t-}^{*}\) the intensity of \(N^{*}\) is \(I\left(T^{*} \geq t\right) \alpha(t)\), where \(\alpha\) denotes the hazard function of \(T^{*}\). For a subject still at risk, the extra information in \(\mathcal{G}_{t-}^{*}\) is the event \(\{U \geq t\}\), so condition (3.5) becomes $$P\left(t \leq T^{*}<t+dt \mid T^{*} \geq t,\, U \geq t\right)=P\left(t \leq T^{*}<t+dt \mid T^{*} \geq t\right)=\alpha(t)\,dt,$$ i.e., being uncensored at \(t-\) carries no information about the failure risk beyond \(T^{*} \geq t\). This is exactly condition (3.8) for right-censored survival data.
02

Part (b)

If \(T^{*}\) and \(U\) are conditionally independent given the explanatory variable \(X\), i.e. \(T^{*} \perp U \mid X\), their joint conditional density factorizes as $$f_{T^*, U \mid X}(t^*, u \mid x) = f_{T^* \mid X}(t^* \mid x) \cdot f_{U \mid X}(u \mid x).$$ Since \(X\) is observed, it belongs to the filtration at time 0, so conditioning on \(\mathcal{F}_{t-}\) includes conditioning on \(X\). The factorization gives, for every value \(x\), $$P\left(t \leq T^{*}<t+dt \mid T^{*} \geq t,\, U \geq t,\, X=x\right)=P\left(t \leq T^{*}<t+dt \mid T^{*} \geq t,\, X=x\right),$$ so within each stratum of \(X\) the event \(\{U \geq t\}\) carries no information about the failure risk. Hence the reduced condition from part (a) holds given \(X\), and the right-censoring induced by \(U\) is independent.
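As a quick numerical sanity check (not part of the textbook solution), the sketch below simulates a single stratum of the setup in (b): given a fixed value of \(X\), \(T^{*}\) and \(U\) are drawn as independent exponentials (the rates are illustrative assumptions). The empirical hazard of \(T^{*}\) at a time \(t\) is then the same whether or not we additionally condition on \(\{U \geq t\}\):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# One stratum X = x: given X, T* and U are independent.
# The exponential rates are illustrative choices, not taken from the text.
T_star = rng.exponential(1 / 3.0, n)  # T* | X = x ~ Exp(rate 3)
U = rng.exponential(1 / 2.0, n)       # U  | X = x ~ Exp(rate 2)

t, dt = 0.3, 0.05

# Empirical hazard of T* at t: P(t <= T* < t+dt | T* >= t) / dt
at_risk = T_star >= t
haz = np.mean(T_star[at_risk] < t + dt) / dt

# Same quantity, conditioning additionally on not being censored by t
at_risk_unc = at_risk & (U >= t)
haz_unc = np.mean(T_star[at_risk_unc] < t + dt) / dt

# Within a stratum of X the two hazards agree up to Monte Carlo noise,
# which is exactly the reduced independent censoring condition given X.
print(haz, haz_unc)
```

Both printed hazards should be close to the window-averaged hazard \((1-e^{-3\,dt})/dt \approx 2.79\); conditioning on \(\{U \geq t\}\) changes nothing because, within the stratum, \(U\) carries no information about \(T^{*}\).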
03

Part (c)

Now assume \(X\) is never observed, so the relevant filtration is \(\mathcal{F}_{t}^{*}=\mathcal{F}_{t}^{N^{*}}\), and the intensity \(\lambda^{*}(t)\) of \(N^{*}(t)\) is the hazard of \(T^{*}\) with \(X\) integrated out. Although \(T^{*}\) and \(U\) are conditionally independent given \(X\), they are in general dependent marginally: the event \(\{U \geq t\}\) changes the conditional distribution of the unobserved \(X\) (subjects with censoring-prone values of \(X\) are filtered out of the risk set), and since the hazard of \(T^{*}\) also depends on \(X\), this changes the failure risk. Thus, in general, $$P\left(t \leq T^{*}<t+dt \mid T^{*} \geq t,\, U \geq t\right) \neq P\left(t \leq T^{*}<t+dt \mid T^{*} \geq t\right),$$ with equality only in special cases, e.g. when the distribution of \(U\) given \(X\) does not actually depend on \(X\). We conclude that the filtering of \(N^{*}(t)\) generated by \(U\) cannot be guaranteed to be independent without additional information.
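A small companion simulation (again a sketch with illustrative rates, not the book's example) shows concretely what goes wrong in (c): when an unobserved binary \(X\) drives both \(T^{*}\) and \(U\), the hazard among subjects who are still uncensored at \(t\) is visibly smaller than the marginal hazard, so the censoring is not independent with respect to \(\mathcal{F}_{t}^{N^{*}}\):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Unobserved binary X; given X, T* and U are independent, but both
# rates depend on X (illustrative values, not from the text).
X = rng.binomial(1, 0.5, n)
rate = np.where(X == 1, 3.0, 0.5)
T_star = rng.exponential(1 / rate)  # hazard 3.0 or 0.5 given X
U = rng.exponential(1 / rate)       # same dependence on X for U

t, dt = 0.5, 0.05

# Marginal hazard of T* at t (X integrated out)
at_risk = T_star >= t
haz_marginal = np.mean(T_star[at_risk] < t + dt) / dt

# Hazard among subjects also uncensored at t: conditioning on U >= t
# shifts the posterior of the unobserved X toward the low-risk group.
at_risk_unc = at_risk & (U >= t)
haz_uncensored = np.mean(T_star[at_risk_unc] < t + dt) / dt

# The two hazards differ clearly, so the censoring is not independent
# with respect to F_t^{N*} when X is unobserved.
print(haz_marginal, haz_uncensored)
```

With these rates the marginal hazard at \(t=0.5\) is around 1.0 while the hazard among the uncensored is around 0.67: the uncensored risk set is enriched with low-risk subjects, which is precisely the failure of condition (3.8).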


Most popular questions from this chapter

(Missing covariates) Assume that \(X_{1}\) and \(X_{2}\) are two covariates that take values in \(\{0,1\}\) and have joint distribution given by \(P\left(X_{1}=0 \mid X_{2}=0\right)=2 / 3\), \(P\left(X_{1}=0 \mid X_{2}=1\right)=1 / 3\) and \(P\left(X_{2}=1\right)=1 / 2\). Let \(\lambda(t)\) be a locally integrable non-negative function, and assume that the survival time \(T\) given \(X_{1}\) and \(X_{2}\) has hazard function $$ \lambda(t) \exp \left(0.1 X_{1}+0.3 X_{2}\right). $$ (a) Assume that only \(X_{1}\) is observed. What is the hazard function of \(T\) given \(X_{1}\)? Similarly for \(X_{2}\). (b) Assume that \(\lambda(t)=\lambda\) and that i.i.d. survival data are obtained from the above generic model. Find the maximum likelihood estimator of \(\lambda\) and specify its asymptotic distribution. (c) Assume now that a right-censoring variable \(C\) is also present and that \(C\) given \(X_{1}\) has hazard function \(\lambda \exp \left(0.1 X_{1}\right)\). Assuming that only \(X_{1}\) is observed at time 0, specify how one should estimate the parameter of the survival model. (d) As in (c), but now assume that only \(X_{2}\) is observed.

Let \(\tilde{T}_{1}, \ldots, \tilde{T}_{n}\) be i.i.d. finite lifetimes with hazard function \(\alpha(t)\). Assume that \(\tilde{T}_{i}\) is right-censored at time \(U_{i}\), where $$ U_{1}=\infty, \quad U_{i}=U_{i-1} \wedge \tilde{T}_{i-1}, \quad i \geq 2. $$ We thus observe \(T_{i}=\tilde{T}_{i} \wedge U_{i}\) and \(\Delta_{i}=I\left(\tilde{T}_{i} \leq U_{i}\right)\), \(i=1, \ldots, n\). (a) Show that this censoring is independent. Let \(\tilde{T}_{(1)}=\tilde{T}_{1} \wedge \cdots \wedge \tilde{T}_{n}\). (b) Compute the Nelson-Aalen estimator \(\hat{A}(t)\) for estimation of \(A(t)=\int_{0}^{t} \alpha(s)\, d s\) on the set where \(\tilde{T}_{(1)}=\tilde{T}_{1}\). (c) Show that \(\tilde{T}_{n}\) is observed if and only if \(\tilde{T}_{n}=\tilde{T}_{(1)}\). (d) Can the situation arise where all of \(\tilde{T}_{1}, \ldots, \tilde{T}_{n}\) are observed? (e) Show that \(T_{1} \wedge \cdots \wedge T_{n}=\tilde{T}_{(1)}\) and that \(\hat{A}(t)\) always jumps at \(\tilde{T}_{(1)}\). (f) Compute the jump size of \(\hat{A}(t)\) at \(\tilde{T}_{(1)}\).

(Failure intensity depending on censoring value) Let \(T^{*}\) be a failure time and put \(N^{*}(t)=I\left(T^{*} \leq t\right)\). Suppose that the filtering of \(N^{*}(t)\) is induced by \(C(t)=I(t \leq U)\), where \(U\) is a positive stochastic variable with density \(f\). As usual we let \(T=T^{*} \wedge U\) denote the observed waiting time. Assume that $$ E\left(d N^{*}(t) \mid \mathcal{G}_{t-}^{*}\right)=I\left(t \leq T^{*}\right)\left(C(t) \alpha_{1}(t)\, d t+D(t) h(U) \alpha_{2}(t)\, d t\right), $$ where \(\mathcal{G}_{t}^{*}\) is defined by (3.4), \(\alpha_{1}(t)\) and \(\alpha_{2}(t)\) are two deterministic functions, \(D(t)=1-C(t)\), and \(h\) is some function. (a) Compute the intensity of \(N^{*}\) with respect to \(\mathcal{F}_{t}^{*}\). Is the censoring independent according to the ABGK definition? (b) Compute the intensity of \(N\) with respect to \(\mathcal{F}_{t}\). Is the censoring independent according to Definition 3.1.1? (c) Does the classification of this censoring scheme depend on which definition is used?

(Current status data with constant hazards) Let \(T^{*}\) denote a failure time with hazard function $$ \alpha(t)=\theta, $$ where \(\theta\) is an unknown parameter. Let \(C\) denote a random monitoring time independent of \(T^{*}\) and with hazard function \(\mu(t)\). The observed data consist of \(\left(C, \Delta=I\left(C \leq T^{*}\right)\right)\). Such data are called current status data, since at the monitoring time \(C\) it is only known whether or not the event of interest (with waiting time \(T^{*}\)) has occurred. (a) Derive the intensity functions of the counting processes $$ N_{1}(t)=\Delta I(C \leq t), \quad N_{2}(t)=(1-\Delta) I(C \leq t) $$ [hint: use the heuristic formula for the intensity given in Exercise 2.7]. Let \(\left(C_{i}, \Delta_{i}\right)\), \(i=1, \ldots, n\), be \(n\) independent replicates of \(\left(C, \Delta=I\left(C \leq T^{*}\right)\right)\). (b) Derive the likelihood function \(L_{t}\) for estimation of \(\theta\) when we observe over the interval \([0, t]\). Let \(U_{t}(\theta)\) denote the score function. Let further \(N_{j \cdot}(t)=\sum_{i} N_{j i}(t)\), where \(N_{j i}(t)\) is the \(i\)th realization of the above generic \(N_{j}(t)\), \(j=1,2\), corresponding to observing the \(i\)th subject. (c) Show that $$ U_{t}(\theta)=\int_{0}^{t} \frac{s e^{-\theta s}}{1-e^{-\theta s}}\, d N_{2 \cdot}(s)-\int_{0}^{t} s\, d N_{1 \cdot}(s) $$ and that this is a martingale (considered as a process in \(t\)). (d) Compute the predictable variation process \(\left\langle U_{t}(\theta)\right\rangle\). (e) Derive, under suitable conditions, the asymptotic distribution of the maximum likelihood estimator \(\hat{\theta}\) of \(\theta\), and give a consistent estimator of the asymptotic variance.

(Right-censoring: full likelihood function) Let \(\left(T_{i}, \Delta_{i}\right)\), \(i=1, \ldots, n\), be independent replicates of \((T, \Delta)\) described in Example 3.1.4, and assume the distribution of \(U\) is absolutely continuous with hazard function \(\mu(t)\). Define $$ N(t)=\sum_{i=1}^{n} I\left(T_{i} \leq t, \Delta_{i}=1\right) \quad \text{and} \quad Y(t)=\sum_{i=1}^{n} I\left(t \leq T_{i}\right). $$ (a) Show that the likelihood function based on observing \(\left(T_{i}, \Delta_{i}\right)\), \(i=1, \ldots, n\), can be written as $$ \prod_{i}\left\{\alpha^{\theta}\left(T_{i}\right)^{\Delta_{i}} e^{-\int_{0}^{T_{i}} \alpha^{\theta}(t)\, d t}\right\} \prod_{i}\left\{\mu\left(T_{i}\right)^{1-\Delta_{i}} e^{-\int_{0}^{T_{i}} \mu(t)\, d t}\right\}. $$ (b) Show that the expression in (a) is proportional to the partial likelihood (3.17) defined from \(N\). (c) Assume that \(\mu(t)=\beta \alpha^{\theta}(t)\) (the Koziol-Green model). Show that the censoring is now informative, but that the estimator \(\hat{\theta}\) obtained by maximizing the partial likelihood defined from \(N\) is still consistent. Derive its asymptotic distribution. (d) Show, under the assumption of (c), that \(\Delta\) is ancillary for \(\theta\).
