
(Right-censoring: full likelihood function) Let \(\left(T_{i}, \Delta_{i}\right)\), \(i=1, \ldots, n\), be independent replicates of \((T, \Delta)\) described in Example 3.1.4, and assume the distribution of \(U\) is absolutely continuous with hazard function \(\mu(t)\). Define $$ N(t)=\sum_{i=1}^{n} I\left(T_{i} \leq t, \Delta_{i}=1\right) \quad \text { and } \quad Y(t)=\sum_{i=1}^{n} I\left(t \leq T_{i}\right). $$ (a) Show that the likelihood function based on observing \(\left(T_{i}, \Delta_{i}\right)\), \(i=1, \ldots, n\), can be written as $$ \prod_{i}\left\{\alpha^{\theta}\left(T_{i}\right)^{\Delta_{i}} e^{-\int_{0}^{T_{i}} \alpha^{\theta}(t) d t}\right\} \prod_{i}\left\{\mu\left(T_{i}\right)^{1-\Delta_{i}} e^{-\int_{0}^{T_{i}} \mu(t) d t}\right\}. $$ (b) Show that the expression in (a) is proportional to the partial likelihood (3.17) defined from \(N\). (c) Assume that \(\mu(t)=\beta \alpha^{\theta}(t)\) (Koziol-Green model). Show that the censoring is now informative, but that the estimator \(\hat{\theta}\) obtained by maximizing the partial likelihood defined from \(N\) is still consistent. Derive its asymptotic distribution. (d) Show, under the assumption of (c), that \(\Delta\) is ancillary for \(\theta\).
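As a numerical sanity check on part (c) (my own illustration, not part of the textbook exercise or its solution): assume constant hazards \(\alpha^{\theta}(t)=\theta\), so the Koziol-Green censoring hazard is \(\mu(t)=\beta\theta\). The estimator maximizing the first likelihood factor is then \(\hat{\theta}=\sum_{i}\Delta_{i} / \sum_{i}T_{i}\), and the simulation sketch below illustrates that it remains consistent even though the censoring is informative:

```python
import random

# Sketch under assumed constant hazards: T*_i ~ Exp(theta) and
# U_i ~ Exp(beta * theta) (Koziol-Green censoring), independent.
# The partial likelihood for a constant hazard is maximized by
# theta_hat = (number of observed events) / (total time at risk).
random.seed(1)
theta, beta, n = 2.0, 0.5, 200_000

events, total_time = 0, 0.0
for _ in range(n):
    t_star = random.expovariate(theta)        # failure time T*_i
    u = random.expovariate(beta * theta)      # informative censoring time U_i
    t, delta = min(t_star, u), t_star <= u    # observed (T_i, Delta_i)
    events += delta
    total_time += t

theta_hat = events / total_time
print(round(theta_hat, 2))  # close to the true theta = 2.0
```

Here \(E[\Delta]=1/(1+\beta)\) and \(E[T]=1/((1+\beta)\theta)\), so the ratio converges to \(\theta\) by the law of large numbers, which is what the simulation shows.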

Short Answer

Show that the likelihood can be written in the specified form by following these steps: 1. Write down the joint density of \((T_i, \Delta_i)\), using the independence of \(T_i^{*}\) and \(U_i\): $$ f_{T_i,\Delta_i}(t, \delta) = \alpha^{\theta}(t)^{\delta}\, \mu(t)^{1-\delta}\, e^{-\int_{0}^{t} \alpha^{\theta}(s) ds}\, e^{-\int_{0}^{t} \mu(s) ds} $$ 2. Form the likelihood for independent observations: $$ L(\theta) = \prod_{i=1}^n f_{T_i,\Delta_i}(T_i, \Delta_i) $$ 3. Factor the likelihood: $$ L(\theta) = \prod_{i}\left\{\alpha^{\theta}\left(T_{i}\right)^{\Delta_{i}} e^{-\int_{0}^{T_{i}} \alpha^{\theta}(t) d t}\right\} \prod_{i}\left\{\mu\left(T_{i}\right)^{1-\Delta_{i}} e^{-\int_{0}^{T_{i}} \mu(t) d t}\right\} $$

Step by step solution

01

Derive the joint density of \((T_i, \Delta_i)\). Since \(T_i=T_i^{*} \wedge U_i\) and \(\Delta_i=I(T_i^{*} \leq U_i)\), where \(T_i^{*}\) and \(U_i\) are independent with hazard functions \(\alpha^{\theta}\) and \(\mu\), the joint density is

$$ f_{T_i,\Delta_i}(t, \delta) = \left\{\begin{array}{ll} \alpha^{\theta}(t)\, e^{-\int_{0}^{t} \alpha^{\theta}(s)ds}\, e^{-\int_{0}^{t} \mu(s)ds}, & \text{ if } \delta = 1 \\ \mu(t)\, e^{-\int_{0}^{t} \mu(s)ds}\, e^{-\int_{0}^{t} \alpha^{\theta}(s)ds}, & \text{ if } \delta = 0 \end{array}\right. $$

Since the observations are independent, the likelihood function is the product of the joint densities over all observations:

$$ L(\theta) = \prod_{i=1}^n f_{T_i,\Delta_i}(T_i, \Delta_i) $$
02

Substituting the joint density from Step 1 gives

$$ L(\theta) = \prod_{i=1}^n \alpha^{\theta}(T_i)^{\Delta_i}\, \mu(T_i)^{1-\Delta_i}\, e^{-\int_{0}^{T_i} \alpha^{\theta}(s) ds}\, e^{-\int_{0}^{T_i} \mu(s) ds}, $$

which factors into one term involving only \(\alpha^{\theta}\) and one involving only \(\mu\):

$$ L(\theta) = \prod_{i}\left\{\alpha^{\theta}\left(T_{i}\right)^{\Delta_{i}} e^{-\int_{0}^{T_{i}} \alpha^{\theta}(t) d t}\right\} \prod_{i}\left\{\mu\left(T_{i}\right)^{1-\Delta_{i}} e^{-\int_{0}^{T_{i}} \mu(t) d t}\right\} $$
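A short supplementary derivation (added for completeness, not part of the original solution) of where this factorization comes from: by independence of \(T^{*}\) and \(U\), the \(\delta=1\) branch of the joint density is

$$ P(T \in dt,\ \Delta=1) = P\left(T^{*} \in dt,\ U>t\right) = P\left(T^{*} \in dt\right) P(U>t) = \alpha^{\theta}(t)\, e^{-\int_{0}^{t} \alpha^{\theta}(s) ds}\, e^{-\int_{0}^{t} \mu(s) ds}\, dt, $$

and symmetrically for \(\delta=0\) with \(\mu(t)\) in place of \(\alpha^{\theta}(t)\). Multiplying over \(i\), the \(\alpha^{\theta}\) terms collect into the first product and the \(\mu\) terms into the second.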


Most popular questions from this chapter

(Failure intensity depending on censoring value) Let \(T^{*}\) be a failure time and put \(N^{*}(t)=I\left(T^{*} \leq t\right)\). Suppose that the filtering of \(N^{*}(t)\) is induced by \(C(t)=I(t \leq U)\), where \(U\) is a positive stochastic variable with density \(f\). As usual we let \(T=T^{*} \wedge U\) denote the observed waiting time. Assume that $$ E\left(d N^{*}(t) \mid \mathcal{G}_{t-}^{*}\right)=I\left(t \leq T^{*}\right)\left(C(t) \alpha_{1}(t) d t+D(t) h(U) \alpha_{2}(t) d t\right), $$ where \(\mathcal{G}_{t}^{*}\) is defined by (3.4), \(\alpha_{1}(t)\) and \(\alpha_{2}(t)\) are two deterministic functions, \(D(t)=1-C(t)\), and \(h\) is some function. (a) Compute the intensity of \(N^{*}\) with respect to \(\mathcal{F}_{t}^{*}\). Is the censoring independent according to the ABGK definition? (b) Compute the intensity of \(N\) with respect to \(\mathcal{F}_{t}\). Is the censoring independent according to Definition 3.1.1? (c) Does the classification of this censoring depend on which definition is used?

(Left-truncated survival time) Let the survival time \(T^{*}\) be left-truncated by the random variable \(V\) and consider the setup described in Example 3.1.2. (a) Show that this filtering is independent if the conditional density (assumed to exist) of \(\left(T^{*}, V\right)\) given \(T^{*}>V\) may be written as \(f\left(t^{*}\right) g(v)\) for \(t^{*}>v\). Assume from now on that \(T^{*}\) and \(V\) are independent or that the condition in (a) holds. (b) Let $$ \mathcal{F}_{t}=\sigma\left(I(V \leq s), I\left(V<T^{*} \leq s\right): V \leq s \leq V+t\right). $$ Show that \(N(t)\) has compensator \(\Lambda(t)\) with respect to \(\mathcal{F}_{t}\) when computed under \(P\) or under \(P_{\mathcal{O}}\).

Let \(\tilde{T}_{1}, \ldots, \tilde{T}_{n}\) be i.i.d. finite lifetimes with hazard function \(\alpha(t)\). Assume that \(\tilde{T}_{i}\) is right-censored at time \(U_{i}\), where $$ U_{1}=\infty, \quad U_{i}=U_{i-1} \wedge \tilde{T}_{i-1}, \quad i \geq 2 . $$ We thus observe \(T_{i}=\tilde{T}_{i} \wedge U_{i}\) and \(\Delta_{i}=I\left(\tilde{T}_{i} \leq U_{i}\right)\), \(i=1, \ldots, n\). (a) Show that this censoring is independent. Let \(\tilde{T}_{(1)}=\tilde{T}_{1} \wedge \cdots \wedge \tilde{T}_{n}\). (b) Compute the Nelson-Aalen estimator \(\hat{A}(t)\) for estimation of \(A(t)=\int_{0}^{t} \alpha(s) d s\) on the set where \(\tilde{T}_{(1)}=\tilde{T}_{1}\). (c) Show that \(\tilde{T}_{n}\) is observed if and only if \(\tilde{T}_{n}=\tilde{T}_{(1)}\). (d) Can the situation arise where all of \(\tilde{T}_{1}, \ldots, \tilde{T}_{n}\) are observed? (e) Show that \(T_{1} \wedge \cdots \wedge T_{n}=\tilde{T}_{(1)}\) and that \(\hat{A}(t)\) always jumps at \(\tilde{T}_{(1)}\). (f) Compute the jump size of \(\hat{A}(t)\) at \(\tilde{T}_{(1)}\).
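Parts (e) and (f) can be checked numerically; the sketch below (my own illustration, assuming standard-exponential lifetimes) shows that every \(T_i\) is at least \(\tilde{T}_{(1)}\), that exactly one event occurs there, and that the Nelson-Aalen increment at \(\tilde{T}_{(1)}\) is \(1/n\):

```python
import random

# Assumed setup: T~_i i.i.d. Exp(1), sequentially censored at
# U_1 = infinity, U_i = U_{i-1} ^ T~_{i-1} for i >= 2.
random.seed(0)
n = 10
tt = [random.expovariate(1.0) for _ in range(n)]    # latent lifetimes T~_i

u = float("inf")
obs = []                                            # observed (T_i, Delta_i)
for t in tt:
    obs.append((min(t, u), t <= u))
    u = min(u, t)                                   # update U_{i+1} = U_i ^ T~_i

t_min = min(tt)                                     # T~_(1)
y = sum(1 for t, _ in obs if t >= t_min)            # risk set Y(T~_(1))
events = sum(1 for t, d in obs if t == t_min and d) # events dN at T~_(1)
jump = events / y                                   # Nelson-Aalen increment
print(y, events, jump)                              # n, 1, 1/n
```

Every subject is still at risk at \(\tilde{T}_{(1)}\) (so \(Y(\tilde{T}_{(1)})=n\)), exactly one failure is observed there, and the jump is \(1/n\), matching (f).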

(Missing covariates) Assume that \(X_{1}\) and \(X_{2}\) are two covariates that take values in \(\{0,1\}\) and have joint distribution given by \(P\left(X_{1}=0 \mid X_{2}=0\right)=2 / 3\), \(P\left(X_{1}=0 \mid X_{2}=1\right)=1 / 3\) and \(P\left(X_{2}=1\right)=1 / 2\). Let \(\lambda(t)\) be a locally integrable non-negative function, and assume that the survival time \(T\) given \(X_{1}\) and \(X_{2}\) has hazard function $$ \lambda(t) \exp \left(0.1 X_{1}+0.3 X_{2}\right). $$ (a) Assume that only \(X_{1}\) is observed. What is the hazard function of \(T\) given \(X_{1}\)? Similarly for \(X_{2}\). (b) Assume that \(\lambda(t)=\lambda\) and that i.i.d. survival data are obtained from the above generic model. Find the maximum likelihood estimator of \(\lambda\) and specify its asymptotic distribution. (c) Assume now that a right-censoring variable \(C\) is also present and that \(C\) given \(X_{1}\) has hazard function \(\lambda \exp \left(0.1 X_{1}\right)\). Assuming that only \(X_{1}\) is observed at time 0, specify how one should estimate the parameters of the survival model. (d) As in (c), but now assume that only \(X_{2}\) is observed.

(Current status data with constant hazards) Let \(T^{*}\) denote a failure time with hazard function $$ \alpha(t)=\theta, $$ where \(\theta\) is an unknown parameter. Let \(C\) denote a random monitoring time independent of \(T^{*}\) and with hazard function \(\mu(t)\). The observed data consist of \(\left(C, \Delta=I\left(C \leq T^{*}\right)\right)\). Such data are called current status data since at the monitoring time \(C\) it is only known whether or not the event of interest (with waiting time \(T^{*}\)) has occurred. (a) Derive the intensity functions of the counting processes $$ N_{1}(t)=\Delta I(C \leq t), \quad N_{2}(t)=(1-\Delta) I(C \leq t) $$ [hint: use the heuristic formula for the intensity given in Exercise 2.7]. Let \(\left(C_{i}, \Delta_{i}\right)\), \(i=1, \ldots, n\), be \(n\) independent replicates of \(\left(C, \Delta=I\left(C \leq T^{*}\right)\right)\). (b) Derive the likelihood function \(L_{t}\) for estimation of \(\theta\) when we observe over the interval \([0, t]\). Let \(U_{t}(\theta)\) denote the score function. Let further \(N_{j \cdot}(t)=\sum_{i} N_{j i}(t)\), where \(N_{j i}(t)\) is the \(i\)th realization of the above generic \(N_{j}(t)\), \(j=1,2\), corresponding to observing the \(i\)th subject. (c) Show that $$ U_{t}(\theta)=\int_{0}^{t} \frac{s e^{-\theta s}}{1-e^{-\theta s}} d N_{2 \cdot}(s)-\int_{0}^{t} s \, d N_{1 \cdot}(s) $$ and that this is a martingale (considered as a process in \(t\)). (d) Compute the predictable variation process \(\left\langle U_{t}(\theta)\right\rangle\). (e) Derive, under suitable conditions, the asymptotic distribution of the maximum likelihood estimator \(\hat{\theta}\) of \(\theta\), and give a consistent estimator of the asymptotic variance.
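A numerical sketch of parts (b)-(c) (my own illustration; the exponential monitoring time and the parameter values are assumptions, not from the exercise): since \(\Delta=I(C\le T^{*})\), each subject contributes \(e^{-\theta C}\) if \(\Delta=1\) and \(1-e^{-\theta C}\) if \(\Delta=0\), and the resulting score is decreasing in \(\theta\), so \(\hat{\theta}\) can be found by bisection on \(U(\theta)=0\):

```python
import math
import random

# Assumed setup: T* ~ Exp(theta), monitoring time C ~ Exp(1), independent;
# observe (C, Delta = I(C <= T*)).  Score per subject:
#   Delta = 1:  -C
#   Delta = 0:   C * exp(-theta*C) / (1 - exp(-theta*C))
random.seed(42)
theta, n = 1.5, 20_000
data = []
for _ in range(n):
    t_star = random.expovariate(theta)
    c = random.expovariate(1.0)
    data.append((c, c <= t_star))

def score(th):
    """Score function U(th); decreasing in th."""
    u = 0.0
    for c, delta in data:
        if delta:
            u -= c                                   # event not yet occurred
        else:
            u += c * math.exp(-th * c) / (1.0 - math.exp(-th * c))
    return u

lo, hi = 1e-6, 50.0
for _ in range(100):                                 # bisection for U(theta) = 0
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if score(mid) > 0 else (lo, mid)
theta_hat = 0.5 * (lo + hi)
print(round(theta_hat, 1))  # close to the true theta = 1.5
```

Bisection is a simple, robust choice here because monotonicity of the score guarantees a unique root in the bracketing interval.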
