
Let \(M_{1}\) and \(M_{2}\) be the martingales associated with the components of the multivariate counting process \(N=\left(N_{1}, N_{2}\right)\) with continuous compensators. Show that $$ \left\langle M_{1}, M_{2}\right\rangle=\left[M_{1}, M_{2}\right]=0. $$

Short Answer

Expert verified
For two martingales \(M_1\) and \(M_2\) associated with the components of a multivariate counting process \(N = (N_1, N_2)\) with continuous compensators, both the optional covariation \([M_1, M_2]\) and the predictable covariation \(\langle M_1, M_2\rangle\) are identically 0. The argument rests on two facts: the components of a multivariate counting process cannot jump at the same time, and the continuous compensators contribute no jumps. Hence \([M_1, M_2]\), a sum of products of simultaneous jumps, vanishes, and its compensator \(\langle M_1, M_2\rangle\) vanishes as well.

Step by step solution

01

Recall the definition of a martingale and the associated continuous compensators

Recall that a martingale is a stochastic process whose conditional expectation at any future time, given the past, equals its current value. Let \(A_{1}(t)\) and \(A_{2}(t)\) be the continuous compensators of \(N_{1}\) and \(N_{2}\), respectively. We have $$ M_{1}(t) = N_{1}(t) - A_{1}(t)\quad \text{ and }\quad M_{2}(t) = N_{2}(t) - A_{2}(t). $$
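As a concrete special case (added for illustration; not part of the original exercise): if \(N_1\) is a homogeneous Poisson process with rate \(\lambda\), its compensator is the continuous deterministic function \(A_1(t) = \lambda t\), and the martingale property of \(M_1 = N_1 - A_1\) follows directly from the independent increments of \(N_1\): for \(s < t\), $$ \mathrm{E}\left[ M_1(t) \mid \mathcal{F}_s \right] = \mathrm{E}\left[ N_1(t) - N_1(s) \mid \mathcal{F}_s \right] + N_1(s) - \lambda t = \lambda (t-s) + N_1(s) - \lambda t = M_1(s). $$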
02

Analyze \(dM_{1}\) and \(dM_{2}\)

Consider the infinitesimal increment of \(M_1\) and \(M_2\): $$ dM_{1}(t) = dN_{1}(t) - dA_{1}(t)\quad \text{ and }\quad dM_{2}(t) = dN_{2}(t) - dA_{2}(t). $$
03

Determine the cross-covariation/cross-bracket process

Since \(M_1\) and \(M_2\) are each the difference of a counting process and a continuous finite-variation process, they are themselves of finite variation, and for finite-variation processes the optional covariation (quadratic covariation) reduces to a sum of products of simultaneous jumps: $$ [M_{1}, M_{2}](t) = \sum_{s \leq t} \Delta M_{1}(s)\, \Delta M_{2}(s). $$ Because the compensators \(A_1\) and \(A_2\) are continuous, \(\Delta A_{1}(s) = \Delta A_{2}(s) = 0\), so \(\Delta M_{i}(s) = \Delta N_{i}(s)\) and $$ [M_{1}, M_{2}](t) = \sum_{s \leq t} \Delta N_{1}(s)\, \Delta N_{2}(s). $$
04

Show that \(\langle M_{1}, M_{2}\rangle = [M_{1}, M_{2}] = 0\)

Because the compensators \(A_{1}\) and \(A_{2}\) are continuous, they have no jumps, so the only possible simultaneous jumps of \(M_1\) and \(M_2\) come from \(N_1\) and \(N_2\). But by definition of a multivariate counting process, no two components jump at the same time: at each jump time \(s\), exactly one of \(\Delta N_{1}(s)\) and \(\Delta N_{2}(s)\) equals 1 and the other equals 0. Hence \(\Delta N_{1}(s)\Delta N_{2}(s) = 0\) for every \(s\), every term in the sum vanishes, and $$ [M_{1}, M_{2}](t) = \sum_{s \leq t} \Delta M_{1}(s)\, \Delta M_{2}(s) = 0. $$ Finally, the predictable covariation \(\langle M_{1}, M_{2}\rangle\) is the compensator of \([M_{1}, M_{2}]\): the unique predictable process making \([M_{1}, M_{2}] - \langle M_{1}, M_{2}\rangle\) a martingale. Since \([M_{1}, M_{2}] \equiv 0\) and the zero process is predictable, uniqueness forces $$ \langle M_{1}, M_{2}\rangle(t) = 0, $$ which completes the proof.
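The argument can be illustrated numerically (an informal sketch, not part of the original solution): two independent Poisson processes almost surely have no common jump times, so the sum of products of simultaneous jumps, which is \([M_1, M_2]\), comes out to 0. The function name, rates, and horizon below are illustrative choices.

```python
import random

random.seed(0)

def simulate_poisson_times(rate, horizon):
    """Event times of a homogeneous Poisson process on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

# Two independent Poisson processes: their event times are almost surely
# distinct, mimicking a multivariate counting process (no simultaneous jumps).
t1 = simulate_poisson_times(rate=2.0, horizon=10.0)
t2 = simulate_poisson_times(rate=3.0, horizon=10.0)

# Since the compensators A_i(t) = rate_i * t are continuous, the jumps of
# M_i = N_i - A_i coincide with the jumps of N_i (each of size 1), so the
# optional covariation is the sum over jump times of dM1(s) * dM2(s).
jump_times = set(t1) | set(t2)
bracket = sum((s in t1) * (s in t2) for s in jump_times)

print(bracket)  # 0: no common jump times, so [M1, M2] = 0
```

Note that this is only a heuristic check: the rigorous statement is that simultaneous jumps are excluded by the definition of a multivariate counting process, not merely improbable.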


Most popular questions from this chapter

Let \(M=N-\Lambda\) be the counting process local martingale. (a) Show that \(\mathrm{E} N(t)=\mathrm{E} \Lambda(t)\) (hint: use the monotone convergence theorem). (b) If \(\mathrm{E} \Lambda(t)<\infty\), then show that \(M\) is a martingale by verifying the martingale conditions. (c) If \(\sup _{t} \mathrm{E} \Lambda(t)<\infty\), then show that \(M\) is a square integrable martingale.

(Right-censoring by the same stochastic variable) Let \(T_{1}^{*}, \ldots, T_{n}^{*}\) be \(n\) i.i.d. positive stochastic variables with hazard function \(\alpha(t)\). The observed data consist of \(\left(T_{i}, \Delta_{i}\right)_{i=1, \ldots, n}\), where \(T_{i}=T_{i}^{*} \wedge U\), \(\Delta_{i}=I\left(T_{i}=T_{i}^{*}\right)\). Here, \(U\) is a positive stochastic variable with hazard function \(\mu(t)\), assumed independent of the \(T_{i}^{*}\)'s. Define $$ N_{\cdot}(t)=\sum_{i=1}^{n} N_{i}(t), \quad Y_{\cdot}(t)=\sum_{i=1}^{n} Y_{i}(t), $$ with \(N_{i}(t)=I\left(T_{i} \leq t, \Delta_{i}=1\right)\) and \(Y_{i}(t)=I\left(t \leq T_{i}\right)\), \(i=1, \ldots, n\). (a) Show that \(\hat{A}(t)-A^{*}(t)\) is a martingale, where $$ \hat{A}(t)=\int_{0}^{t} \frac{1}{Y_{\cdot}(s)} d N_{\cdot}(s), \quad A^{*}(t)=\int_{0}^{t} J(s) \alpha(s) d s . $$ (b) Show that $$ \sup _{s \leq t}\left|\hat{A}(s)-A^{*}(s)\right| \stackrel{P}{\rightarrow} 0 $$ if \(P\left(T_{i} \leq t\right)>0\). (c) Is it also true that \(\hat{A}(t)-A(t) \stackrel{P}{\rightarrow} 0\)?

(Asymptotic results for the Nelson-Aalen estimator) Let \(N^{(n)}(t)\) be a counting process satisfying the multiplicative intensity structure \(\lambda(t)=Y^{(n)}(t) \alpha(t)\) with \(\alpha(t)\) being locally integrable. The Nelson-Aalen estimator of \(A(t)=\int_{0}^{t} \alpha(s) d s\) is $$ \hat{A}^{(n)}(t)=\int_{0}^{t} \frac{1}{Y^{(n)}(s)} d N^{(n)}(s) . $$ Define \(A^{*}(t)=\int_{0}^{t} J^{(n)}(s) \alpha(s) d s\), where \(J^{(n)}(t)=I\left(Y^{(n)}(t)>0\right)\). (a) Show that \(\hat{A}^{(n)}(t)-A^{*}(t)\) is a local square integrable martingale. (b) Show that, as \(n \rightarrow \infty\), $$ \sup _{s \leq t}\left|\hat{A}^{(n)}(s)-A(s)\right| \stackrel{P}{\rightarrow} 0, $$ provided that $$ \int_{0}^{t} \frac{J^{(n)}(s)}{Y^{(n)}(s)} \alpha(s) d s \stackrel{P}{\rightarrow} 0 \quad \text { and } \quad \int_{0}^{t}\left(1-J^{(n)}(s)\right) \alpha(s) d s \stackrel{P}{\rightarrow} 0 $$ as \(n \rightarrow \infty\). (c) Show that the two conditions given in (b) are satisfied provided that \(\inf _{s \leq t} Y^{(n)}(s) \stackrel{P}{\rightarrow} \infty\) as \(n \rightarrow \infty\). Define \(\sigma^{2}(s)=\int_{0}^{s} \frac{\alpha(u)}{y(u)} d u\), where \(y\) is a non-negative function such that \(\alpha / y\) is integrable over \([0, t]\). (d) Let \(n \rightarrow \infty\). If, for all \(\epsilon>0\), $$ \begin{gathered} n \int_{0}^{s} \frac{J^{(n)}(u)}{Y^{(n)}(u)} \alpha(u) I\left(\left|n^{1 / 2} \frac{J^{(n)}(u)}{Y^{(n)}(u)}\right|>\epsilon\right) d u \stackrel{P}{\rightarrow} 0, \\ n^{1 / 2} \int_{0}^{s}\left(1-J^{(n)}(u)\right) \alpha(u) d u \stackrel{P}{\rightarrow} 0 \quad \text { and } \quad n \int_{0}^{s} \frac{J^{(n)}(u)}{Y^{(n)}(u)} \alpha(u) d u \stackrel{P}{\rightarrow} \sigma^{2}(s) \end{gathered} $$ for all \(s \leq t\), then show that $$ n^{1 / 2}\left(\hat{A}^{(n)}-A\right) \stackrel{\mathcal{D}}{\rightarrow} U $$ on \(D[0, t]\), where \(U\) is a Gaussian martingale with variance function \(\sigma^{2}\).

(Counting process with discrete compensator) Let \(N\) be a counting process with compensator \(\Lambda\) that may have jumps. Put \(M=N-\Lambda\). (a) Show by a direct calculation that $$ [M](t)=N(t)-2 \int_{0}^{t} \Delta \Lambda(s) d N(s)+\int_{0}^{t} \Delta \Lambda(s) d \Lambda(s), $$ where \(\Delta \Lambda(t)\) denotes the jumps of \(\Lambda(t)\). (b) Show that $$ \langle M\rangle(t)=\Lambda(t)-\int_{0}^{t} \Delta \Lambda(s) d \Lambda(s) $$

Consider the time interval \([0, \tau]\). Let \(U(t)\) be a Gaussian martingale with covariance process \(V(t), t \in[0, \tau]\). Show that $$ U(t) V(\tau)^{1 / 2}[V(\tau)+V(t)]^{-1} $$ has the same distribution as $$ B^{0}\left(\frac{V(t)}{V(\tau)+V(t)}\right) $$ where \(B^{0}\) is the standard Brownian bridge.
