
Show that no covariance stationary process \(\left\\{X_{n}\right\\}\) can satisfy the stochastic difference equation \(X_{n}=X_{n-1}+\varepsilon_{n}\), where \(\left\\{\varepsilon_{n}\right\\}\) is a sequence of zero-mean uncorrelated random variables having a common positive variance \(\sigma^{2}>0\).

Short Answer

Expert verified
If \(X_{n}=X_{n-1}+\varepsilon_{n}\), then \(X_{n}-X_{0}\) is a sum of \(n\) uncorrelated noise terms, so \(\operatorname{Var}[X_{n}-X_{0}]=n\sigma^{2}\) grows without bound. For a covariance stationary process this variance is bounded uniformly in \(n\), a contradiction. Hence no covariance stationary process can satisfy the equation.

Step by step solution

01

Understanding Covariance Stationary Process

A covariance stationary process is a stochastic process whose mean and variance are constant over time and whose covariance between two time periods depends only on the lag between them, not on the actual times at which the covariance is computed. In mathematical terms, a process \(\{X_{n}\}\) is covariance stationary if it satisfies the following conditions: 1) \(E[X_{n}] = \mu\) is independent of \(n\); 2) \(\operatorname{Var}[X_{n}] = \sigma_{X}^{2}\) is independent of \(n\); 3) \(\operatorname{Cov}[X_{n}, X_{n+k}] = \gamma_{k}\) depends only on the lag \(k\), not on \(n\). (Here \(\sigma_{X}^{2}\) denotes the variance of the process, to distinguish it from the noise variance \(\sigma^{2}\) in the problem.)
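The three conditions can be checked numerically. Below is a minimal Python sketch (illustrative only, not part of the textbook solution) that estimates the ensemble mean, variance, and lag-\(k\) covariance of Gaussian white noise, a basic covariance stationary process, at two different times:

```python
import random
import statistics

# Sketch: empirically check the three covariance-stationarity conditions
# on Gaussian white noise (all parameter choices here are illustrative).
random.seed(0)
num_paths = 10000
length = 30

# Simulate many independent paths of white noise with variance 1.
paths = [[random.gauss(0.0, 1.0) for _ in range(length)]
         for _ in range(num_paths)]

def mean_at(n):
    """Ensemble mean E[X_n], estimated across paths."""
    return statistics.fmean(p[n] for p in paths)

def cov_at(n, k):
    """Ensemble covariance Cov(X_n, X_{n+k}), estimated across paths."""
    mn, mk = mean_at(n), mean_at(n + k)
    return statistics.fmean((p[n] - mn) * (p[n + k] - mk) for p in paths)

# 1) mean independent of n, 2) variance independent of n,
# 3) lag-k covariance independent of n (here ~0 for k >= 1).
print(mean_at(5), mean_at(25))      # both near 0
print(cov_at(5, 0), cov_at(25, 0))  # both near 1
print(cov_at(5, 3), cov_at(25, 3))  # both near 0
```

The same checks applied to the random walk of this problem would show the variance drifting upward with \(n\), which is exactly the failure established in the steps below.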
02

Analyzing the Given Stochastic Difference Equation

According to the problem, we have the stochastic difference equation \(X_{n} = X_{n-1} + \varepsilon_{n}\), where \(\{\varepsilon_{n}\}\) is a sequence of zero-mean uncorrelated random variables with common variance \(\sigma^{2}>0\). Iterating the equation for \(n \geq 1\) gives \(X_{n} = X_{0} + \varepsilon_{1} + \varepsilon_{2} + \cdots + \varepsilon_{n}\), so that \(X_{n} - X_{0} = \sum_{k=1}^{n} \varepsilon_{k}\). Because the \(\varepsilon_{k}\) are uncorrelated, the variance of this sum is \(\operatorname{Var}[X_{n} - X_{0}] = n\sigma^{2}\), which grows without bound as \(n\) increases.
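The telescoping step above can be verified directly: iterating the recursion and evaluating the closed form \(X_{0}+\varepsilon_{1}+\cdots+\varepsilon_{n}\) produce the same path. A small Python sketch (all names and parameters illustrative):

```python
import random

# Sketch: iterating the recursion X_n = X_{n-1} + eps_n reproduces the
# closed form X_n = X_0 + eps_1 + ... + eps_n (parameters illustrative).
random.seed(1)
x0 = 2.5
eps = [random.gauss(0.0, 1.0) for _ in range(50)]

# Build X_0, X_1, ..., X_50 by the recursion.
x = [x0]
for e in eps:
    x.append(x[-1] + e)

# Build the same sequence from the telescoped closed form.
closed = [x0 + sum(eps[:n]) for n in range(len(eps) + 1)]

# The two constructions agree (up to floating-point rounding).
assert all(abs(a - b) < 1e-9 for a, b in zip(x, closed))
print("recursion matches closed form")
```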
03

Showing the Contradiction

First note that the mean gives no contradiction: \(E[X_{n}] = E[X_{0}] + n \cdot E[\varepsilon] = E[X_{0}]\), since \(E[\varepsilon] = 0\), so the mean is constant in \(n\). The contradiction comes from the variance. From Step 2, \(X_{n} - X_{0} = \sum_{k=1}^{n} \varepsilon_{k}\), and uncorrelatedness gives \(\operatorname{Var}[X_{n} - X_{0}] = n\sigma^{2}\). Now suppose \(\{X_{n}\}\) were covariance stationary with \(\operatorname{Var}[X_{n}] = \sigma_{X}^{2}\) for all \(n\). By the Cauchy-Schwarz inequality, \(\operatorname{Var}[X_{n} - X_{0}] \leq \left(\sqrt{\operatorname{Var}[X_{n}]} + \sqrt{\operatorname{Var}[X_{0}]}\right)^{2} = 4\sigma_{X}^{2}\), a bound independent of \(n\). But \(n\sigma^{2} \rightarrow \infty\) exceeds this bound once \(n > 4\sigma_{X}^{2}/\sigma^{2}\). Therefore, no covariance stationary process can satisfy the given stochastic difference equation.
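A Monte Carlo sketch (illustrative parameters, assuming Gaussian noise for convenience) confirms the linear variance growth \(\operatorname{Var}[X_{n}-X_{0}] = n\sigma^{2}\) that rules out stationarity:

```python
import random
import statistics

# Sketch: the random walk X_n = X_{n-1} + eps_n has
# Var(X_n - X_0) = n * sigma^2, growing without bound --
# incompatible with covariance stationarity.
random.seed(2)
sigma = 1.0
num_paths = 10000

def var_increment(n):
    """Monte Carlo estimate of Var(X_n - X_0) = Var(eps_1 + ... + eps_n)."""
    sums = [sum(random.gauss(0.0, sigma) for _ in range(n))
            for _ in range(num_paths)]
    return statistics.pvariance(sums)

# The estimated variance tracks n * sigma^2: roughly 5, 20, 80 here.
results = {n: var_increment(n) for n in (5, 20, 80)}
for n, v in results.items():
    print(n, v)
```

Any fixed variance bound, such as the \(4\sigma_{X}^{2}\) from the stationarity assumption, is eventually exceeded as \(n\) grows.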


Most popular questions from this chapter

Let \(\\{B(t) ; 0 \leq t \leq 1\\}\) be a standard Brownian motion process and let \(B(I)=B(t)-B(s)\), for \(I=(s, t], 0 \leq s \leq t \leq 1\), be the associated Gaussian random measure. Validate the identity $$ E\left[\exp \left\\{\lambda \int_{0}^{1} f(s)\, d B(s)\right\\}\right]=\exp \left\\{\frac{1}{2} \lambda^{2} \int_{0}^{1} f^{2}(s)\, d s\right\\}, \quad -\infty<\lambda<\infty $$ where \(f(s), 0 \leq s \leq 1\), is a continuous function.

Show that a predictor $$ \hat{X}_{n}=\alpha_{1} X_{n-1}+\cdots+\alpha_{p} X_{n-p} $$ is optimal among all linear predictors of \(X_{n}\) given \(X_{n-1}, \ldots, X_{n-p}\) if and only if $$ 0=\int_{-\pi}^{\pi} e^{i k \lambda}\left[1-\sum_{j=1}^{p} \alpha_{j} e^{-i j \lambda}\right] d F(\lambda), \quad k=1, \ldots, p $$ where \(F(\omega), -\pi \leq \omega \leq \pi\), is the spectral distribution function of the covariance stationary process \(\left\\{X_{n}\right\\}.\)

A stochastic process \(\left\\{X_{n}\right\\}\) is said to be weakly mixing if, for all sets \(A, B\) of real sequences \(\left(x_{1}, x_{2}, \ldots\right)\), $$ \begin{aligned} &\lim _{n \rightarrow \infty} \frac{1}{n} \sum_{k=1}^{n} \operatorname{Pr}\left\\{\left(X_{1}, X_{2}, \ldots\right) \in A \quad \text { and }\left(X_{k}, X_{k+1}, \ldots\right) \in B\right\\} \\ &=\operatorname{Pr}\left\\{\left(X_{1}, X_{2}, \ldots\right) \in A\right\\} \times \operatorname{Pr}\left\\{\left(X_{1}, X_{2}, \ldots\right) \in B\right\\} \end{aligned} $$ Show that every weakly mixing process is ergodic. Remark: To verify weak mixing, it suffices to show, for every \(m=1,2, \ldots\), and all sets \(A, B\) of vectors \(\left(x_{1}, \ldots, x_{m}\right)\), that $$ \begin{aligned} &\lim _{n \rightarrow \infty} \frac{1}{n} \sum_{k=1}^{n} \operatorname{Pr}\left\\{\left(X_{1}, \ldots, X_{m}\right) \in A \quad \text { and }\left(X_{k+1}, \ldots, X_{k+m}\right) \in B\right\\} \\ &=\operatorname{Pr}\left\\{\left(X_{1}, \ldots, X_{m}\right) \in A\right\\} \times \operatorname{Pr}\left\\{\left(X_{1}, \ldots, X_{m}\right) \in B\right\\} \end{aligned} $$

Let \(\left\\{X_{n}\right\\}\) be a zero-mean covariance stationary process having positive spectral density function \(f(\omega)\) and variance \(\sigma_{x}^{2}=1.\) Kolmogorov's formula states $$ \sigma_{e}^{2}=\exp \left\\{\frac{1}{2 \pi} \int_{-\pi}^{\pi} \log 2 \pi f(\omega)\, d \omega\right\\} $$ where \(\sigma_{e}^{2}=\inf E\left[\left|\hat{X}_{n}-X_{n}\right|^{2}\right]\) is the minimum mean square linear prediction error of \(X_{n}\) given the past. Verify Kolmogorov's formula when $$ R(v)=\gamma^{|v|}, \quad v=0, \pm 1, \ldots $$ with \(|\gamma|<1.\)

Let \(\left\\{X_{n}\right\\}_{n=-\infty}^{+\infty}\) be a zero-mean covariance stationary process having covariance function \(R(v)=\gamma^{|v|}, v=0, \pm 1, \ldots\), where \(|\gamma|<1.\) Find the minimum mean square error linear predictor of \(X_{n+1}\) given the entire past \(X_{n}, X_{n-1}, \ldots\)
