Chapter 10: Problem 16
(Shannon's Theorem) Let \(\{X_i\}\) be a sequence of independent random variables on the sample space \(\Omega\) having the common distribution \(\lambda=\sum_{j=1}^{r} p_j \delta_j\), where \(0<p_j<1\), \(\sum_{j=1}^{r} p_j=1\), and \(\delta_j\) is the point mass at \(j\). Define random variables \(Y_1, Y_2, \ldots\) on \(\Omega\) by
$$
Y_n(\omega)=P\left(\left\{\omega^{\prime}: X_i\left(\omega^{\prime}\right)=X_i(\omega) \text { for } 1 \leq i \leq n\right\}\right).
$$

a. \(Y_n=\prod_{i=1}^{n} p_{X_i}\). (The notation is peculiar but correct: \(X_i(\cdot) \in \{1, \ldots, r\}\) a.s., so \(p_{X_i}\) is well-defined a.s.)

b. \(\lim_{n \rightarrow \infty} n^{-1} \log Y_n=\sum_{j=1}^{r} p_j \log p_j\) almost surely. (In information theory, the \(X_i\)'s are regarded as the output of a source of digital signals, and \(-\sum_{j=1}^{r} p_j \log p_j\) is called the entropy of the signal.)
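A minimal proof sketch, assuming the strong law of large numbers (SLLN) for i.i.d. integrable sequences is available (the natural tool here, though not the only route):

For (a), the event \(\{\omega^{\prime}: X_i(\omega^{\prime})=X_i(\omega) \text { for } 1 \leq i \leq n\}\) is the intersection \(\bigcap_{i=1}^{n}\{X_i=X_i(\omega)\}\), so by independence and the definition of \(\lambda\),
$$
Y_n(\omega)=\prod_{i=1}^{n} P\left(X_i=X_i(\omega)\right)=\prod_{i=1}^{n} p_{X_i(\omega)}.
$$

For (b), taking logarithms in (a) gives
$$
n^{-1} \log Y_n=n^{-1} \sum_{i=1}^{n} \log p_{X_i},
$$
and the variables \(\log p_{X_i}\) are i.i.d. and bounded (each takes values in \(\{\log p_1, \ldots, \log p_r\}\), which is a finite set since \(0<p_j<1\)), with
$$
E\left[\log p_{X_1}\right]=\sum_{j=1}^{r} P\left(X_1=j\right) \log p_j=\sum_{j=1}^{r} p_j \log p_j.
$$
The SLLN then yields \(n^{-1} \log Y_n \rightarrow \sum_{j=1}^{r} p_j \log p_j\) almost surely.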