Chapter 5: Q19E (page 163)
Entropy: Consider a distribution over $n$ possible outcomes, with probabilities $p_1, p_2, \dots, p_n$.
a. Just for this part of the problem, assume that each $p_i$ is a power of 2 (that is, of the form $2^{-k}$). Suppose a long sequence of $m$ samples is drawn from the distribution and that for all $1 \le i \le n$, outcome $i$ occurs exactly $m p_i$ times in the sequence. Show that if Huffman encoding is applied to this sequence, the resulting encoding will have length $\sum_{i=1}^{n} m p_i \log \frac{1}{p_i}$.
b. Now consider arbitrary distributions; that is, the probabilities $p_i$ are not restricted to powers of 2. The most commonly used measure of the amount of randomness in the distribution is the entropy $\sum_{i=1}^{n} p_i \log \frac{1}{p_i}$.
For what distribution (over $n$ outcomes) is the entropy the largest possible? The smallest possible?
Short Answer
- It can be proved that the length of the Huffman encoding of the sequence is $\sum_{i=1}^{n} m p_i \log \frac{1}{p_i}$: since each $p_i$ is a power of 2, Huffman coding assigns outcome $i$ a codeword of length $\log \frac{1}{p_i}$, and that outcome appears $m p_i$ times (see the first sketch after this list).
- The uniform distribution $p_i = 1/n$ has the largest entropy, namely $\log_2 n$. A distribution that puts all its probability on a single outcome has the smallest entropy, namely $0$ (see the second sketch after this list).
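The key fact behind part (a) is that when every $p_i$ is a power of 2, the Huffman tree places outcome $i$ at depth exactly $\log_2 \frac{1}{p_i}$. The following sketch builds Huffman codeword lengths with a standard heap-based construction and compares the total encoded length against the formula; the helper name `huffman_code_lengths`, the example probabilities, and the sequence length `m = 8000` are illustrative choices, not part of the original problem.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Build a Huffman tree over the given probabilities and return a dict
    mapping each outcome index to its codeword length (depth in the tree)."""
    # Heap entries: (probability, tie-breaker, {outcome: depth so far})
    heap = [(p, i, {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf in them one level deeper.
        merged = {k: v + 1 for k, v in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

# Example with power-of-2 probabilities, as in part (a).
probs = [1/2, 1/4, 1/8, 1/8]
m = 8000  # hypothetical sequence length; outcome i appears m * probs[i] times
lengths = huffman_code_lengths(probs)

total_encoded = sum(m * p * lengths[i] for i, p in enumerate(probs))
predicted = sum(m * p * math.log2(1 / p) for p in probs)
print(total_encoded, predicted)  # both 14000.0 bits
```

Here the codeword lengths come out to 1, 2, 3, 3 and the outcome counts are 4000, 2000, 1000, 1000, so both the actual encoded length and $\sum_i m p_i \log_2 \frac{1}{p_i}$ equal 14000 bits.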
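For part (b), a small numerical check (a minimal sketch; the `entropy` helper and the choice $n = 8$ are assumptions made for illustration) shows the two extremes: the uniform distribution has entropy $\log_2 n$, and a point mass has entropy $0$.

```python
import math

def entropy(probs):
    """Shannon entropy sum_i p_i * log2(1 / p_i), with the convention 0 * log(1/0) = 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

n = 8
uniform = [1 / n] * n                  # spreads probability evenly over n outcomes
point_mass = [1.0] + [0.0] * (n - 1)   # puts all probability on one outcome

print(entropy(uniform))     # 3.0 = log2(8), the largest possible over 8 outcomes
print(entropy(point_mass))  # 0.0, the smallest possible
```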