
Entropy: Consider a distribution over $n$ possible outcomes, with probabilities $p_1, p_2, \ldots, p_n$.

a. Just for this part of the problem, assume that each $p_i$ is a power of 2 (that is, of the form $1/2^k$). Suppose a long sequence of $m$ samples is drawn from the distribution and that for all $1 \le i \le n$, the $i$th outcome occurs exactly $m p_i$ times in the sequence. Show that if Huffman encoding is applied to this sequence, the resulting encoding will have length

$$\sum_{i=1}^{n} m p_i \log \frac{1}{p_i}$$

b. Now consider arbitrary distributions; that is, the probabilities $p_i$ are not restricted to powers of 2. The most commonly used measure of the amount of randomness in the distribution is the entropy:

$$\sum_{i=1}^{n} p_i \log \frac{1}{p_i}$$

For what distribution (over $n$ outcomes) is the entropy the largest possible? The smallest possible?

Short Answer

  1. If every $p_i$ is a power of 2, the Huffman encoding of the sequence has total length $\sum_{i=1}^{n} m p_i \log \frac{1}{p_i}$.
  2. The entropy is largest for the uniform distribution, $p_i = 1/n$ for all $i$, and smallest for a point distribution with $p_k = 1$ for some $k$ and $p_i = 0$ for all $i \neq k$.

Step by step solution

01

Explain the information given

Consider a distribution over $n$ possible outcomes with probabilities $p_1, p_2, \ldots, p_n$. Assume that each $p_i$ is a power of 2, that is, of the form $1/2^k$. A long sequence of $m$ samples is drawn from the distribution, and for every $1 \le i \le n$ the $i$th outcome occurs exactly $m p_i$ times in the sequence.
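As a concrete illustration of this setup, here is a minimal Python sketch with a hypothetical power-of-2 distribution and $m = 8$; outcome $i$ then appears exactly $m p_i$ times in the sequence, and $\log \frac{1}{p_i}$ gives the codeword lengths used in part (a):

```python
from math import log2

# Hypothetical setup: a power-of-2 distribution (each p_i = 1/2^k) and m = 8 samples.
p = [1/2, 1/4, 1/8, 1/8]
m = 8

counts = [round(m * pi) for pi in p]        # outcome i occurs exactly m*p_i times
sequence = [i for i, c in enumerate(counts) for _ in range(c)]

print(counts)                       # [4, 2, 1, 1]
print(sequence)                     # [0, 0, 0, 0, 1, 1, 2, 3]
print([log2(1 / pi) for pi in p])   # [1.0, 2.0, 3.0, 3.0]
```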

02

Step 2: Show the length of the encoding

(a)

Since each probability is a power of 2, write $p_i = 1/2^{k_i}$, so that $k_i = \log \frac{1}{p_i}$. Because the probabilities sum to 1, the lengths $k_1, \ldots, k_n$ satisfy Kraft's inequality with equality, so there is a prefix code (equivalently, a binary tree with the root at level 0) in which outcome $i$ sits at depth exactly $k_i$. The expected codeword length of this code is $\sum_{i=1}^{n} p_i \log \frac{1}{p_i}$, which is the entropy lower bound that no prefix code can beat.

Huffman coding produces an optimal prefix code for the frequencies $m p_1, \ldots, m p_n$ (which are proportional to the $p_i$), so its expected codeword length also equals $\sum_{i=1}^{n} p_i \log \frac{1}{p_i}$. Since outcome $i$ occurs exactly $m p_i$ times in the sequence, the total encoded length is $m$ times the expected codeword length:

$$\sum_{i=1}^{n} m p_i \log \frac{1}{p_i}$$

Therefore, the Huffman encoding of the sequence has length $\sum_{i=1}^{n} m p_i \log \frac{1}{p_i}$.
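This can be checked numerically. The sketch below (using the same hypothetical power-of-2 distribution and $m = 8$ as above) builds a Huffman tree with a standard heap-based construction, reads off each outcome's depth, and compares the total encoded length with $\sum_{i=1}^{n} m p_i \log \frac{1}{p_i}$:

```python
import heapq
from math import log2

def huffman_lengths(freqs):
    """Build a Huffman tree over the given symbol frequencies and
    return the depth (codeword length) assigned to each symbol."""
    # Heap entries: (subtree weight, tie-breaker, {symbol: depth within subtree}).
    heap = [(w, i, {sym: 0}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical example: each p_i is a power of 2 and m = 8,
# so outcome i occurs exactly m * p_i times in the sequence.
p = {"a": 1/2, "b": 1/4, "c": 1/8, "d": 1/8}
m = 8
freqs = {s: m * pi for s, pi in p.items()}

lengths = huffman_lengths(freqs)
huffman_total = sum(freqs[s] * lengths[s] for s in p)
claimed_total = sum(m * pi * log2(1 / pi) for pi in p.values())

print(lengths)                        # {'a': 1, 'b': 2, 'c': 3, 'd': 3}
print(huffman_total, claimed_total)   # 14.0 14.0
```

Each outcome's depth equals $\log \frac{1}{p_i}$, and the two totals agree, as claimed in part (a).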

03

Step 3: Find the distributions with the largest and the smallest possible entropy

(b)

The entropy $\sum_{i=1}^{n} p_i \log \frac{1}{p_i}$ is largest for the uniform distribution, $p_i = 1/n$ for every $i$, where it equals $\log n$. It is smallest for a distribution concentrated on a single outcome, $p_k = 1$ for some $k$ and $p_i = 0$ for every $i \neq k$, where it equals 0 (with the convention $0 \log \frac{1}{0} = 0$).

Therefore, the largest possible entropy is $\log n$, attained by the uniform distribution, and the smallest possible entropy is 0, attained by a distribution with a single certain outcome.
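As a quick numerical check of part (b), the following sketch (with a hypothetical $n = 4$) computes the entropy of the uniform distribution, a point-mass distribution, and an intermediate distribution:

```python
from math import log2

def entropy(p):
    """Shannon entropy sum_i p_i * log2(1 / p_i), with the convention 0 * log(1/0) = 0."""
    return sum(pi * log2(1 / pi) for pi in p if pi > 0)

n = 4
uniform = [1 / n] * n                   # maximizes entropy: log2(n) = 2 bits
point_mass = [1.0] + [0.0] * (n - 1)    # minimizes entropy: 0 bits
in_between = [1/2, 1/4, 1/8, 1/8]       # an intermediate case: 1.75 bits

print(entropy(uniform), entropy(point_mass), entropy(in_between))  # 2.0 0.0 1.75
```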
