How do you justify the fact that H(Z) < H(X) + H(Y)? Under what circumstances can H(Z) = H(X) + H(Y)?


Two discrete memoryless information sources X and Y each have an alphabet with six symbols, X = Y = {1, 2, 3, 4, 5, 6}. The letter probabilities for X are 1/2, 1/4, 1/8, 1/16, 1/32, and 1/32. The source Y has a uniform distribution.
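The two source entropies can be computed directly from the definition. A minimal sketch (the helper name `entropy` is chosen here, not part of the problem statement):

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

p_x = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]   # source X
p_y = [1/6] * 6                            # source Y, uniform

print(entropy(p_x))   # 1.9375 bits
print(entropy(p_y))   # log2(6) ≈ 2.585 bits, the maximum for a 6-symbol alphabet
```

Since the uniform distribution maximizes entropy over a fixed alphabet, H(Y) > H(X), which is the key fact for part 1.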

1. Which source is less predictable and why?

2. Design Huffman codes for each source. Which Huffman code is more efficient? (Efficiency of a Huffman code is defined as the ratio of the source entropy to the average codeword length.)

3. If Huffman codes were designed for the second extension of these sources (i.e., two letters at a time), for which source would you expect a performance improvement compared to the single-letter Huffman code and why?

4. Now assume the two sources are independent and a new source Z is defined to be the sum of the two sources, i.e., Z = X + Y. Determine the entropy of this source, and verify that H(Z) < H(X) + H(Y).

5. How do you justify the fact that H(Z) < H(X) + H(Y )? Under what circumstances can you have H(Z) = H(X) + H(Y )? Is there a case where you can have H(Z) > H(X) + H(Y )? Why?
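Parts 2 and 4 can be checked numerically. The sketch below (not part of the original problem; `huffman_lengths` and `entropy` are helper names assumed here) builds a binary Huffman code with a min-heap and forms the distribution of Z = X + Y by convolving the two independent sources:

```python
import heapq
from itertools import count
from math import log2

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_lengths(probs):
    """Codeword length of each symbol under a binary Huffman code."""
    tiebreak = count()  # counter breaks probability ties so lists are never compared
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least likely groups
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1               # every symbol in the merge moves one bit deeper
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

p_x = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
p_y = [1/6] * 6

# Part 2: efficiency = H / average codeword length.
for name, p in [("X", p_x), ("Y", p_y)]:
    L = sum(pi * li for pi, li in zip(p, huffman_lengths(p)))
    print(name, entropy(p) / L)  # X is dyadic, so its efficiency is exactly 1

# Part 4: distribution of Z = X + Y for independent X and Y.
p_z = {}
for x, px in enumerate(p_x, start=1):
    for y in range(1, 7):
        p_z[x + y] = p_z.get(x + y, 0.0) + px / 6
print(entropy(p_z.values()))  # H(Z) < H(X) + H(Y): (x, y) cannot be recovered from the sum
```

For part 5, the intuition is that mapping the pair (X, Y) to the sum Z is many-to-one, so information is lost: H(Z) ≤ H(X, Y) = H(X) + H(Y), with equality only if the sum uniquely determines the pair. H(Z) can never exceed H(X) + H(Y), since Z is a deterministic function of (X, Y).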
