Ratio of entropy to the average codeword length


Response to the following problem:

Two discrete memoryless information sources S1 and S2 each have an alphabet of six symbols, S1 = {x1, x2, ..., x6} and S2 = {y1, y2, ..., y6}. The letter probabilities for the first source are 1/2, 1/4, 1/8, 1/16, 1/32, and 1/32; the second source has a uniform distribution.

1. Which source is less predictable, and why? (See the first sketch after these questions.)

2. Design Huffman codes for each source. Which Huffman code is more efficient? (The efficiency of a Huffman code is defined as the ratio of the source entropy to the average codeword length; see the second sketch below.)

3. If Huffman codes were designed for the second extension of these sources (i.e., encoding two letters at a time), for which source would you expect a performance improvement over the single-letter Huffman code, and why? (See the third sketch below.)
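
For question 1: the entropy H(S) = -sum_i p_i log2(p_i) measures unpredictability. S1 gives H(S1) = 1/2(1) + 1/4(2) + 1/8(3) + 1/16(4) + 2 * 1/32(5) = 1.9375 bits/symbol, while the uniform S2 attains the six-symbol maximum H(S2) = log2(6) ~ 2.585 bits/symbol, so S2 is less predictable. A minimal Python check (the names p1, p2, and entropy are illustrative, not part of the problem):

from math import log2

# Symbol probabilities (from the problem statement).
p1 = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]   # source S1
p2 = [1/6] * 6                           # source S2, uniform over six symbols

def entropy(p):
    # Shannon entropy in bits per symbol: H = -sum_i p_i * log2(p_i)
    return -sum(pi * log2(pi) for pi in p)

print(f"H(S1) = {entropy(p1):.4f} bits/symbol")   # 1.9375
print(f"H(S2) = {entropy(p2):.4f} bits/symbol")   # log2(6) ~ 2.5850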
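
For question 2: Huffman coding S1's dyadic probabilities yields codeword lengths 1, 2, 3, 4, 5, 5, an average length of 1.9375 bits equal to the entropy, hence efficiency 1. For S2 it yields lengths 2, 2, 3, 3, 3, 3, an average of 16/6 ~ 2.667 bits, hence efficiency 2.585/2.667 ~ 0.969, so S1's code is the more efficient. A sketch of the construction using a heap-based merge (huffman_lengths is an illustrative helper, not a library routine):

import heapq
from itertools import count
from math import log2

def huffman_lengths(probs):
    # Standard binary Huffman merge; returns each symbol's codeword length.
    tick = count()   # tie-breaker so equal probabilities never compare the lists
    heap = [(p, next(tick), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        pa, _, a = heapq.heappop(heap)
        pb, _, b = heapq.heappop(heap)
        for i in a + b:          # each merge adds one bit to every symbol below it
            lengths[i] += 1
        heapq.heappush(heap, (pa + pb, next(tick), a + b))
    return lengths

for name, p in [("S1", [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]), ("S2", [1/6] * 6)]:
    L = huffman_lengths(p)
    avg = sum(pi * li for pi, li in zip(p, L))
    H = -sum(pi * log2(pi) for pi in p)
    print(f"{name}: lengths {sorted(L)}, avg {avg:.4f}, efficiency {H / avg:.4f}")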
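
For question 3: S1's probabilities are all negative powers of two, so its single-letter code already meets the entropy bound and the second extension cannot improve it. S2's single-letter code wastes about 2.667 - 2.585 ~ 0.082 bits/symbol; coding the 36 equiprobable pairs gives 28 codewords of length 5 and 8 of length 6, i.e. 188/36 ~ 5.222 bits per pair, or about 2.611 bits per letter, so the improvement appears for S2. A continuation of the sketch above (it assumes huffman_lengths from the previous block is still in scope):

from itertools import product

for name, p in [("S1", [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]), ("S2", [1/6] * 6)]:
    # Memoryless source => the second extension's pair probabilities are products.
    pairs = [a * b for a, b in product(p, repeat=2)]     # 36 pairs
    L = huffman_lengths(pairs)       # helper from the previous sketch
    avg = sum(pi * li for pi, li in zip(pairs, L)) / 2   # bits per original letter
    print(f"{name} second extension: {avg:.4f} bits/symbol")
    # S1 stays at 1.9375 (already optimal); S2 drops from ~2.6667 to ~2.6111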

 
