Determining entropies and the Shannon–Fano code for messages


Question 1) a) What do you mean by information? What are its units? How does it relate to entropy?

b) Suppose we have ten messages with probabilities: P(m1) = 0.49, P(m2) = 0.14, P(m3) = 0.14, P(m4) = 0.07, P(m5) = 0.07, P(m6) = 0.04, P(m7) = 0.02, P(m8) = 0.02, P(m9) = 0.005, P(m10) = 0.005. Determine the Shannon–Fano code for this set of messages, and determine the coding efficiency and redundancy.
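The Shannon–Fano construction (sort messages by descending probability, then recursively split the list into two groups of nearly equal total probability, assigning 0 to one group and 1 to the other) can be sketched as follows. The message set and probabilities come from part (b); the function name and structure are only an illustrative implementation:

```python
import math

def shannon_fano(symbols):
    """Recursively assign Shannon-Fano codes to (name, prob) pairs
    already sorted by descending probability."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # find the split point that makes the two groups' totals most nearly equal
    acc, best_i, best_diff = 0.0, 0, float("inf")
    for i in range(len(symbols) - 1):
        acc += symbols[i][1]
        diff = abs(acc - (total - acc))
        if diff < best_diff:
            best_diff, best_i = diff, i
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:best_i + 1]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[best_i + 1:]).items()})
    return codes

probs = [("m1", 0.49), ("m2", 0.14), ("m3", 0.14), ("m4", 0.07),
         ("m5", 0.07), ("m6", 0.04), ("m7", 0.02), ("m8", 0.02),
         ("m9", 0.005), ("m10", 0.005)]
codes = shannon_fano(probs)
H = -sum(p * math.log2(p) for _, p in probs)   # source entropy, bits/symbol
L = sum(p * len(codes[s]) for s, p in probs)   # average code length, bits/symbol
efficiency = H / L
redundancy = 1 - efficiency
```

For these probabilities the average code length works out to 2.34 bits/symbol against an entropy of roughly 2.324 bits/symbol, giving an efficiency near 99.3% and a redundancy near 0.7%.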

Question 2) A transmitter has an alphabet of five letters (x1, x2, x3, x4, x5) and the receiver has an alphabet of four letters (y1, y2, y3, y4). The probabilities are P(x1) = 0.25, P(x2) = 0.4, P(x3) = 0.15, P(x4) = 0.15, P(x5) = 0.05, and

P(Y|X) =

         y1      y2      y3      y4
x1        1       0       0       0
x2      0.25    0.75      0       0
x3        0      1/3     2/3      0
x4        0       0      1/3     2/3
x5        0       0       1       0

Compute the entropies H(Y), H(X, Y), H(X), H(X|Y), H(Y|X) and the mutual information I(X; Y).
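As a numerical cross-check, all the requested quantities follow from the joint distribution P(x, y) = P(x)P(y|x) together with the chain rule H(X, Y) = H(X) + H(Y|X). The probabilities below come from the problem statement; the helper names are only illustrative:

```python
import math

def H(ps):
    """Entropy in bits of a list of probabilities (zero entries skipped)."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

px = [0.25, 0.4, 0.15, 0.15, 0.05]
pyx = [[1,    0,    0,    0],
       [0.25, 0.75, 0,    0],
       [0,    1/3,  2/3,  0],
       [0,    0,    1/3,  2/3],
       [0,    0,    1,    0]]

# joint distribution P(x, y) = P(x) P(y|x), and the marginal P(y)
pxy = [[px[i] * pyx[i][j] for j in range(4)] for i in range(5)]
py = [sum(pxy[i][j] for i in range(5)) for j in range(4)]

HX = H(px)
HY = H(py)
HXY = H([p for row in pxy for p in row])
HY_given_X = HXY - HX    # chain rule
HX_given_Y = HXY - HY
I = HX + HY - HXY        # mutual information I(X; Y)
```

This gives H(X) ≈ 2.066, H(Y) ≈ 1.857, H(X, Y) ≈ 2.666, H(Y|X) = 0.6, H(X|Y) ≈ 0.809 and I(X; Y) ≈ 1.257 bits.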

Question 3) a) State and describe the sampling theorem in detail. Explain how it is useful in communication systems.
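One way to illustrate the theorem's claim: a tone sampled above twice its frequency keeps its apparent frequency, while undersampling folds it down to an alias. A minimal sketch, with the tone frequency and sampling rates chosen purely as examples:

```python
def alias(f, fs):
    """Apparent (aliased) frequency, in Hz, of a tone at f Hz
    when sampled at fs Hz: the tone folds to the nearest multiple of fs."""
    return abs(f - fs * round(f / fs))

f = 100.0                       # example tone frequency (Hz)
fs_good, fs_bad = 250.0, 150.0  # above and below the Nyquist rate 2f = 200 Hz

# sampled above the Nyquist rate the tone is preserved;
# sampled below it, the 100 Hz tone appears as a 50 Hz alias
```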

b) Explain the Shannon–Hartley theorem for channel capacity. How does the channel capacity change if the bandwidth is increased to infinity? Explain orthogonal signalling performance on the basis of these theorems.
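The Shannon–Hartley capacity C = B log2(1 + S/(N0·B)) can be evaluated numerically to show that capacity grows with bandwidth but saturates: as B → ∞, C approaches the finite limit (S/N0) log2 e ≈ 1.44 S/N0. A sketch with assumed signal power and noise density:

```python
import math

def capacity(B, S, N0):
    """Shannon-Hartley capacity in bits/s for bandwidth B (Hz),
    signal power S (W) and one-sided noise PSD N0 (W/Hz)."""
    return B * math.log2(1 + S / (N0 * B))

S, N0 = 1.0, 1e-3                       # assumed example values
limit = (S / N0) * math.log2(math.e)    # infinite-bandwidth limit, ~1.44 * S/N0

# capacity rises with B but never exceeds `limit`
```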

Question 4) Describe the significance of the following:

a) Companding.

b) Prediction filters.

c) Adaptive filters.

d) Equalization.
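As an example of item (a), companding compresses a signal's dynamic range before quantization and expands it at the receiver, so small amplitudes get relatively finer quantization steps. A minimal μ-law sketch (μ = 255, the value used in North American PCM; the function names are illustrative):

```python
import math

MU = 255.0  # mu-law parameter (value used in North American PCM)

def compress(x):
    """Mu-law compression of x in [-1, 1]: boosts small amplitudes."""
    return math.copysign(math.log(1 + MU * abs(x)) / math.log(1 + MU), x)

def expand(y):
    """Inverse of compress(): restores the original amplitude."""
    return math.copysign(((1 + MU) ** abs(y) - 1) / MU, y)
```

Round-tripping a sample through compress() then expand() recovers it, while compress() alone maps a small input like 0.01 to a much larger coded value, which is the point of companding.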
