Using a symbolic computation program (e.g., Mathematica or Maple), find the weight enumeration polynomial for a (15, 11) Hamming code.
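One way to approach this without a commercial symbolic package is a plain-Python sketch: the dual of the (15, 11) Hamming code is the (15, 4) simplex code, whose weight enumerator B(x, y) = x^15 + 15 x^7 y^8 is known in closed form, so the MacWilliams identity A(x, y) = (1/16) B(x + y, x − y) yields the Hamming code's weight distribution by expanding binomials. The function name below is illustrative, not from the problem.

```python
from math import comb

def hamming_15_11_weights():
    """Weight distribution A[w] of the (15, 11) Hamming code via the
    MacWilliams identity applied to its dual, the (15, 4) simplex code:
        A(x, y) = (1/16) * [ (x+y)^15 + 15 * (x+y)^7 * (x-y)^8 ]
    A[w] is the coefficient of x^(15-w) * y^w."""
    A = []
    for w in range(16):
        # contribution of (x + y)^15
        t1 = comb(15, w)
        # contribution of 15 * (x + y)^7 * (x - y)^8
        t2 = 15 * sum(comb(7, w - j) * comb(8, j) * (-1) ** j
                      for j in range(min(w, 8) + 1) if 0 <= w - j <= 7)
        A.append((t1 + t2) // 16)
    return A

A = hamming_15_11_weights()
print(A)        # A[w] = number of codewords of Hamming weight w
print(sum(A))   # total number of codewords: 2^11 = 2048
```

The same expansion, done in Mathematica or Maple, is a one-line `Expand` of the MacWilliams substitution.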
Find the capacity of the cascade connection of n binary symmetric channels with the same crossover probability e.
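A sketch of the standard solution: a cascade of n binary symmetric channels with crossover probability e is itself a BSC with effective crossover probability p_n = (1 − (1 − 2e)^n)/2, so its capacity is 1 − Hb(p_n). The function names below are illustrative.

```python
from math import log2

def hb(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def cascade_bsc_capacity(eps, n):
    """Capacity (bits per channel use) of n cascaded BSCs, each with
    crossover probability eps.  The cascade is again a BSC with
    p_n = (1 - (1 - 2*eps)**n) / 2, so C = 1 - Hb(p_n)."""
    p_n = (1 - (1 - 2 * eps) ** n) / 2
    return 1 - hb(p_n)

print(cascade_bsc_capacity(0.1, 4))  # capacity degrades as n grows
```

As n → ∞ (with 0 < e < 1/2), p_n → 1/2 and the capacity tends to zero, as expected.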
Let C denote the capacity of the third channel and C1 and C2 represent the capacities of the first and second channels.
Let C denote the capacity of a discrete memoryless channel with input alphabet X = {x1, x2,..., xN}.
Find the input probability distribution that achieves capacity.
Find the capacity of an additive white Gaussian noise channel with a bandwidth of 1 MHz and power 10 W.
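The question rests on the Shannon capacity formula C = W log2(1 + P/(N0·W)). The noise power spectral density is not given in the snippet above, so the sketch below leaves N0 as a free parameter rather than guessing the missing value.

```python
from math import log2

def awgn_capacity(W, P, N0):
    """Shannon capacity of a band-limited AWGN channel in bits/second.
    W  : bandwidth in Hz
    P  : average signal power in watts
    N0 : one-sided noise power spectral density in W/Hz (not specified
         in the problem statement, hence a parameter here)."""
    return W * log2(1 + P / (N0 * W))

# Example: W = 1 MHz, P = 10 W, and an assumed N0 giving SNR = 1
print(awgn_capacity(1e6, 10, 10 / 1e6))
```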
For each source output, one use of the channel is possible. The fidelity measure is squared-error distortion, i.e., d(x, x̂) = (x - x̂)^2.
Assume that this channel is used with optimal hard decision decoding at the output. What is the crossover probability of the resulting BSC?
Consider the two channels with the transition probabilities as shown in Figure.
Assume that the source is directly connected to the channel; i.e., no coding is employed. What is the error probability at the destination?
If Huffman codes were designed for the second extension of these sources (i.e., two letters at a time), for which source would you expect a performance improvement?
A discrete-time memoryless Gaussian source with a variance of 4 is to be transmitted over this channel; for each source output, two uses of the channel are possible.
A telephone channel has a bandwidth W = 3000 Hz and a signal-to-noise power ratio of 400 (26 dB).
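For the stated parameters, the channel capacity follows directly from the Shannon formula C = W log2(1 + SNR); a quick check:

```python
from math import log2

# Telephone channel from the problem statement above
W = 3000.0    # bandwidth in Hz
snr = 400.0   # signal-to-noise power ratio (10*log10(400) = 26 dB)

C = W * log2(1 + snr)   # Shannon capacity in bits/second
print(f"C = {C:.0f} bits/s")  # roughly 26 kbits/s
```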
Consider the binary-input, quaternary-output DMC shown in Figure.
Determine the value of a that maximizes I(X; Y), i.e., the channel capacity C in bits per channel use, and plot C as a function of p for the optimum value of a.
Determine the probability of having at least one bit error in a codeword transmitted over the BSC.
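Assuming independent bit errors on the BSC, the probability of at least one error in an n-bit codeword is the complement of the all-correct event; a minimal sketch (the function name and the example values of p and n are illustrative, since the snippet does not give them):

```python
def prob_at_least_one_error(p, n):
    """Probability of at least one bit error in an n-bit codeword sent
    over a BSC with crossover probability p, assuming independent errors:
    P(>= 1 error) = 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

print(prob_at_least_one_error(0.01, 15))  # e.g. p = 0.01, n = 15
```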
What is the absolute minimum Eb/N0 required to be able to transmit the source reliably, assuming that hard decision decoding is employed by the channel decoder?
A discrete memoryless source U is to be transmitted over a memoryless communication channel.
Channel C1 is an additive white Gaussian noise channel with bandwidth W, average transmitter power P, and noise power spectral density N0/2.
On the same axis, plot the capacity of the same channel when binary orthogonal signaling is employed.
Compare the average codeword length with the entropy of the source.
Design Huffman codes for each source. Which Huffman code is more efficient?
A DMS has an alphabet of eight letters xi, i = 1, 2, ..., 8, with probabilities 0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.05, and 0.05.
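For the eight-letter DMS above, the average Huffman codeword length can be computed without building the full code tree: it equals the sum of the probabilities of all merged (internal) nodes created during the Huffman procedure. A minimal sketch comparing that length with the source entropy (the function name is illustrative):

```python
import heapq
from math import log2

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code, computed as the
    sum of the probabilities of all pairwise merges (internal nodes)."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)   # two least-probable nodes
        b = heapq.heappop(heap)
        total += a + b            # each merge adds one bit to its leaves
        heapq.heappush(heap, a + b)
    return total

probs = [0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.05, 0.05]
H = -sum(p * log2(p) for p in probs)   # source entropy in bits
L = huffman_avg_length(probs)          # average codeword length in bits
print(f"H = {H:.4f} bits, L = {L:.4f} bits, efficiency = {H / L:.4f}")
```

For these probabilities the average length works out to 2.83 bits against an entropy of about 2.798 bits, consistent with the bound H <= L < H + 1.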
Assume that we have a binary symmetric channel with crossover probability e = 0.3. Is it possible to transmit the source reliably over the channel?
An additive white Gaussian noise channel has the output Y = X + N, where X is the channel input and N is the noise with probability density function.