Information Theory

Published: July 19, 2012
ETM2126: INFORMATION THEORY & ERROR CODING Tutorial 2

Tutorial 2: Channel Capacity

1. Two identical binary symmetric channels with transition probability p are connected in cascade.
   i) Draw the original channel diagram.
   ii) Find the overall channel matrix of the resultant channel, and then draw the equivalent channel diagram.
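As a numerical check of the matrix algebra (not a substitute for working the problem by hand), the cascade matrix is the product of the single-channel matrix with itself. The value p = 0.1 below is an arbitrary illustration:

```python
# Single BSC transition matrix [[P(0|0), P(1|0)], [P(0|1), P(1|1)]].
p = 0.1  # arbitrary illustrative crossover probability
P = [[1 - p, p],
     [p, 1 - p]]

# Cascading two identical channels multiplies their matrices: P2 = P x P.
P2 = [[sum(P[i][k] * P[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

# The result is itself a BSC with crossover probability 2p(1 - p)
# and diagonal entries (1 - p)^2 + p^2.
print(P2)
```

The equivalent channel diagram is therefore another BSC, with effective transition probability 2p(1 - p).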

2. Find the value of conditional entropy for a noiseless binary channel.
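A quick sketch of why the answer comes out as it does: for a noiseless channel the transition matrix is the identity, so every term p(y|x) log2 p(y|x) is either 1·log2(1) = 0 or is dropped by the 0·log 0 = 0 convention. The equiprobable input distribution below is an assumption for illustration; any input distribution gives the same result:

```python
import math

# Noiseless binary channel: P(y|x) is the identity matrix.
P = [[1.0, 0.0],
     [0.0, 1.0]]
px = [0.5, 0.5]  # assumed equiprobable inputs (the result holds for any px)

# H(Y|X) = -sum_x p(x) sum_y p(y|x) log2 p(y|x), with 0*log2(0) taken as 0.
H = -sum(px[i] * p_yx * math.log2(p_yx)
         for i, row in enumerate(P) for p_yx in row if p_yx > 0)
print(H)  # 0.0: a noiseless channel has zero conditional entropy
```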

3. A telephone line channel has a bandwidth of 3 kHz and a S/N = 1500 at the channel output. Calculate the channel capacity in bits/sec.
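The numbers here can be checked directly with the Shannon-Hartley formula C = B log2(1 + S/N), noting that the given S/N = 1500 is already a linear power ratio, not decibels:

```python
import math

B = 3e3       # bandwidth in Hz
snr = 1500    # S/N as a linear power ratio
C = B * math.log2(1 + snr)   # Shannon-Hartley channel capacity
print(f"C = {C:.0f} bits/s")  # roughly 31.7 kbits/s
```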

4. A Gaussian channel has a bandwidth of 4 kHz and a two-sided noise power spectral density N0/2 of 10^-14 W/Hz. The signal power at the receiver has to be maintained at a level of 0.1 mW. Calculate the capacity of this channel.
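A sketch of the calculation: with a two-sided PSD of N0/2, the noise power in bandwidth B is N = N0 · B, after which the Shannon-Hartley formula applies:

```python
import math

B = 4e3           # bandwidth in Hz
N0_half = 1e-14   # two-sided noise PSD, W/Hz
S = 0.1e-3        # received signal power, W

N = 2 * N0_half * B           # in-band noise power: N = N0 * B
C = B * math.log2(1 + S / N)  # Shannon-Hartley capacity
print(f"C = {C:.0f} bits/s")  # roughly 81 kbits/s
```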

5. A Gaussian channel has a bandwidth of 1 MHz and an SNR of 30 dB.
   i) Calculate the capacity of this channel.
   ii) How long will it take to transmit one million characters over the channel? Assume that each character is coded as an 8-bit binary word.
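Both parts can be checked in a few lines. Note the dB-to-linear conversion (30 dB corresponds to a power ratio of 1000), and that part ii) assumes transmission at the capacity found in part i):

```python
import math

B = 1e6                      # bandwidth in Hz
snr = 10 ** (30 / 10)        # 30 dB -> linear power ratio of 1000
C = B * math.log2(1 + snr)   # i) channel capacity

bits = 1e6 * 8               # ii) one million 8-bit characters
t = bits / C                 # transmission time at capacity
print(f"C = {C:.3e} bits/s, t = {t:.3f} s")  # about 9.97 Mbits/s and 0.80 s
```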

6. Alphanumeric data are entered into a computer from a remote terminal through a voice-grade telephone channel. The channel has a bandwidth of 3.4 kHz and an output signal-to-noise ratio of 20 dB. The terminal has a total of 128 symbols. Assume that the symbols are equiprobable and that successive transmissions are statistically independent.
   a) Calculate the channel capacity.
   b) Calculate the maximum (theoretical) symbol rate for which error-free transmission over the channel is possible.
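A sketch of both parts: since the 128 symbols are equiprobable and independent, each symbol carries log2(128) = 7 bits, so the maximum symbol rate is the capacity divided by 7:

```python
import math

B = 3.4e3                 # bandwidth in Hz
snr = 10 ** (20 / 10)     # 20 dB -> linear power ratio of 100

C = B * math.log2(1 + snr)        # a) capacity in bits/s
bits_per_symbol = math.log2(128)  # 7 bits per equiprobable symbol
r_max = C / bits_per_symbol       # b) max error-free symbol rate
print(f"C = {C:.0f} bits/s, r_max = {r_max:.0f} symbols/s")
```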

7. Suppose a TV displays at a frame rate of 30 frames per second. Each frame consists of 2x10^5 pixels, and each pixel requires 16 bits for colour display. Assuming an SNR of 25 dB, calculate the bandwidth required to support the transmission of the TV video signal.
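The approach can be sketched as follows: first find the required bit rate from the frame rate, pixel count, and bits per pixel, then invert the Shannon-Hartley formula to solve for B:

```python
import math

frame_rate = 30        # frames per second
pixels = 2e5           # pixels per frame
bits_per_pixel = 16
snr = 10 ** (25 / 10)  # 25 dB -> linear ratio of about 316.2

R = frame_rate * pixels * bits_per_pixel  # required bit rate: 9.6e7 bits/s
B = R / math.log2(1 + snr)                # invert C = B log2(1 + SNR)
print(f"B = {B / 1e6:.2f} MHz")           # roughly 11.6 MHz
```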