ETM2126: INFORMATION THEORY & ERROR CODING Tutorial 2

Tutorial 2: Channel Capacity

1. Two identical binary symmetric channels with transition probability p are connected in cascade. i) Draw the original channel diagram.
ii) Find the overall channel matrix of the resultant channel and then draw the equivalent channel diagram.
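As a numerical sketch of part ii) (not the worked solution; p = 0.1 is an assumed example value), the cascade's channel matrix is the product of the two individual BSC matrices, which again has BSC form with effective crossover probability 2p(1 - p):

```python
p = 0.1  # example crossover probability, assumed for illustration

# Channel matrix of one binary symmetric channel: rows = input, cols = output.
P = [[1 - p, p],
     [p, 1 - p]]

# Cascading two channels multiplies their matrices: P_total = P x P.
P2 = [[sum(P[i][k] * P[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

# The cascade is itself a BSC with effective crossover 2p(1 - p).
p_eff = 2 * p * (1 - p)
```

The off-diagonal entry of the product matrix should equal 2p(1 - p), and the diagonal entry (1 - p)^2 + p^2, which is what the equivalent channel diagram shows.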

2. Find the value of conditional entropy for a noiseless binary channel.
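A short check of the expected answer (a sketch, not the official solution): for a noiseless binary channel the transition matrix is the identity, so every term p(y|x) log2 p(y|x) vanishes and H(Y|X) = 0 regardless of the input distribution.

```python
import math

# Noiseless binary channel: P(y|x) is the identity matrix.
P = [[1.0, 0.0],
     [0.0, 1.0]]
px = [0.5, 0.5]  # any input distribution gives the same result here

# H(Y|X) = -sum_x p(x) sum_y p(y|x) log2 p(y|x), with 0*log(0) taken as 0.
H_Y_given_X = sum(
    px[i] * sum(-P[i][j] * math.log2(P[i][j]) for j in range(2) if P[i][j] > 0)
    for i in range(2)
)
```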

3. A telephone line channel has a bandwidth of 3 kHz and a S/N = 1500 at the channel output. Calculate the channel capacity in bits/sec.
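As a quick numerical check (not the full derivation), the Shannon–Hartley formula C = B log2(1 + S/N) can be evaluated directly; note S/N = 1500 is already a linear power ratio, not dB:

```python
import math

B = 3000    # bandwidth in Hz
snr = 1500  # S/N as a linear power ratio

# Shannon-Hartley channel capacity in bit/s
C = B * math.log2(1 + snr)  # roughly 3.2e4 bit/s
```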

4. A Gaussian channel has a bandwidth of 4 kHz and a two-sided noise power spectral density N0/2 of 10^-14 W/Hz. The signal power at the receiver has to be maintained at a level of 0.1 mW. Calculate the capacity of this channel.
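A numerical sketch for this problem (assumptions as stated: two-sided PSD N0/2 = 10^-14 W/Hz, so the one-sided density is N0 = 2 × 10^-14 W/Hz and the in-band noise power is N = N0·B):

```python
import math

B = 4000        # bandwidth in Hz
N0 = 2e-14      # one-sided noise PSD in W/Hz (two-sided N0/2 = 1e-14 W/Hz)
S = 0.1e-3      # received signal power: 0.1 mW

N = N0 * B                    # total noise power in the band, in W
C = B * math.log2(1 + S / N)  # capacity in bit/s, roughly 8.1e4
```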

5. A Gaussian channel has a bandwidth of 1 MHz and an SNR of 30 dB. i) Calculate the capacity of this channel.
ii) How long will it take to transmit one million characters over the channel? Assume that each character is coded as an 8-bit binary word.
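A numerical sketch of both parts (not the worked solution): convert 30 dB to a linear ratio, apply Shannon–Hartley, then divide the total number of bits by the capacity to get the minimum transmission time.

```python
import math

B = 1e6                    # bandwidth in Hz
snr = 10 ** (30 / 10)      # 30 dB -> linear ratio of 1000

C = B * math.log2(1 + snr)  # i) capacity, roughly 1e7 bit/s

bits = 1_000_000 * 8        # one million 8-bit characters
t = bits / C                # ii) minimum transmission time in seconds
```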

6. Alphanumeric data are entered into a computer from a remote terminal through a voice-grade telephone channel. The channel has a bandwidth of 3.4 kHz, and an output signal-to-noise ratio of 20 dB. The terminal has a total of 128 symbols. Assume that the symbols are equiprobable, and the successive transmissions are statistically independent. a) Calculate the channel capacity.

b) Calculate the maximum (theoretical) symbol rate for which error-free transmission over the channel is possible.
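Both parts can be sketched numerically (a check, not the official answer): 20 dB corresponds to a linear S/N of 100, and each of the 128 equiprobable symbols carries log2(128) = 7 bits, so the maximum error-free symbol rate is C divided by 7.

```python
import math

B = 3.4e3                  # bandwidth in Hz
snr = 10 ** (20 / 10)      # 20 dB -> linear ratio of 100

C = B * math.log2(1 + snr)       # a) capacity, roughly 2.26e4 bit/s

bits_per_symbol = math.log2(128)  # 7 bits per equiprobable symbol
r_max = C / bits_per_symbol       # b) max symbol rate, symbols/s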

7. Suppose a TV displays at a frame rate of 30 frames per second. Each frame consists of 2 × 10^5 pixels, and each pixel requires 16 bits for colour display. Assuming an SNR of 25 dB, calculate the bandwidth required to support the transmission of the TV video signal.
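A numerical sketch (not the worked solution): the required bit rate is frames × pixels × bits per pixel; setting the capacity equal to that rate and solving Shannon–Hartley for B gives the minimum bandwidth.

```python
import math

R = 30 * 2e5 * 16          # required bit rate: 9.6e7 bit/s
snr = 10 ** (25 / 10)      # 25 dB -> linear ratio of about 316.2

B = R / math.log2(1 + snr)  # minimum bandwidth so that C >= R, in Hz
```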
