Jaume Riba and Gregori Vázquez
February 16, 2012

Contents

0.1 Scope of the course  3
0.2 Bibliography  5

1 Capacity  8
  1.1 A Definition of Information  8
    1.1.1 The discrete memory-less source  9
    1.1.2 Measure of information  9
    1.1.3 Entropy  11
    1.1.4 A fundamental inequality  13
    1.1.5 Maximizing entropy  14
    1.1.6 Joint entropy of two sources of information  15
    1.1.7 Entropy for random vectors  16
    1.1.8 Uniquely-decodable codes, prefix-free codes and Kraft-McMillan inequality  17
    1.1.9 Source coding theorem  21
    1.1.10 Main plot and conclusions  32
    1.1.11 Self-evaluation  32
  1.2 Mutual Information and Channel Capacity  33
    1.2.1 The discrete memory-less channel  33
    1.2.2 Mutual information  34
    1.2.3 Intermediate concepts: average conditional entropy (randomness and equivocation)  35
    1.2.4 Complexity + Anticipation = Uncertainty + Action  36
    1.2.5 Interpretation as expectation of a random variable  36
    1.2.6 Mutual information for random vectors  40
    1.2.7 Capacity  41
    1.2.8 Channel coding theorem  42
    1.2.9 Self-evaluation  51
  1.3 Continuous Time/Amplitude Channels  52
    1.3.1 The AWGN channel  53
    1.3.2 Trying to generalize information theory to continuous sources  53
    1.3.3 The concept of differential entropy  54
    1.3.4 The mutual information has full sense  55
    1.3.5 Maximizing differential entropy  56
    1.3.6 Capacity of the AWGN channel  58
    1.3.7 Parallel channels and water-filling  62
    1.3.8 Self-evaluation  66

2 Ideal Channels  68
  2.1 Performance Bounds  68
    2.1.1 The optimum decoder  69
    2.1.2 Pair-wise error probability and union bound  71
    2.1.3 Classical union bound  73
    2.1.4 Union-Bhattacharyya bound  74
    2.1.5 Gallager bound  76
  2.2 Orthogonal Signalling  78
  2.3 Linear Coding  79

3 Real Channels  80
  3.1 The Channel Response  80
  3.2 Band Limited Transmission  81
  3.3 Channel Model  81
  3.4 Multi-Carrier Transmission: OFDM  81
  3.5 Fading channels  81
  3.6 Self-evaluation  81

4 Multiple access  82
  4.1 Self-evaluation  82

Introduction

0.1 Scope of the course

The foundation of communications theory, namely information theory, was developed in 1948 by Claude Shannon in his famous paper entitled "A Mathematical Theory of Communication". For some 25 years it remained an elegant theory and a source of research problems. However, nowadays this...