# Elements of Information Theory, Second Edition: Complete Solutions Manual, Chapters 1–17


Second Edition

Solutions to Problems

Thomas M. Cover

Joy A. Thomas

October 17, 2006


COPYRIGHT 2006

Thomas Cover

Joy Thomas

All rights reserved


Contents

1 Introduction — 7

2 Entropy, Relative Entropy and Mutual Information — 9

3 The Asymptotic Equipartition Property — 49

4 Entropy Rates of a Stochastic Process — 61

5 Data Compression — 97

6 Gambling and Data Compression — 139

7 Channel Capacity — 163

8 Differential Entropy — 203

9 Gaussian Channel — 217

10 Rate Distortion Theory — 241

11 Information Theory and Statistics — 273

12 Maximum Entropy — 301

13 Universal Source Coding — 309

14 Kolmogorov Complexity — 321

15 Network Information Theory — 331

16 Information Theory and Portfolio Theory — 377

17 Inequalities in Information Theory — 391


Preface

Here we have the solutions to all the problems in the second edition of Elements of Information Theory. First, a word about how the problems and solutions were generated.

The problems arose over the many years the authors taught this course. At first, the homework problems and exam problems were generated each week. After a few years of this double duty, the homework problems were rolled forward from previous years and only the exam problems were fresh. So each year, the midterm and final exam problems became candidates for addition to the body of homework problems that you see in the text. The exam problems are necessarily brief, with a point, and reasonably free of time-consuming calculation, so the problems in the text for the most part share these properties.

The solutions to the problems were generated by the teaching assistants and graders for the weekly homework assignments and handed back with the graded homework in the class immediately following the date the assignment was due. Homework was optional and did not enter into the course grade. Nonetheless, most students did the homework.

A list of the many students who contributed to the solutions is given in the book's acknowledgments. In particular, we would like to thank Laura Ekroot, Will Equitz, Don Kimber, Mitchell Trott, Andrew Nobel, Jim Roche, Vittorio Castelli, Mitchell Oslick, Chien-Wen Tseng, Michael Morrell, Marc Goldberg, George Gemelos, Navid Hassanpour, Young-Han Kim, Charles Mathis, Styrmir Sigurjonsson, Jon Yard, Michael Baer, Mung Chiang, Suhas Diggavi, Elza Erkip, Paul Fahn, Garud Iyengar, David Julian, Yiannis Kontoyiannis, Amos Lapidoth, Erik Ordentlich, Sandeep Pombra, Arak Sutivong, Josh Sweetkind-Singer, and Assaf Zeevi. We would also like to thank Prof. John Gill and Prof. Abbas El Gamal for many interesting problems and solutions.

The solutions therefore show a wide range of personalities and styles, although some of them have been smoothed out over the years by the authors. The best way to look at the solutions is that they offer more than you need to solve the problems, and in some cases they may be awkward or inefficient. We view that as a plus: an instructor can see the extent of the problem by examining the solution but can still improve on his or her own version.

The solution manual comes to some 400 pages. We are making electronic copies available to course instructors in PDF. We hope that the solutions are not all put up on an insecure website; it will not be useful to use the problems in the book for homework and exams if the solutions can be obtained immediately with a quick Google search. Instead, we will put up a small, selected subset of problem solutions on our website, http://www.elementsofinformationtheory.com, available to all. These will be problems that have particularly elegant or long solutions that would not be suitable as homework or exam problems.


We have also seen some people trying to sell the solutions manual on Amazon or eBay. Please note that the Solutions Manual for Elements of Information Theory is...

