Synopses & Reviews
The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment covers the entropy concept in probability theory, the fundamental theorems of information theory, ergodic sources, the martingale concept, anticipation and memory, and related subjects. Translated by R. A. Silverman and M. D. Friedman. 1957 edition.
Table of Contents
The Entropy Concept in Probability Theory
1. Entropy of Finite Schemes
2. The Uniqueness Theorem
3. Entropy of Markov Chains
4. Fundamental Theorems
5. Application to Coding Theory
On the Fundamental Theorems of Information Theory
CHAPTER I. Elementary Inequalities
1. Two Generalizations of Shannon's Inequality
2. Three Inequalities of Feinstein
CHAPTER II. Ergodic Sources
3. Concept of a Source. Stationarity. Entropy
4. Ergodic Sources
5. The E Property. McMillan's Theorem
6. The Martingale Concept. Doob's Theorem
7. Auxiliary Propositions
8. Proof of McMillan's Theorem
CHAPTER III. Channels and the Sources Driving Them
9. Concept of a Channel. Noise. Stationarity. Anticipation and Memory
10. Connection of the Channel to the Source
11. The Ergodic Case
CHAPTER IV. Feinstein's Fundamental Lemma
12. Formulation of the Problem
13. Proof of the Lemma
CHAPTER V. Shannon's Theorems
15. The First Shannon Theorem
16. The Second Shannon Theorem