Synopses & Reviews
Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permeated the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future.
To give a solid introduction to this burgeoning field, J. R. Pierce has revised his well-received 1961 study of information theory for an up-to-date second edition. Beginning with the origins of the field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. A glossary of terms and an appendix on mathematical notation are provided to help the less mathematically sophisticated.
J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. He is currently affiliated with the engineering department of the California Institute of Technology. While his background is impeccable, Dr. Pierce also possesses an engaging writing style that makes his book all the more welcome. An Introduction to Information Theory continues to be the most impressive non-technical account available and a fascinating introduction to the subject for laymen.
"An uncommonly good study. . . . Pierce's volume presents the most satisfying discussion to be found." —Scientific American
Synopsis
Covers encoding and binary digits, entropy, language and meaning, efficient encoding and the noisy channel, and explores ways in which information theory relates to physics, cybernetics, psychology, and art. 1980 edition.
Table of Contents
Preface to the Dover Edition
1. The World and Theories
2. The Origins of Information Theory
3. A Mathematical Model
4. Encoding and Binary Digits
5. Entropy
6. Language and Meaning
7. Efficient Encoding
8. The Noisy Channel
9. Many Dimensions
10. Information Theory and Physics
11. Cybernetics
12. Information Theory and Psychology
13. Information Theory and Art
14. Back to Communication Theory
Appendix: On Mathematical Notation
Glossary
Index