Synopses & Reviews
Communication, one of the most important functions of life, occurs on every spatial scale, from molecules up to populations and ecosystems, and on every time scale, from fast chemical reactions up to geological ages. Information theory, the mathematical science of communication initiated by Shannon in 1948, has been very successful in engineering, but biologists have largely ignored it. This book aims to bridge that gap. It proposes an abstract definition of information, based on engineering experience, which makes the concept usable in the life sciences, and it expounds information theory and its by-product, error-correcting codes, as simply as possible. The fundamental biological problem of heredity is then examined. It is shown that biology does not adequately account for the conservation of genomes over geological ages, which can be understood only if genomes are assumed to be made resilient to chance errors by proper coding. Moreover, the good conservation of very old parts of genomes, such as the HOX genes, implies that the assumed genomic codes have a nested structure: the older a piece of information is, the more resilient to errors it must be. The consequences that information theory draws from these hypotheses match basic yet unexplained biological facts, e.g., the existence of successive generations, the existence of discrete species, and the trend of evolution towards complexity. Being necessarily inscribed on physical media, information appears as a bridge between the abstract and the concrete. Recording, communicating and using information occur exclusively in the living world. Information is thus coextensive with life and delineates the border between the living and the inanimate.
The book begins by expounding information theory and its by-product, error-correcting codes, as simply as possible. It then examines the fundamental biological problem of heredity, showing that only proper coding can explain the resilience of genomes over geological time.
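The engineering idea at the heart of the thesis, that properly organized redundancy makes information resilient to errors, can be illustrated with the simplest textbook error-correcting code: a triple-repetition code with majority-vote decoding. This is a generic sketch over a binary symmetric channel, not an example taken from the book:

```python
import random

def encode(bits):
    """Triple-repetition code: send each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each block of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

def noisy_channel(bits, p, rng):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ 1 if rng.random() < p else b for b in bits]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]
p = 0.05  # raw symbol error rate

# Without coding, roughly p of the bits arrive wrong.
uncoded_errors = sum(a != b for a, b in
                     zip(message, noisy_channel(message, p, rng)))

# With coding, a block is wrong only if 2 or 3 of its copies flip.
decoded = decode(noisy_channel(encode(message), p, rng))
coded_errors = sum(a != b for a, b in zip(message, decoded))

print(uncoded_errors, coded_errors)
```

The residual error rate falls from about p to about 3p², at the price of tripling the message length. That trade, redundancy bought with rate, is exactly what the book argues genomes must also pay for their conservation.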
Table of Contents
Foreword: Information and Life by Gérard Battail (Donald R. Forsdyke)
Author's Foreword
1 Introduction

Part I: Information as a scientific entity
2 What is information?
  2.1 Information in a usual meaning
  2.2 Features of information as a scientific entity
  2.3 Comments on the definitions of information
  2.4 An information as a nominable entity
    2.4.1 Naming and counting
    2.4.2 Defining and representing natural integers
    2.4.3 Concept of nominable entity
    2.4.4 Representatives of nominable entities need to be protected
  2.5 Short history of communication engineering
  2.6 Communication over space or over time
3 Basic principles of communication engineering
  3.1 Physical inscription of a single symbol
  3.2 Physical inscription of a sequence
    3.2.1 Symbols and sequences
    3.2.2 Representing a sequence of symbols by a sequence of signals
  3.3 Receiving a binary symbol in the presence of noise
  3.4 Communicating sequences in the presence of noise: channel coding
    3.4.1 Channel coding is needed
    3.4.2 Redundancy enables channel coding
4 Information theory as the science of literal communication
  4.1 Shannon's paradigm and its variants
    4.1.1 Basic paradigm
    4.1.2 Variants of Shannon's paradigm
    4.1.3 Functions and limits of the coding processes
  4.2 Quantitative measures of information
    4.2.1 Principle of information measurement
    4.2.2 Proper and mutual information
    4.2.3 Entropy and average mutual information
    4.2.4 Properties of entropy and of the mean mutual information
    4.2.5 Information rates; extension of a source
    4.2.6 Cross-entropy
    4.2.7 Comments on the measurement of information
  4.3 Source coding
    4.3.1 Source models
    4.3.2 Representation of a code by a tree; Kraft inequality
    4.3.3 Fundamental theorem of source coding
    4.3.4 Source coding by the Huffman algorithm
    4.3.5 Some comments about source coding
5 Channel capacity and channel coding
  5.1 Channel models
  5.2 Capacity of a channel
    5.2.1 Defining the capacity of a channel
    5.2.2 Capacity of simple discrete input channels
    5.2.3 Capacity of the additive white Gaussian noise channel
    5.2.4 Kolmogorov's ε-entropy
  5.3 Channel coding needs redundancy
  5.4 On the fundamental theorem of channel coding
    5.4.1 A geometrical interpretation of channel coding
    5.4.2 Random coding, its geometrical interpretation
    5.4.3 Random coding for the binary erasure channel
    5.4.4 Largest minimum distance of error-correcting codes
    5.4.5 General case: Feinstein's lemma
  5.5 Error-correcting codes
    5.5.1 Defining an error-correcting code
    5.5.2 Using error-correcting codes: decoding and regeneration
    5.5.3 Designing error-correcting codes
    5.5.4 Recursive convolutional codes
    5.5.5 Turbocodes
    5.5.6 Low-density parity-check codes
    5.5.7 Decoding random-like codes: principles
    5.5.8 Decoding an LDPC code
    5.5.9 Decoding a turbocode
    5.5.10 Variants and comments
    5.5.11 Error-correcting codes defined by non-mathematical constraints: soft codes
6 Information as a fundamental entity
  6.1 Algorithmic information theory
  6.2 Emergent information in populations
  6.3 Physical entropy and information
    6.3.1 Thermodynamics and physical entropy
    6.3.2 Boltzmann constant as a signal-to-noise ratio
    6.3.3 Exorcizing Laplace's demon
    6.3.4 Information is not a physical entity
  6.4 Information bridges the abstract and the concrete

Part II: Information is coextensive with life
7 An introduction to the second part
  7.1 Relationship with biosemiotics
  7.2 Content and spirit of the second part
8 Heredity as a communication problem
  8.1 The enduring genome
    8.1.1 A blatant contradiction
    8.1.2 An upper bound on the DNA channel capacity
    8.1.3 Main hypothesis: genomic error-correcting codes must exist
    8.1.4 Subsidiary hypothesis: nested codes
  8.2 Consequences meet biological reality
    8.2.1 Genomes are redundant
    8.2.2 Discrete species exist with a hierarchical taxonomy
    8.2.3 Nature proceeds with successive generations
    8.2.4 Evolution is contingent and saltationist
    8.2.5 Evolution trends towards increasing complexity
    8.2.6 Some comments about the consequences of the hypotheses
  8.3 A toy living world
    8.3.1 A toy living world in order to mimic the real world
    8.3.2 Permanence of a 'genome'
    8.3.3 Populations of individuals within species
    8.3.4 An illustrative simulation
    8.3.5 Natural selection in the toy living world
  8.4 Identifying genomic error-correcting codes
9 Information is specific to life
  9.1 Information and life are indissolubly linked
  9.2 Semantic feedback loops
    9.2.1 Semantic feedback loops and genetic mapping
    9.2.2 Semantic feedbacks implement Barbieri's organic codes
    9.2.3 Semantic feedback loops are compatible with evolution
    9.2.4 Conjecture about the origin of semantic feedback loops
  9.3 Information as a fundamental entity
    9.3.1 Information is an abstract entity
    9.3.2 On the epistemological status of information
  9.4 Nature as an engineer
10 Life within the physical world
  10.1 A poorly understood divide
  10.2 Maxwell's demon in physics and in life
  10.3 A measurement as a means for acquiring information
11 Conclusion

Appendix A: Tribute to Shannon
  A.1 Introduction
  A.2 His life
  A.3 His work: information theory
  A.4 Shannon's influence
  A.5 Shannon's legacy
Appendix B: Some comments about mathematics
  B.1 Physical world and mathematics
  B.2 On numbers
  B.3 Definitions and notations in the book
    B.3.1 Exponentials and logarithms
    B.3.2 Representing symbols and sequences
    B.3.3 Probabilities
Appendix C: A short glossary of molecular genetics
Index
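As a taste of the quantitative measures listed under Chapter 4, the entropy of a discrete memoryless source follows directly from Shannon's formula H = -Σ pᵢ log₂ pᵢ. The sketch below is a generic illustration with names of my own choosing, not code from the book:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair binary source carries the full 1 bit per symbol...
print(entropy([0.5, 0.5]))   # 1.0
# ...while a biased one carries less, leaving room for source coding.
print(entropy([0.9, 0.1]))   # about 0.469
```

The gap between the symbol count and the entropy is redundancy, which source coding removes and channel coding deliberately reintroduces, the tension the book's genomic hypotheses turn on.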