Synopses & Reviews
This book is an introduction to information and coding theory at the graduate or advanced undergraduate level. It assumes a basic knowledge of probability and modern algebra, but is otherwise self-contained. The intent is to describe as clearly as possible the fundamental issues involved in these subjects, rather than covering all aspects in an encyclopedic fashion. The first quarter of the book is devoted to information theory, including a proof of Shannon's famous Noisy Coding Theorem. The remainder of the book is devoted to coding theory and is independent of the information theory portion of the book. After a brief discussion of general families of codes, the author discusses linear codes (including the Hamming, Golay, and Reed-Muller codes), finite fields, and cyclic codes (including the BCH, Reed-Solomon, Justesen, Goppa, and Quadratic Residue codes). An appendix reviews relevant topics from modern algebra.
Synopsis
An introduction to information and coding theory at the graduate or advanced undergraduate level. It assumes a basic knowledge of probability and modern algebra, but is otherwise self-contained. The first quarter of the text is devoted to information theory; the remainder is devoted to coding theory.
Table of Contents
1: Entropy. 2: Noiseless Coding. 3: Noisy Coding. 4: General Remarks on Codes. 5: Linear Codes. 6: Some Linear Codes. 7: Finite Fields and Cyclic Codes. 8: Some Cyclic Codes.