Synopses & Reviews
This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts, the book examines techniques for modeling probability density functions and the properties and merits of the multi-layer perceptron and radial basis function network models. Also covered are various forms of error functions, principal algorithms for error function minimization, learning and generalization in neural networks, and Bayesian techniques and their applications. Designed as a text, with over 100 exercises, this fully up-to-date work will benefit anyone involved in the fields of neural computation and pattern recognition.
Review
"Should be in the library of any student, teacher, or researcher with a keen interest in modern statistical methods, a large volume of meaningful data to analyze (including simulations), and a fast workstation with good numerical and graphical capabilities."--Journal of the American Statistical Association
"...should be warmly welcomed by the neural network and pattern recognition communities. Bishop can be recommended to students and engineers in computer science."--Computer Journal
"An excellent and rigorous treatment of a number of neural network architectures."--Journal of Mathematical Psychology
"Its sequential organization and end-of-chapter exercises make it an ideal mental gymnasium. The author has eschewed biological metaphor and sweeping statements in favour of welcome mathematical rigour."--Scientific Computing World
"A first-class book for the researcher in statistical pattern recognition."--Times Higher Education Supplement
"Although there has been a plethora of books on neural networks published in the last five years, none has really addressed the subject with the necessary mathematical rigour. Professor Bishop's book is the first textbook to provide a clear and comprehensive treatment of the mathematical principles underlying the main types of artificial neural networks."--Dr. L. Tarassenko and Professor J.M. Brady, Department of Engineering Science, University of Oxford
"There has been an acute need for authoritative textbooks in neural networks that explain the main ideas clearly and consistently using the basic tools of linear algebra, calculus, and simple probability theory. There have been many attempts to provide such a text, but until now, none has succeeded. This is a serious attempt at providing such an ideal textbook. By concentrating on pattern recognition aspects of neural networks, the author is able to treat many important topics in much greater depth. The most important contribution of the book is the solid statistical pattern recognition approach, a sign of increasing maturity in the field."--Mathematical Reviews
"The following keywords concisely indicate the contents: artificial neural networks, statistical pattern recognition, probability density estimation, single-layer networks, multi-layer perceptron, radial basis functions, error functions, parameter optimization algorithms, Bayesian techniques, etc. The book is aimed at researchers and practitioners. It can also be used as the primary text in a course for graduate students (129 graded exercises!)."--Industrial Mathematics
Description
Includes bibliographical references (p. [457]-475) and index.
Table of Contents
1. Statistical Pattern Recognition
2. Probability Density Estimation
3. Single-Layer Networks
4. The Multi-layer Perceptron
5. Radial Basis Functions
6. Error Functions
7. Parameter Optimization Algorithms
8. Pre-processing and Feature Extraction
9. Learning and Generalization
10. Bayesian Techniques
A. Symmetric Matrices
B. Gaussian Integrals
C. Lagrange Multipliers
D. Calculus of Variations
E. Principal Components
References
Index