Synopses & Reviews
The effort to build machines that can learn and undertake tasks such as data mining, image processing, and pattern recognition has led to the development of artificial neural networks, in which learning from examples may be described and understood. The contribution made to this subject over the past decade by researchers applying the techniques of statistical mechanics is the subject of this book. The authors provide a coherent account of various important concepts and techniques that are currently found only scattered in papers, supplement this with background material in mathematics and physics, and include many examples and exercises.
Review
"...they give an exceptionally lucid account not only of what we have learned but also of how the calculations are done... Given the highly technical nature of the calculations, the presentation is miraculously clear, even elegant. Although I have worked on these problems myself, I found, in reading the chapters, that I kept getting new insights... I highly recommend this book as a way to learn what statistical mechanics can say about an important basic problem." Physics Today
Synopsis
A textbook and reference on artificial neural networks, learning, and statistical mechanics, with background material in mathematics and physics, and many examples and exercises.
Synopsis
Learning is a natural activity, and it has always been a challenge for us to understand the process. Artificial neural networks provide a simple framework in which learning from examples may be described and understood. The authors provide a coherent account of various important concepts and techniques of statistical mechanics and their application to learning theory, supplement this with background material in mathematics and physics, and include many examples and exercises, making a book that can be used in courses, for self-study, or as a handy reference.
Table of Contents
1. Getting started
2. Perceptron learning - basics
3. A choice of learning rules
4. Augmented statistical mechanics formulation
5. Noisy teachers
6. The storage problem
7. Discontinuous learning
8. Unsupervised learning
9. On-line learning
10. Making contact with statistics
11. A bird's eye view: multifractals
12. Multilayer networks
13. On-line learning in multilayer networks
14. What else?
Appendix A. Basic mathematics
Appendix B. The Gardner analysis
Appendix C. Convergence of the perceptron rule
Appendix D. Stability of the replica symmetric saddle point
Appendix E. 1-step replica symmetry breaking
Appendix F. The cavity approach
Appendix G. The VC-theorem