Synopses & Reviews
This collection of articles responds to the urgent need for timely and comprehensive reviews in a multidisciplinary, rapidly developing field of research. The book starts out with an extensive introduction to the ideas used in the subsequent chapters, which are all centered around the theme of collective phenomena in neural networks: dynamics and storage capacity of networks of formal neurons with symmetric or asymmetric couplings, learning algorithms, temporal association, structured data (software), and structured nets (hardware). The style and level of this book make it particularly useful for advanced students and researchers looking for an accessible survey of today's theory of neural networks.
Synopsis
One of the great intellectual challenges for the next few decades is the question of brain organization. What is the basic mechanism for storage of memory? What are the processes that serve as the interphase between the basically chemical processes of the body and the very specific and nonstatistical operations in the brain? Above all, how is concept formation achieved in the human brain? I wonder whether the spirit of the physics that will be involved in these studies will not be akin to that which moved the founders of the rational foundation of thermodynamics.
C. N. Yang

The human brain is said to have roughly 10^10 neurons connected through about 10^14 synapses. Each neuron is itself a complex device which compares and integrates incoming electrical signals and relays a nonlinear response to other neurons. The brain certainly exceeds in complexity any system that physicists have studied in the past. Nevertheless, there are many analogies between the brain and simpler physical systems, and during the last decade we have witnessed some surprising contributions of physics to the study of the brain. The most significant parallel between biological brains and many physical systems is that both are made of many tightly interacting components.
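To make this parallel concrete, here is a minimal sketch in Python of the kind of formal-neuron model the book analyzes, assuming Hopfield-style binary units with symmetric Hebbian couplings; the network size, number of patterns, and function names are illustrative assumptions, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # formal neurons with states +/-1 (far fewer than the brain's ~10^10)
P = 5    # patterns to store

# Random binary patterns to memorize.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian learning rule: symmetric couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu,
# with self-couplings removed.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def update(state, J, sweeps=10):
    """Asynchronous dynamics: each neuron integrates its weighted inputs
    and relays a nonlinear (sign) response to the rest of the network."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(state)):
            h = J[i] @ state                 # local field: weighted sum of inputs
            state[i] = 1 if h >= 0 else -1   # nonlinear threshold response
    return state

# Retrieval: start from a corrupted version of pattern 0 and let the dynamics relax.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)  # corrupt 20% of the bits
cue[flip] *= -1

recalled = update(cue, J)
overlap = recalled @ patterns[0] / N  # overlap of 1.0 means perfect retrieval
print(f"overlap with stored pattern: {overlap:.2f}")
```

Because the couplings in this sketch are symmetric, the dynamics can only decrease an energy function, so the corrupted cue relaxes toward the stored pattern; asymmetric couplings, also among the themes listed above, give rise to the richer temporal behaviour such as temporal association.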