Synopses & Reviews
Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering—uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity: a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data. Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms.
This book presents an in-depth exploration of issues related to learning within the graphical model formalism. Four chapters are tutorial chapters—Robert Cowell on Inference for Bayesian Networks, David MacKay on Monte Carlo Methods, Michael I. Jordan et al. on Variational Methods, and David Heckerman on Learning with Bayesian Networks. The remaining chapters cover a wide range of topics of current research interest.
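As a toy illustration of the modularity described above (not drawn from the book itself), consider a three-variable Bayesian network A → B → C: the joint distribution is assembled from simple local conditional tables, and probability theory guarantees the combined system is a consistent distribution. All names and numbers below are invented for the sketch.

```python
# A minimal sketch of a Bayesian network A -> B -> C.
# Each node carries only a local table; the joint is their product.

p_a = {True: 0.3, False: 0.7}                      # P(A)
p_b_given_a = {True: {True: 0.9, False: 0.1},      # P(B | A)
               False: {True: 0.2, False: 0.8}}
p_c_given_b = {True: {True: 0.5, False: 0.5},      # P(C | B)
               False: {True: 0.1, False: 0.9}}

def joint(a, b, c):
    """P(A=a, B=b, C=c) = P(A) * P(B|A) * P(C|B): the modular factorization."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Because each local table is a valid conditional distribution,
# the assembled joint automatically sums to 1 over all states.
total = sum(joint(a, b, c)
            for a in (True, False)
            for b in (True, False)
            for c in (True, False))
```

The point of the sketch is that no global 8-entry table is ever written down: consistency of the whole follows from consistency of the parts, which is what makes efficient general-purpose inference algorithms possible.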
Review
"The state of the art presented by the experts in the field." Ross D. Shachter, Department of Engineering-Economic Systems and Operations Research, Stanford University. The MIT Press
Review
This book deals with an area that is central to modern statistical science and that has also attracted the interest of outstanding researchers beyond the statistical mainstream, from computer science and neural computing. The book gives a vital and timely overview of current work at this interface, described by contributors representing the complete spectrum of backgrounds.
Review
Learning in Graphical Models is the product of a mutually exciting interaction between ideas, insights, and techniques drawn from the fields of statistics, computer science, and physics. With its authoritative tutorial papers and specialist articles by leading researchers, this collection provides an indispensable guide to a rapidly expanding subject. Michael Titterington, Professor of Statistics, University of Glasgow
About the Author
Michael I. Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley, and recipient of the ACM/AAAI Allen Newell Award.