
Learning in Graphical Models (Adaptive Computation and Machine Learning)

Edited by Michael I. Jordan


Synopses & Reviews

Publisher Comments:

Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering—uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity: a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data. Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms.
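To make the modularity idea concrete: rather than storing one table over every variable jointly, a graphical model stores a small conditional probability table per node and recovers the joint distribution as a product of those local pieces. The sketch below is not taken from the book; the three-node network, its variable names, and its probability tables are invented purely for illustration. It shows the factorization in Python, with inference done by brute-force enumeration.

```python
# Minimal Bayesian network sketch: Cloudy -> Rain -> WetGrass.
# Each node carries only a local conditional probability table (CPT);
# the joint distribution is the product of the local tables (modularity).
# All numbers here are made up for illustration.

from itertools import product

p_cloudy = {True: 0.5, False: 0.5}                # P(Cloudy)
p_rain = {True: {True: 0.8, False: 0.2},          # P(Rain | Cloudy)
          False: {True: 0.1, False: 0.9}}
p_wet = {True: {True: 0.9, False: 0.1},           # P(WetGrass | Rain)
         False: {True: 0.05, False: 0.95}}

def joint(cloudy, rain, wet):
    """Joint probability as a product of the local factors."""
    return p_cloudy[cloudy] * p_rain[cloudy][rain] * p_wet[rain][wet]

# Inference by brute-force enumeration: P(Rain = True | WetGrass = True).
num = sum(joint(c, True, True) for c in (True, False))
den = sum(joint(c, r, True) for c, r in product((True, False), repeat=2))
print(f"P(Rain | WetGrass) = {num / den:.3f}")
```

Enumeration like this grows exponentially with the number of variables, which is exactly why the structured inference and approximation techniques surveyed in the book matter for larger models.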

This book presents an in-depth exploration of issues related to learning within the graphical model formalism. Four chapters are tutorial chapters—Robert Cowell on Inference for Bayesian Networks, David MacKay on Monte Carlo Methods, Michael I. Jordan et al. on Variational Methods, and David Heckerman on Learning with Bayesian Networks. The remaining chapters cover a wide range of topics of current research interest.
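As a rough taste of one of the tutorial topics (Monte Carlo methods), the same query can be approximated by sampling: draw joint samples from the model, discard those that contradict the evidence, and average over the rest. The snippet continues the toy network sketched above and is again only an illustration under invented numbers, not material from the book.

```python
import random

# Continues the toy network above (reuses p_cloudy, p_rain, p_wet).

def sample_network():
    """Draw one joint sample by sampling each node given its parents."""
    cloudy = random.random() < p_cloudy[True]
    rain = random.random() < p_rain[cloudy][True]
    wet = random.random() < p_wet[rain][True]
    return cloudy, rain, wet

def estimate_rain_given_wet(n_samples=100_000):
    """Rejection sampling: keep only samples consistent with WetGrass = True."""
    kept = rainy = 0
    for _ in range(n_samples):
        _, rain, wet = sample_network()
        if wet:
            kept += 1
            rainy += rain
    return rainy / kept

print(f"Monte Carlo estimate of P(Rain | WetGrass) ~ {estimate_rain_given_wet():.3f}")
```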


About the Author

Michael I. Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley, and recipient of the ACM/AAAI Allen Newell Award.

Product Details

ISBN: 9780262600323
Editor: Jordan, Michael Irwin
Author: Jordan, Michael I.
Publisher: A Bradford Book
Location: Cambridge, Mass.
Subject: Statistics
Subject: Applied
Subject: Graphic Methods
Subject: Probability
Subject: Artificial Intelligence
Subject: Graphical modeling (Statistics)
Subject: Artificial Intelligence - General
Subject: Intelligence (AI) & Semantics
Subject: Mathematics - Applied
Edition Description: Trade paper
Series: Adaptive Computation and Machine Learning
Series Volume: DOER-1
Publication Date: January 20, 1999
Binding: Paperback
Grade Level: from 17
Language: English
Illustrations: Yes
Pages: 644
Dimensions: 10 x 7 in

Other books you might like

  1. Causation Prediction & Search 2ND... New Hardcover $79.75
  2. Designs and Their Codes New Trade Paper $77.50
  3. A World Destroyed: Hiroshima and Its... Used Trade Paper $13.95
  4. Reinforcement Learning: An... Used Hardcover $51.50
  5. Artificial Intelligence: A New Synthesis Used Hardcover $78.00
  6. The Wizards of Armageddon (Stanford... Used Trade Paper $24.00

Related Subjects

Arts and Entertainment » Architecture » General
Computers and Internet » Artificial Intelligence » General
Computers and Internet » Computers Reference » General
Humanities » Philosophy » General
Reference » Science Reference » Philosophy of Science
Science and Mathematics » Mathematics » Applied
Science and Mathematics » Mathematics » General
Science and Mathematics » Mathematics » Probability and Statistics » Probability Theory

Learning in Graphical Models (Adaptive Computation and Machine Learning)
New Trade Paper, $92.50 (Backorder)
644 pages, MIT Press, English, ISBN 9780262600323
"Synopsis" by , Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering — uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity: a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data. Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms.

This book presents an in-depth exploration of issues related to learning within the graphical model formalism. Four chapters are tutorial chapters — Robert Cowell on Inference for Bayesian Networks, David MacKay on Monte Carlo Methods, Michael I. Jordan et al. on Variational Methods, and David Heckerman on Learning with Bayesian Networks. The remaining chapters cover a wide range of topics of current research interest.
