Synopses & Reviews
This book provides a compact, self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers with a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models, and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice.

Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.

Peter Hoff is an Associate Professor of Statistics and Biostatistics at the University of Washington. He has developed a variety of Bayesian methods for multivariate data, including covariance and copula estimation, cluster analysis, mixture modeling and social network analysis. He is on the editorial board of the Annals of Applied Statistics.
Review
From the reviews:

"This is an excellent book for its intended audience: statisticians who wish to learn Bayesian methods. Although designed for a statistics audience, it would also be a good book for econometricians who have been trained in frequentist methods but wish to learn Bayes. In relatively few pages, it takes the reader through a vast amount of material, beginning with deep issues in statistical methodology such as de Finetti's theorem, through the nitty-gritty of Bayesian computation, to sophisticated models such as generalized linear mixed effects models and copulas. And it does so in a simple manner, always drawing parallels and contrasts between Bayesian and frequentist methods, so as to allow the reader to see the similarities and differences with clarity." (Econometrics Journal)

"Generally, I think this is an excellent choice for a text for a one-semester Bayesian course. It provides a good overview of the basic tenets of Bayesian thinking for the common one- and two-parameter distributions and gives introductions to Bayesian regression, multivariate-response modeling, hierarchical modeling, and mixed effects models. The book includes an ample collection of exercises for all the chapters. A strength of the book is its good discussion of Gibbs sampling and Metropolis-Hastings algorithms. The author goes beyond a description of the MCMC algorithms and also provides insight into why the algorithms work. ... I believe this text would be an excellent choice for my Bayesian class, since it seems to cover a good number of introductory topics and give the student a good introduction to the modern computational tools for Bayesian inference, with illustrations using R."
Synopsis
A self-contained introduction to probability, exchangeability and Bayes' rule provides a theoretical understanding of the applied material. Numerous examples with R code that can be run "as is" allow readers to perform the data analyses themselves. The development of Monte Carlo and Markov chain Monte Carlo methods in the context of data analysis examples provides motivation for these computational methods.
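The Monte Carlo approach described above amounts to simulating draws from a posterior distribution and summarizing them numerically. The book's own examples use R; the following is an analogous sketch in Python, using a hypothetical Beta-binomial setup chosen for illustration (the prior, data, and all numbers are assumptions, not taken from the text):

```python
import numpy as np

# Hypothetical example: posterior for a binomial proportion theta.
# With a Beta(1, 1) prior and y = 12 successes in n = 20 trials,
# the posterior is Beta(1 + y, 1 + n - y) = Beta(13, 9).
rng = np.random.default_rng(1)
theta = rng.beta(13, 9, size=100_000)  # Monte Carlo samples from the posterior

# Monte Carlo summaries of the posterior distribution:
post_mean = theta.mean()                            # approximates 13/22
post_interval = np.quantile(theta, [0.025, 0.975])  # 95% posterior interval
print(post_mean, post_interval)
```

With this many samples, the Monte Carlo estimate of the posterior mean agrees with the analytic value 13/22 to about two decimal places, which is the kind of simulation-based summary the text develops alongside its data analysis examples.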
Synopsis
This compact, self-contained introduction to the theory and application of Bayesian statistical methods is accessible to those with a basic familiarity with probability, yet allows advanced readers to grasp the principles underlying Bayesian theory and methods.
Table of Contents
1. Introduction and examples
2. Belief, probability and exchangeability
3. One-parameter models
4. Monte Carlo approximation
5. The normal model
6. Posterior approximation with the Gibbs sampler
7. The multivariate normal model
8. Group comparisons and hierarchical modeling
9. Linear regression
10. Nonconjugate priors and the Metropolis-Hastings algorithm
11. Linear and generalized linear mixed effects models
12. Latent variable methods for ordinal data