Synopses & Reviews
In this lively look at both subjects, David Williams convinces Mathematics students of the intrinsic interest of Statistics and Probability, and Statistics students that the language of Mathematics can bring real insight and clarity to their subject. He helps students build the intuition needed, in a presentation enriched with examples drawn from all manner of applications. Statistics chapters present both the Frequentist and Bayesian approaches, emphasizing Confidence Intervals rather than Hypothesis Tests, and include Gibbs-sampling techniques for the practical implementation of Bayesian methods. A central chapter gives the theory of Linear Regression and ANOVA, and explains how MCMC methods allow greater flexibility in modeling. C or WinBUGS code is provided for computational examples and simulations.
Review
"This book presents most of the topics usually found in a full-year sequence on probability and mathematical statistics." The American Statistician

"This well-written, interesting, and very useful book provides a lively look at the interactions between mathematics, statistics, and probability... An excellent book. Highly recommended." CHOICE
Synopsis
Statistics do not lie, nor is probability paradoxical. You just have to have the right intuition. In this lively look at both subjects, David Williams convinces mathematics students of the intrinsic interest of statistics and probability, and statistics students that the language of mathematics can bring real insight and clarity to their subject. The presentation is enriched with examples drawn from all manner of applications. Statistics chapters present both the Frequentist and Bayesian approaches, emphasizing Confidence Intervals rather than Hypothesis Tests. C or WinBUGS code is provided for computational examples and simulations. Many exercises are included; hints or solutions are often provided.
Description
Includes bibliographical references (p. 525-538) and index.
Table of Contents
Preface; 1. Introduction; 2. Events and probabilities; 3. Random variables, means and variances; 4. Conditioning and independence; 5. Generating functions and the central limit theorem; 6. Confidence intervals for 1-parameter models; 7. Conditional pdfs and multi-parameter Bayesian statistics; 8. Linear models, ANOVA, etc.; 9. Some further probability; 10. Quantum probability and quantum computing; Appendix A. Some prerequisites and addenda; Appendix B. Discussion of some selected exercises; Appendix C. Tables; Appendix D. A small sample of the literature; Bibliography; Index.