Synopses & Reviews
Priced very competitively compared with other textbooks at this level
This gracefully organized textbook reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, numerous figures and tables, and computer simulations to develop and illustrate concepts.
Beginning with an introduction to the basic ideas and techniques in probability theory and progressing to more rigorous topics, Probability and Statistical Inference
studies the Helmert transformation for normal distributions and the waiting time between failures for exponential distributions
develops notions of convergence in probability and distribution
spotlights the central limit theorem (CLT) for the sample variance
introduces sampling distributions and the Cornish-Fisher expansions
concentrates on the fundamentals of sufficiency, information, completeness, and ancillarity
explains Basu's Theorem as well as location, scale, and location-scale families of distributions
covers moment estimators, maximum likelihood estimators (MLE), Rao-Blackwellization, and the Cramér-Rao inequality
discusses uniformly minimum variance unbiased estimators (UMVUE) and Lehmann-Scheffé Theorems
focuses on the Neyman-Pearson theory of most powerful (MP) and uniformly most powerful (UMP) tests of hypotheses, as well as confidence intervals
includes the likelihood ratio (LR) tests for the mean, variance, and correlation coefficient
summarizes Bayesian methods
describes the monotone likelihood ratio (MLR) property
handles variance stabilizing transformations
provides a historical context for statistics and statistical discoveries
showcases great statisticians through biographical notes
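The book develops concepts through computer simulations; as a hedged illustration (not taken from the text itself), a simulation of the kind that underlies one highlighted topic, the central limit theorem for the sample variance, might look like the following sketch. For N(0, 1) data, E[S²] = 1 and Var(S²) = 2/(n − 1), so the standardized statistic √((n − 1)/2)(S² − 1) should be approximately standard normal for large n.

```python
import random
import statistics

def sample_variance_clt(n=200, reps=2000, seed=0):
    """Simulate the CLT for the sample variance S^2 of N(0,1) samples.

    For standard-normal data, E[S^2] = 1 and Var(S^2) = 2/(n-1), so
    sqrt((n-1)/2) * (S^2 - 1) is approximately N(0,1) for large n.
    Returns the empirical mean and standard deviation of the
    standardized statistic over `reps` replications.
    """
    rng = random.Random(seed)
    z = []
    for _ in range(reps):
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
        s2 = statistics.variance(xs)  # unbiased sample variance
        z.append(((n - 1) / 2.0) ** 0.5 * (s2 - 1.0))
    return statistics.mean(z), statistics.stdev(z)

mean_z, sd_z = sample_variance_clt()
print(mean_z, sd_z)  # both should be close to 0 and 1, respectively
```

The function names and parameters here are hypothetical conveniences for the illustration; the book's own simulations may be organized quite differently.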
Employing over 1,400 equations to reinforce its subject matter, Probability and Statistical Inference is a groundbreaking text for first-year graduate and upper-level undergraduate courses in probability and statistical inference whose students have completed a calculus prerequisite, as well as a supplemental text for classes in advanced statistical inference or decision theory.
Synopsis
This textbook for first-year graduate students reveals the theory of probability and statistical inference using worked examples, exercises, and computer simulations. Mukhopadhyay (University of Connecticut) first introduces the basic ideas and techniques in probability theory, then studies more rigorous topics such as the Helmert transformation for normal distributions; convergence in probability and distribution; the central limit theorem for the sample variance; sampling distributions and the Cornish-Fisher expansions; the fundamentals of sufficiency, information, completeness, and ancillarity; Basu's Theorem; maximum likelihood estimators (MLEs); the Neyman-Pearson theory of most powerful (MP) tests; Bayesian methods; and variance stabilizing transformations.
Synopsis
This gracefully organized text presents the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, numerous figures and tables, and computer simulations to develop and illustrate concepts. Beginning with the basic ideas and techniques of probability theory and progressing to more rigorous topics, this treatment covers all of the topics typically addressed in a two-semester course in probability and statistical inference for graduate and upper-level undergraduate courses, including hypothesis testing, Bayesian analysis, and sample-size determination. The author reinforces important ideas and special techniques with drills and boxed summaries.
Table of Contents
1. Notions of probability -- 2. Expectations of functions of random variables -- 3. Multivariate random variables -- 4. Functions of random variables and sampling distribution -- 5. Concepts of stochastic convergence -- 6. Sufficiency, completeness, and ancillarity -- 7. Point estimation -- 8. Tests of hypotheses -- 9. Confidence interval estimation -- 10. Bayesian methods -- 11. Likelihood ratio and other tests -- 12. Large-sample inference -- 13. Sample size determination: two-stage procedures.