Synopses & Reviews
The second edition of this book is unique in that it focuses on methods for making formal statistical inference from all the models in an a priori set (multimodel inference). A philosophy is presented for model-based data analysis, and a general strategy is outlined for the analysis of empirical data. The book invites increased attention to a priori science hypotheses and modeling. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood can be bias-corrected as an estimator of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions. These methods are relatively simple and easy to use in practice, yet they rest on deep statistical theory. The information-theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, and an important application of information theory, and they are objective and practical to employ across a very wide class of empirical problems. The book presents several new ways to incorporate model selection uncertainty into parameter estimates and estimates of precision. An array of challenging examples is given to illustrate various technical issues. This is an applied book written primarily for biologists and statisticians wanting to make inferences from multiple models, and it is suitable as a graduate text or as a reference for professional analysts.
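As a rough illustration of the criterion described above, the sketch below computes AIC (here in its common form, AIC = -2 log L + 2K), AIC differences, and Akaike weights for a small candidate set. The model names, log-likelihood values, and parameter counts are hypothetical, invented purely for illustration; they do not come from the book.

```python
import math

# Hypothetical maximized log-likelihoods and parameter counts K for a
# pre-defined set of candidate models (illustrative values only).
models = {
    "M1": {"loglik": -120.4, "k": 3},
    "M2": {"loglik": -118.9, "k": 5},
    "M3": {"loglik": -119.7, "k": 4},
}

# AIC = -2 * (maximized log-likelihood) + 2 * (number of estimated parameters)
aic = {name: -2 * m["loglik"] + 2 * m["k"] for name, m in models.items()}

# AIC differences (delta): distance from the best (smallest-AIC) model
best = min(aic.values())
delta = {name: a - best for name, a in aic.items()}

# Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
# interpretable as the relative support for each model in the set
raw = {name: math.exp(-d / 2) for name, d in delta.items()}
total = sum(raw.values())
weights = {name: r / total for name, r in raw.items()}

for name in models:
    print(f"{name}: AIC={aic[name]:.1f}  delta={delta[name]:.2f}  w={weights[name]:.3f}")
```

The weights sum to one over the model set, which is what allows the ranking and weighting of models that the book builds multimodel inference on.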
Synopsis
We wrote this book to introduce graduate students and research workers in various scientific disciplines to the use of information-theoretic approaches in the analysis of empirical data. These methods allow the data-based selection of a best model and a ranking and weighting of the remaining models in a pre-defined set. Traditional statistical inference can then be based on this selected best model. However, we now emphasize that information-theoretic approaches allow formal inference to be based on more than one model (multimodel inference). Such procedures lead to more robust inferences in many cases, and we advocate these approaches throughout the book.

The second edition was prepared with three goals in mind. First, we have tried to improve the presentation of the material. Boxes now highlight essential expressions and points. Some reorganization has been done to improve the flow of concepts, and a new chapter has been added. Chapters 2 and 4 have been streamlined in view of the detailed theory provided in Chapter 7. Second, concepts related to making formal inferences from more than one model (multimodel inference) have been emphasized throughout the book, but particularly in Chapters 4, 5, and 6. Third, new technical material has been added to Chapters 5 and 6. Well over 100 new references to the technical literature are given. These changes result primarily from our experiences while giving several seminars, workshops, and graduate courses on material in the first edition.
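The multimodel inference the preface describes can be sketched in one common form: a model-averaged estimate, where each model's estimate of a shared parameter is weighted by its Akaike weight. The numbers below are hypothetical, chosen only to make the arithmetic concrete; they are not taken from the book.

```python
# Illustrative model averaging over a candidate set. Each model yields an
# estimate of the same parameter theta; the Akaike weights (assumed already
# computed, summing to 1) measure each model's relative support.
estimates = {"M1": 2.10, "M2": 2.45, "M3": 2.30}  # hypothetical per-model estimates
weights = {"M1": 0.50, "M2": 0.20, "M3": 0.30}    # hypothetical Akaike weights

# Model-averaged estimate: theta_bar = sum_i w_i * theta_i
theta_bar = sum(weights[m] * estimates[m] for m in estimates)
print(f"model-averaged estimate: {theta_bar:.3f}")
```

Basing the estimate on the whole set rather than the single best model is what makes the resulting inference more robust to model selection uncertainty.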
Synopsis
Statisticians and applied scientists often must select a model to fit empirical data. This book introduces researchers and graduate students in many areas to an information criterion approach, first introduced by Hirotugu Akaike in 1973. The book will be of general interest, but the emphasis is on applications to the biological sciences.
Synopsis
Includes bibliographical references (p. [455]-484) and index.
Synopsis
A unique and comprehensive text on the philosophy of model-based data analysis and a general strategy for the analysis of empirical data. The book introduces information-theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. It contains several new approaches to estimating model selection uncertainty and incorporating that uncertainty into estimates of precision. An array of examples is given to illustrate various technical issues. The text has been written for biologists and statisticians using models to make inferences from empirical data.
Table of Contents
* Introduction
* Information and Likelihood Theory: A Basis for Model Selection and Inference
* Basic Use of the Information-Theoretic Approach
* Formal Inference From More Than One Model: Multimodel Inference (MMI)
* Monte Carlo Insights and Extended Examples
* Statistical Theory and Numerical Results
* Summary