A Matrix Handbook for Statisticians (Wiley Series in Probability and Statistics) by George A. F. Seber
Synopses & Reviews
A comprehensive, must-have handbook of matrix methods with a unique emphasis on statistical applications
This timely book, A Matrix Handbook for Statisticians, provides a comprehensive, encyclopedic treatment of matrices as they relate to both statistical concepts and methodologies. Written by an experienced authority on matrices and statistical theory, the handbook is organized by topic rather than by mathematical development and includes numerous references to both the theory behind the methods and their applications. Each chapter follows a uniform format in four parts: a definition followed by a list of results; a short list of references to related topics in the book; one or more references to proofs; and references to applications. Extensive cross-referencing to topics within the book, together with external referencing to proofs, makes definitions easy to locate and interrelationships among subject areas easy to recognize.
A Matrix Handbook for Statisticians addresses the need for matrix theory topics to be presented together in one book and features a collection of topics not found elsewhere under one cover. These topics include:
Additional topics, such as rank, eigenvalues, determinants, norms, generalized inverses, linear and quadratic equations, differentiation, and Jacobians, are also included. The book assumes a fundamental knowledge of vectors and matrices, maintains a reasonable level of abstraction when appropriate, and provides a comprehensive compendium of linear algebra results with use or potential use in statistics. A Matrix Handbook for Statisticians is an essential, one-of-a-kind book for graduate-level courses in advanced statistical studies including linear and nonlinear models, multivariate analysis, and statistical computing. It also serves as an excellent self-study guide for statistical researchers.
Book News Annotation:
Seber (statistics emeritus, U. of Auckland) takes a topical rather than a mathematical-development approach and includes many references to both the theory behind the methods and their applications. Seber covers vectors, vector spaces, convexity, rank, matrix functions (inverse, transpose, trace, determinant, and norm), related matrices (including complex and Hermitian), eigenvalues, eigenvectors, singular values, generalized inverses, special matrices, non-negative vectors and matrices, positive definite and non-negative definite matrices, special products and operators, inequalities, linear equations, partitioned matrices, patterned matrices, factorization of matrices, differentiation, Jacobians, matrix limits (including sequences and series), random vectors and matrices, inequalities for probabilities and random variables, majorization, optimization and matrix approximation. Each chapter includes a definition followed by a list of results, a short list of references to related topics in the volume, one or more references to proofs, and references to applications. The result is a well-balanced text that also serves as a professional reference. Annotation ©2008 Book News, Inc., Portland, OR (booknews.com)
About the Author
George A. F. Seber, PhD, is Emeritus Professor in the Department of Statistics at The University of Auckland in New Zealand. A Fellow of the New Zealand Royal Society, he is the author or coauthor of several books, including Nonlinear Regression, Multivariate Observations, Adaptive Sampling, Chance Encounters, and Linear Regression Analysis, Second Edition, all published by Wiley. Dr. Seber's research interests have included statistical ecology, genetics, epidemiology, and adaptive sampling.
Table of Contents
1.1 General Definitions.
1.2 Some Continuous Univariate Distributions.
1.3 Glossary of Notation.
2. Vectors, Vector Spaces, and Convexity.
2.1 Vector Spaces.
2.1.2 Quadratic Subspaces.
2.1.3 Sums and Intersections of Subspaces.
2.1.4 Span and Basis.
2.2 Inner Products.
2.2.1 Definition and Properties.
2.2.4 Column and Null Spaces.
2.3.1 General Projections.
2.3.2 Orthogonal Projections.
2.4 Metric Spaces.
2.5 Convex Sets and Functions.
2.6 Coordinate Geometry.
2.6.1 Hyperplanes and Lines.
2.6.3 Miscellaneous Results.
3. Rank.
3.1 Some General Properties.
3.2 Matrix Products.
3.3 Matrix Cancellation Rules.
3.4 Matrix Sums.
3.5 Matrix Differences.
3.6 Partitioned Matrices.
3.7 Maximal and Minimal Ranks.
3.8 Matrix Index.
4. Matrix Functions: Inverse, Transpose, Trace, Determinant, and Norm.
4.4.2 Adjoint Matrix.
4.4.3 Compound Matrix.
4.4.4 Expansion of a Determinant.
4.6.1 Vector Norms.
4.6.2 Matrix Norms.
4.6.3 Unitarily Invariant Norms.
4.6.4 M,N-Invariant Norms.
4.6.5 Computational Accuracy.
5. Complex, Hermitian, and Related Matrices.
5.1 Complex Matrices.
5.1.1 Some General Results.
5.2 Hermitian Matrices.
5.3 Skew-Hermitian Matrices.
5.4 Complex Symmetric Matrices.
5.5 Real Skew-Symmetric Matrices.
5.6 Normal Matrices.
6. Eigenvalues, Eigenvectors, and Singular Values.
6.1 Introduction and Definitions.
6.1.1 Characteristic Polynomial.
6.1.3 Singular Values.
6.1.4 Functions of a Matrix.
6.1.6 Hermitian Matrices.
6.1.7 Computational Methods.
6.1.8 Generalized Eigenvalues.
6.1.9 Matrix Products.
6.2 Variational Characteristics for Hermitian Matrices.
6.3 Separation Theorems.
6.4 Inequalities for Matrix Sums.
6.5 Inequalities for Matrix Differences.
6.6 Inequalities for Matrix Products.
6.7 Antieigenvalues and Antieigenvectors.
7. Generalized Inverses.
7.2 Weak Inverses.
7.2.1 General Properties.
7.2.3 Sums and Differences.
7.2.4 Real Symmetric Matrices.
7.2.5 Decomposition Methods.
7.3 Other Inverses.
7.3.1 Reflexive (g12) Inverse.
7.3.2 Minimum Norm (g14) Inverse.
7.3.3 Minimum Norm Reflexive (g124) Inverse.
7.3.4 Least Squares (g13) Inverse.
7.3.5 Least Squares Reflexive (g123) Inverse.
7.4 Moore-Penrose (g1234) Inverse.
7.4.1 General Properties.
7.5 Group Inverse.
7.6 Some General Properties of Inverses.
8. Some Special Matrices.
8.1 Orthogonal and Unitary Matrices.
8.2 Permutation Matrices.
8.3 Circulant, Toeplitz, and Related Matrices.
8.3.1 Regular Circulant.
8.3.2 Symmetric Regular Circulant.
8.3.3 Symmetric Circulant.
8.3.4 Toeplitz Matrix.
8.3.5 Persymmetric Matrix.
8.3.6 Cross-Symmetric (Centrosymmetric) Matrix.
8.3.7 Block Circulant.
8.3.8 Hankel Matrix.
8.4 Diagonally Dominant Matrices.
8.5 Hadamard Matrices.
8.6 Idempotent Matrices.
8.6.1 General Properties.
8.6.2 Sums of Idempotent Matrices and Extensions.
8.6.3 Products of Idempotent Matrices.
8.7 Tripotent Matrices.
8.8 Irreducible Matrices.
8.9 Triangular Matrices.
8.10 Hessenberg Matrices.
8.11 Tridiagonal Matrices.
8.12 Vandermonde and Fourier Matrices.
8.12.1 Vandermonde Matrix.
8.12.2 Fourier Matrix.
8.13 Zero-One (0,1) Matrices.
8.14 Some Miscellaneous Matrices and Arrays.
8.14.1 Krylov Matrix.
8.14.2 Nilpotent and Unipotent Matrices.
8.14.3 Payoff Matrix.
8.14.4 Stable and Positive Stable Matrices.
8.14.6 Z- and M-Matrices.
8.14.7 Three-Dimensional Arrays.
9. Non-Negative Vectors and Matrices.
9.1.2 Modulus of a Matrix.
9.2 Spectral Radius.
9.2.1 General Properties.
9.2.2 Dominant Eigenvalue.
9.3 Canonical Form of a Non-negative Matrix.
9.4 Irreducible Matrices.
9.4.1 Irreducible Non-negative Matrix.
9.4.3 Non-negative and Non-positive Off-Diagonal Elements.
9.4.4 Perron Matrix.
9.4.5 Decomposable Matrix.
9.5 Leslie Matrix.
9.6 Stochastic Matrices.
9.6.1 Basic Properties.
9.6.2 Finite Homogeneous Markov Chain.
9.6.3 Countably Infinite Stochastic Matrix.
9.6.4 Infinite Irreducible Stochastic Matrix.
9.7 Doubly Stochastic Matrices.
10. Positive Definite and Non-negative Definite Matrices.
10.2 Non-negative Definite Matrices.
10.2.1 Some General Properties.
10.2.2 Gram Matrix.
10.2.3 Doubly Non-negative Matrix.
10.3 Positive Definite Matrices.
10.4 Pairs of Matrices.
10.4.1 Non-Negative or Positive Definite Difference.
10.4.2 One or More Non-Negative Definite Matrices.
11. Special Products and Operators.
11.1 Kronecker Product.
11.1.1 Two Matrices.
11.1.2 More Than Two Matrices.
11.2 Vec Operator.
11.3 Vec-Permutation (Commutation) Matrix.
11.4 Generalized Vec-Permutation Matrix.
11.5 Vech Operator.
11.5.1 Symmetric Matrix.
11.5.2 Lower Triangular Matrix.
11.6 Star Operator.
11.7 Hadamard Product.
11.8 Rao-Khatri Product.
12. Inequalities.
12.1 Cauchy-Schwarz Inequalities.
12.1.1 Real Vector Inequalities and Extensions.
12.1.2 Complex Vector Inequalities.
12.1.3 Real Matrix Inequalities.
12.1.4 Complex Matrix Inequalities.
12.2 Hölder's Inequality and Extensions.
12.3 Minkowski's Inequality and Extensions.
12.4 Weighted Means.
12.5 Quasilinearization (Representation Theorems).
12.6 Some Geometrical Properties.
12.7 Miscellaneous Inequalities.
12.7.4 Sums and Products.
12.8 Some Identities.
13. Linear Equations.
13.1 Unknown Vector.
13.1.3 Homogeneous Equations.
13.1.4 Restricted Equations.
13.2 Unknown Matrix.
13.2.2 Some Special Cases.
14. Partitioned Matrices.
14.1 Schur Complement.
14.4 Positive and Non-Negative Definite Matrices.
14.6 Generalized Inverses.
14.6.1 Weak Inverses.
14.6.2 Moore-Penrose Inverses.
14.7 Miscellaneous Partitions.
15. Patterned Matrices.
15.4 Matrices With Repeated Elements and Blocks.
15.5 Generalized Inverses.
15.5.1 Weak Inverses.
15.5.2 Moore-Penrose Inverses.
16. Factorization of Matrices.
16.1 Similarity Reductions.
16.2 Reduction by Elementary Transformations.
16.2.1 Types of Transformation.
16.2.2 Equivalence Relation.
16.2.3 Echelon Form.
16.2.4 Hermite Form.
16.3 Singular Value Decomposition (SVD).
16.4 Triangular Factorizations.
16.5 Orthogonal-Triangular Reductions.
16.6 Further Diagonal or Tridiagonal Reductions.
16.8 Simultaneous Reductions.
16.9 Polar Decomposition.
16.10 Miscellaneous Factorizations.
17. Differentiation and Finite Differences.
17.2 Scalar Differentiation.
17.2.1 Differentiation with Respect to t.
17.2.2 Differentiation With Respect to a Vector Element.
17.2.3 Differentiation With Respect to a Matrix Element.
17.3 Vector Differentiation: Scalar Function.
17.3.1 Basic Results.
17.3.2 x = vec X.
17.3.3 Function of a Function.
17.4 Vector Differentiation: Vector Function.
17.5 Matrix Differentiation: Scalar Function.
17.5.1 General Results.
17.5.2 f = trace.
17.5.3 f = determinant.
17.5.4 f = y_rs.
17.5.5 f = eigenvalue.
17.6 Transformation Rules.
17.7 Matrix Differentiation: Matrix Function.
17.8 Matrix Differentials.
17.9 Perturbation Using Differentials.
17.10 Matrix Linear Differential Equations.
17.11 Second Order Derivatives.
17.12 Vector Difference Equations.
18. Jacobians.
18.2 Method of Differentials.
18.3 Further Techniques.
18.3.1 Chain Rule.
18.3.2 Exterior (Wedge) Product of Differentials.
18.3.3 Induced Functional Equations.
18.3.4 Jacobians Involving Transposes.
18.3.5 Patterned Matrices and L-Structures.
18.4 Vector Transformations.
18.5 Jacobians for Complex Vectors and Matrices.
18.6 Matrices with Functionally Independent Elements.
18.7 Symmetric and Hermitian Matrices.
18.8 Skew-Symmetric and Skew-Hermitian Matrices.
18.9 Triangular Matrices.
18.9.1 Linear Transformations.
18.9.2 Nonlinear Transformations of X.
18.9.3 Decompositions With One Matrix Skew-Symmetric.
18.9.4 Symmetric Y.
18.9.5 Positive Definite Y.
18.9.6 Hermitian Positive Definite Y.
18.9.7 Skew Symmetric Y.
18.9.8 LU Decomposition.
18.10 Decompositions Involving Diagonal Matrices.
18.10.1 Square Matrices.
18.10.2 One Triangular Matrix.
18.10.3 Symmetric and Skew Symmetric Matrices.
18.11 Positive Definite Matrices.
18.12 Cayley Transformation.
18.13 Diagonalizable Matrices.
18.14 Pairs of Matrices.
19. Matrix Limits, Sequences and Series.
19.3 Asymptotically Equivalent Sequences.
19.5 Matrix Functions.
19.6 Matrix Exponentials.
20. Random Vectors.
20.2 Variances and Covariances.
20.3.1 Population Correlations.
20.3.2 Sample Correlations.
20.5 Multivariate Normal Distribution.
20.5.1 Definition and Properties.
20.5.2 Quadratics in
20.5.3 Quadratics and Chi-squared.
20.6 Complex Random Vectors.
20.7 Regression Models.
20.7.1 V is the Identity Matrix.
20.7.2 V is Positive Definite.
20.7.3 V is Non-negative Definite.
20.8 Other Multivariate Distributions.
20.8.1 Multivariate t-Distribution.
20.8.2 Elliptical and Spherical Distributions.
20.8.3 Dirichlet Distributions.
21. Random Matrices.
21.2 Generalized Quadratic Forms.
21.2.1 General Results.
21.2.2 Wishart Distribution.
21.3 Random Samples.
21.3.1 One Sample.
21.3.2 Two Samples.
21.4 Multivariate Linear Model.
21.4.1 Least Squares Estimation.
21.4.2 Statistical Inference.
21.4.3 Two Extensions.
21.5 Dimension Reduction Techniques.
21.5.1 Principal Component Analysis (PCA).
21.5.2 Discriminant Coordinates.
21.5.3 Canonical Correlations and Variates.
21.5.4 Latent Variable Methods.
21.5.5 Classical (Metric) Scaling.
21.6 Procrustes Analysis (Matching Configurations).
21.7 Some Specific Random Matrices.
21.8 Allocation Problems.
21.9 Matrix Variate Distributions.
21.10 Matrix Ensembles.
22. Inequalities for Probabilities and Random Variables.
22.1 General Probabilities.
22.2 Bonferroni-Type Inequalities.
22.3 Distribution-Free Probability Inequalities.
22.3.1 Chebyshev-Type Inequalities.
22.3.2 Kolmogorov-Type Inequalities.
22.3.3 Quadratics and Inequalities.
22.4 Data Inequalities.
22.5 Inequalities for Expectations.
22.6 Multivariate Inequalities.
22.6.1 Convex Subsets.
22.6.3 Inequalities for Other Distributions.
23. Majorization.
23.1 General Properties.
23.2 Schur Convexity.
23.3 Probabilities and Random Variables.
24. Optimization and Matrix Approximation.
24.1 Stationary Values.
24.2 Using Convex and Concave Functions.
24.3 Two General Methods.
24.3.1 Maximum Likelihood.
24.3.2 Least Squares.
24.4 Optimizing a Function of a Matrix.
24.5 Optimal Designs.
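Many of the identities catalogued in the handbook lend themselves to quick numerical checks. As an illustrative sketch (not taken from the book itself), the vec operator and Kronecker product covered in Chapter 11 satisfy the standard identity vec(AXB) = (Bᵀ ⊗ A) vec(X), which a few lines of NumPy can verify; the `vec` helper below is our own naming:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
X = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))

def vec(M):
    # vec stacks the columns of M into one long vector,
    # i.e. column-major (Fortran) order.
    return M.reshape(-1, order="F")

lhs = vec(A @ X @ B)                 # vec of the product, length 2*5 = 10
rhs = np.kron(B.T, A) @ vec(X)       # (B' kron A) applied to vec(X)
assert np.allclose(lhs, rhs)
```

The `order="F"` reshape matches the column-stacking convention under which the identity holds; with row-major stacking the roles of A and B in the Kronecker product would be swapped.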