Synopses & Reviews
"Once in a great while, a landmark computer-science book is published. Computer Architecture: A Quantitative Approach, Second Edition, is such a book. In an era of fluff computer books that are, quite properly, remaindered within weeks of publication, this book will stand the test of time, becoming lovingly dog-eared in the hands of anyone who designs computers or has concerns about the performance of computer programs."
- Robert Bernecky, Dr. Dobb's Journal, April 1998
Computer Architecture: A Quantitative Approach was the first book to focus on computer architecture as a modern science. Its publication in 1990 inspired a new approach to studying and understanding computer design. Now, the second edition explores the next generation of architectures and design techniques with a view to the future.
A basis for modern computer architecture
As the authors explain in their preface to the Second Edition, computer architecture itself has undergone significant change since 1990. Concentrating on currently predominant and emerging commercial systems, Hennessy and Patterson have prepared entirely new chapters covering additional advanced topics:
* Advanced Pipelining: A new chapter emphasizes superscalar and multiple-issue designs.
* Multiprocessors: A new chapter examines in depth the design issues for small and large shared-memory multiprocessors.
* Storage Systems: Expanded presentation includes coverage of I/O performance measures.
* Memory: Expanded coverage of caches and memory-hierarchy design addresses contemporary design issues.
* Examples and Exercises: Completely revised on current architectures such as MIPS R4000, Intel 80x86 and Pentium, PowerPC, and HP PA-RISC.
Distinctive presentation
This book continues the style of the first edition, with revised sections on Fallacies and Pitfalls, Putting It All Together, and Historical Perspective, and contains entirely new sections on Crosscutting Issues. The focus on fundamental techniques for designing real machines and the attention to maximizing cost/performance are crucial to both students and working professionals. Anyone involved in building computers, from palmtops to supercomputers, will profit from the expertise offered by Hennessy and Patterson.
About the Author
John L. Hennessy is the President of Stanford University, where he has been a member of the faculty since 1977 in the Departments of Electrical Engineering and Computer Science. Hennessy is a fellow of the IEEE and ACM, a member of the National Academy of Engineering, and a fellow of the American Academy of Arts and Sciences. He received the 2001 Eckert-Mauchly Award for his contributions to RISC technology, shared the John von Neumann award in 2000 with David Patterson, and received the 2001 Seymour Cray Computer Engineering award.
Hennessy's original research group at Stanford developed several of the techniques now in commercial use for optimizing compilers. In 1981, he started the MIPS project at Stanford with a handful of graduate students. After completing the project in 1984, he took a one-year leave from the university to co-found MIPS Computer Systems, which developed one of the first commercial RISC microprocessors. After being acquired by Silicon Graphics in 1991, MIPS Technologies became an independent company in 1998, focusing on microprocessors for the embedded marketplace. As of 2001, over 200 million MIPS microprocessors have been shipped in devices ranging from video games and palmtop computers to laser printers and network switches.
Hennessy's more recent research at Stanford focuses on the design and exploitation of multiprocessors. He helped lead the design of the DASH multiprocessor architecture, the first distributed shared-memory multiprocessor supporting cache coherence, and the basis for several commercial multiprocessor designs, including the Silicon Graphics Origin multiprocessors.
David A. Patterson has been teaching computer architecture at the University of California, Berkeley, since joining the faculty in 1977, and holds the Pardee Chair of Computer Science. His teaching has been honored by the ACM and the University of California. In 2000 he won the James H. Mulligan, Jr. Education Medal from IEEE "for inspirational teaching through the development of creative curricula and teaching methodology, for important textbooks, and for effective integration of education and research missions." Patterson has also received the 1995 IEEE Technical Achievement Award for contributions to RISC and shared the 1999 IEEE Reynold B. Johnson Information Storage Award for contributions to RAID. In 2000 he shared the IEEE John von Neumann Medal with John Hennessy "for creating a revolution in computer architecture through their exploration, popularization, and commercialization of architectural innovations." Patterson is a member of the National Academy of Engineering and is a fellow of both the ACM and the IEEE. In the past, he has been chair of the CS division in the EECS department at Berkeley, the ACM SIG in computer architecture, and the Computing Research Association.
At Berkeley, Patterson led the design and implementation of RISC I, likely the first VLSI Reduced Instruction Set Computer. This research became the foundation of the SPARC architecture, currently used by Sun Microsystems, Fujitsu, and others. He was a leader of the Redundant Arrays of Inexpensive Disks (RAID) project, which led to high-performance storage systems from many companies. He was also involved in the Network of Workstations (NOW) project, which led to cluster technology used by Internet companies. These projects earned three dissertation awards from the ACM. His current research project is called Recovery Oriented Computing (ROC), which is developing techniques for building dependable, maintainable, and scalable Internet services.
Table of Contents
1 Fundamentals of Computer Design
2 Instruction Set Principles and Examples
3 Pipelining
4 Advanced Pipelining and Instruction-Level Parallelism
5 Memory-Hierarchy Design
6 Storage Systems
7 Interconnection Networks
8 Multiprocessors
Appendix A: Computer Arithmetic, by David Goldberg, Xerox Palo Alto Research Center
Appendix B: Vector Processors
Appendix C: Survey of RISC Architectures
Appendix D: An Alternative to RISC: the Intel 80x86
Appendix E: Implementing Coherence Protocols