This book explains how it is possible for computers to reason and perceive, thus introducing the field called artificial intelligence. From the book, you learn why the field is important, both as a branch of engineering and as a science.
If you are a computer scientist or an engineer, you will enjoy the book because it provides a cornucopia of new ideas for representing knowledge, using knowledge, and building practical systems. If you are a psychologist, biologist, linguist, or philosopher, you will enjoy the book because it provides an exciting computational perspective on the mystery of intelligence.

The Knowledge You Need
This completely rewritten and updated edition of Artificial Intelligence reflects the revolutionary progress made since the previous edition was published.
Part I is about representing knowledge and about reasoning methods that make use of knowledge. The material covered includes the semantic-net family of representations, describe and match, generate and test, means-ends analysis, problem reduction, basic search, optimal search, adversarial search, rule chaining, the rete algorithm, frame inheritance, topological sorting, constraint propagation, logic, truth maintenance, planning, and cognitive modeling.
Part II is about learning, the sine qua non of intelligence. Some methods involve much reasoning; others just extract regularity from data. The material covered includes near-miss analysis, explanation-based learning, knowledge repair, case recording, version-space convergence, identification-tree construction, neural-net training, perceptron convergence, approximation-net construction, and simulated evolution.
Part III is about visual perception and language understanding. You learn not only about perception and language, but also about ideas that have been a major source of inspiration for people working in other subfields of artificial intelligence. The material covered includes object identification, stereo vision, shape from shading, a glimpse of modern linguistic theory, and transition-tree methods for building practical natural-language interfaces.

Special Features of This Edition
- The presentation is based on extensive teaching experience.
- Semiformal representation and procedure specifications bring the ideas to within a step or two of implementation and highlight unifying themes.
- Application examples provide a glimpse of the ideas at work in real-world systems.
- Powerful ideas and principles are identified and emphasized.
Includes bibliographical references (p. 693-724) and index.
I. REPRESENTATIONS AND METHODS.
1. The Intelligent Computer.
The Field and the Book.
This Book Has Three Parts.
What Artificial Intelligence Can Do.
Criteria for Success.
Background.
2. Semantic Nets and Description Matching.
The Describe-and-Match Method.
The Describe-and-Match Method and Analogy Problems.
The Describe-and-Match Method and Recognition of Abstractions.
Problem Solving and Understanding Knowledge.
Background.
3. Generate and Test, Means-Ends Analysis, and Problem Reduction.
The Generate-and-Test Method.
The Means-Ends Analysis Method.
The Problem-Reduction Method.
Background.
4. Nets and Basic Search.
Heuristically Informed Methods.
Background.
5. Nets and Optimal Search.
The Best Path.
Redundant Paths.
Background.
6. Trees and Adversarial Search.
Background.
7. Rules and Rule Chaining.
Rule-Based Deduction Systems.
Rule-Based Reaction Systems.
Procedures for Forward and Backward Chaining.
Background.
8. Rules, Substrates, and Cognitive Modeling.
Rule-Based Systems Viewed as Substrate.
Rule-Based Systems Viewed as Models for Human Problem Solving.
Background.
9. Frames and Inheritance.
Frames, Individuals, and Inheritance.
Demon Procedures.
Frames, Events, and Inheritance.
Background.
10. Frames and Commonsense.
Examples Using "Take" Illustrate How Constraints Interact.
Expansion into Primitive Actions.
Background.
11. Numeric Constraints and Propagation.
Propagation of Numbers Through Numeric Constraint Nets.
Propagation of Probability Bounds Through Opinion Nets.
Propagation of Surface Altitudes Through Arrays.
Background.
12. Symbolic Constraints and Propagation.
Propagation of Line Labels Through Drawing Junctions.
Propagation of Time-Interval Relations.
Five Points of Methodology.
Background.
13. Logic and Resolution Proof.
Rules of Inference.
Background.
14. Backtracking and Truth Maintenance.
Chronological and Dependency-Directed Backtracking.
Proof by Constraint Propagation.
Background.
15. Planning.
Planning Using If-Add-Delete Operators.
Planning Using Situation Variables.
II. LEARNING AND REGULARITY RECOGNITION.
16. Learning by Analyzing Differences.
Background.
17. Learning by Explaining Experience.
Learning about Why People Act the Way They Do.
Learning about Form and Function.
Background.
18. Learning by Correcting Mistakes.
Isolating Suspicious Relations.
Intelligent Knowledge Repair.
Background.
19. Learning by Recording Cases.
Recording and Retrieving Raw Experience.
Finding Nearest Neighbors.
A Fast Serial Procedure Finds the Nearest Neighbor in Logarithmic Time.
Parallel Hardware Finds Nearest Neighbors Even Faster.
Background.
20. Learning by Managing Multiple Models.
The Version-Space Method.
Background.
21. Learning by Building Identification Trees.
From Data to Identification Trees.
From Trees to Rules.
Background.
22. Learning by Training Neural Nets.
Simulated Neural Nets.
Hill Climbing and Back Propagation.
Background.
23. Learning by Training Perceptrons.
Perceptrons and Perceptron Learning.
What Perceptrons Can and Cannot Do.
Background.
24. Learning by Training Approximation Nets.
Interpolation and Approximation Nets.
Background.
25. Learning by Simulating Evolution.
Survival of the Fittest.
Survival of the Most Diverse.
III. VISION AND LANGUAGE.
26. Recognizing Objects.
Linear Image Combinations.
Establishing Point Correspondence.
Background.
27. Describing Images.
Computing Edge Distance.
Computing Surface Direction.
Background.
28. Expressing Language Constraints.
The Search for an Economical Theory.
The Search for a Universal Theory.
Competence versus Performance.
Background.
29. Responding to Questions and Commands.
Syntactic Transition Nets.
Semantic Transition Trees.
Background.
Appendix: Relational Databases.
Relational Databases Consist of Tables Containing Records.
Relations Are Easy to Modify.
Records and Fields Are Easy to Extract.
Relations Are Easy to Combine.
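The appendix topics listed above can be sketched in a few lines: a relation is a table of records, and the basic operations extract records and fields or combine relations on a shared field. The sketch below is illustrative only; the `suppliers` and `parts` tables and the helper names are hypothetical, not taken from the book.

```python
# A toy illustration of the relational-database appendix topics:
# relations as tables of records, with select, project, and join.
# The example data is hypothetical.

def select(relation, predicate):
    """Extract the records that satisfy a predicate."""
    return [record for record in relation if predicate(record)]

def project(relation, fields):
    """Extract only the named fields from each record."""
    return [{f: record[f] for f in fields} for record in relation]

def join(left, right, field):
    """Combine two relations on a shared field."""
    return [{**l, **r} for l in left for r in right if l[field] == r[field]]

suppliers = [
    {"supplier": "Acme", "city": "Boston"},
    {"supplier": "Apex", "city": "Chicago"},
]
parts = [
    {"part": "bolt", "supplier": "Acme"},
    {"part": "nut", "supplier": "Apex"},
]

# Combine the two relations, then extract the parts supplied from Boston.
combined = join(parts, suppliers, "supplier")
boston_parts = project(select(combined, lambda r: r["city"] == "Boston"), ["part"])
```

Because a relation is just a list of records, modifying one is a matter of appending, removing, or rewriting records, which is the sense in which the appendix calls relations easy to modify.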