The story of computing's development is as fascinating as anything in history. In little more than 50 years, we have gone from sketchy ideas and concepts to a world in which the number of computing devices is reckoned in the hundreds of millions and growing fast.
* Early computing devices
The concept of a mechanical calculator dates to the sixteenth century, and it was realized in fits and starts in various ways over the succeeding centuries. By the end of the 1800s, companies were producing devices that were sophisticated and reliable. But, even as businesses and scientists came to rely on these machines, it was obvious that the use of gears and levers would always limit their functionality. Most important, it was not practical to build mechanical devices that could be programmed. You could add a long list of numbers to get a result, then divide that result, and so on. But you couldn't tell a mechanical device to add some numbers, compare the result to some other number, and then either divide or multiply depending on the outcome of the comparison. You had to have people make the intermediate decisions, which meant that the operations were invariably slow and error-prone.
The idea of an electronic computer surfaced not long after the appearance of electronics. It seemed clear to creative people (this is in retrospect, of course) that the vacuum tube, or more accurately collections of vacuum tubes, could do what mechanical devices couldn't—temporarily store the results of calculations and instructions about new ones. The actual realization of an electronic computer occurred in an American university, though which one is the subject of intense debate.
* The University of Pennsylvania vs. Iowa State
The University of Pennsylvania vs. Iowa State—not a football game, but a controversy over where the first electronic computer was developed. While most sources credit the first electronic computer, ENIAC (1946), to two University of Pennsylvania researchers, John Mauchly and J. Presper Eckert, there is strong evidence that the first machine was actually built in 1942 by a professor at Iowa State, John V. Atanasoff.
Whatever the verdict, we know that the American university has been a critical part of the initial and continuing development of computing. The first computer was built in an American university, and the first computer company was a direct university spinoff. Universities soon created an entirely new discipline to support the fast growing industry, and their classrooms and labs supplied the educated people as well as much of the actual knowledge that has driven an extraordinary pace of change. The university role, however, has always been in partnership with business and government.
* The partnership of IBM and the Department of Defense
IBM was already a mature and well-established company when the first electronic computer was created. Under the leadership of the indomitable Thomas J. Watson, Sr., IBM had taken a position of leadership in office machinery, including devices that performed calculations. The company was caught flat-footed by the advent of the electronic computer, but Watson quickly divined its importance and made an all-out effort to secure leadership.
There were several ingredients in IBM's rapid success, but none was more important than its understanding of the kind of businesses that would use computers. Even though the technology was new and fragile, IBM appreciated that potential customers, the kind that could afford computers, were not interested in novelty or experimentation. In fact, they were generally risk-averse. A typical IBM customer was a utility company that needed some way to deal with the huge task of calculating, printing, and reconciling its customers' bills. To companies like these, the computer presented an enormous opportunity. As the nation's population grew, and as society experienced a new level of prosperity, the challenge of hiring and housing a vast army of clerks was increasingly difficult—perhaps at some point impossible. On the other hand, the computer was also a danger. If just one billing cycle was screwed up, the company would face a disaster of enormous proportions. The company would survive—people would still need electricity—but the executives surely wouldn't. IBM understood this environment perfectly, and it provided systems that were amazingly secure and stable given the precariousness of the technology. This ability allowed IBM to dominate its many competitors.
Another dimension of IBM's relationship with its large and conservative customers was that there was no need for dramatic improvements in the technology. If IBM could regularly provide more for less, which was easy to do with a technology in its earliest stages, its customers were satisfied. Competitors could and did offer more, but their chronic inability to match IBM's rock-solid reliability and service kept them at the margins. Progress in computing might have continued at a glacial pace were it not for the Department of Defense, which was far more interested in seeing rapid advances incorporated into weaponry and related systems. The Pentagon liked IBM as a partner for the same reasons the large corporations did, and IBM obliged in advancing technology by building, in partnership with US universities, a very strong research capability. As a result, whenever competition forced IBM to pick up the pace on the commercial side, it was ready with something from the lab. This cozy relationship continued until it was broken by the accelerating pace of technological development. To understand why this happened, we need to review the development of computer "generations."
Five generations of computing
Counting computer generations is necessarily controversial—machines don't have the same pedigrees as people. But the five generations described here represent something close to a consensus. We'll characterize them briefly, then discuss the fundamental changes in economics that have resulted.
* The mainframe
The structure of the mainframe hasn't changed much since the earliest computers. Its primary characteristic is that all intelligence (computing power), as well as all storage of data and programs, is at the center, kept in the cabinet or cabinets that form the main frame(s). Users get access to the mainframe's intelligence and resources through terminals—dumb devices that are little more than a keyboard and a display.
* The minicomputer
This is probably the most controversial of the generations, since it is just a variation on the mainframe—the same basic centralized organization, just on a smaller scale and with lower production costs. The reason the mini is described as a generation is that its lower prices sharply increased access to computing beyond the large corporations that could afford mainframes.
* The microcomputer
The microcomputer, generally synonymous with the personal computer, really is a generation since it offers a dramatic contrast to its predecessors. Where in the past individual users all shared the resources of a single machine, now a single user had direct and personal access to significant computing power as well as to stored programs and data. Eventually, this contrast was blurred as single-user machines were connected with each other over local area networks (LANs), which were then often connected back to mainframes and minis. But, even when all computer users were linked to the big systems, the relationship was fundamentally different. Now, the user had a great deal of independence, and could share with others only as desired, not as required.
* The Internet and the Web
The advent of the Internet provided another dramatic shift. Suddenly, there was an all-enveloping network that meant that all users could connect to all computers. Where the previous generation had moved toward aggregation of resources only within a defined group—the corporation, the organization, or a service such as CompuServe—the Internet, and its graphical offspring the World Wide Web, provided a universal link. With relatively trivial effort, everyone everywhere could now exchange information.
* Pervasive computing
The fundamental shift in pervasive computing is away from the desktop. As advances in technology make possible devices that are both smaller and smarter, we no longer have to sit in a chair and look at a monitor to use a computer. The components of pervasive computing are a very diverse and rapidly evolving group—cell phones, personal digital assistants (like the PalmPilot), television set-top boxes, the control systems of automobiles, and more. Like the microcomputer, these devices were originally independent, but there is tremendous momentum behind the effort to get them to talk to each other, and to the world of machines on the Web, seamlessly and effortlessly.
* The changing economics of computing
The economics of computing have changed with the technology. In the mainframe and minicomputer generations, because hardware was extremely expensive, programmers had to focus on the most efficient use of system resources. This meant that software development was quite conservative. Certainly, there was innovation, but it appeared in a steady, almost predictable stream.
The appearance of the microcomputer changed the dynamics in a fundamental way. The availability of cheap hardware meant that the number of computers expanded from the hundreds of thousands to the hundreds of millions, which inevitably produced an explosion in software. The first wave offered an enormous variety of choices in the basics—the operating systems that manage the computer's work and the core "productivity applications" such as word processing, databases, and spreadsheets. After a time, this part of the software market stabilized, largely under the hegemony of Microsoft, but the shift to the Internet and the Web has produced a second wave of software that looks beyond the single desktop to all of the computers in the world. Fueled by hardware that just gets cheaper and cheaper, allowing the connection of more computers holding more information, this development is changing not just the economics of computing but the economics of society.
Objectives of the Essential Guide to Computing (EGC)
As I described this book project to friends and colleagues, a typical reaction was something like, "It can't be done. Things change too fast to be captured in a book. It will be out of date before it's printed." The part about things changing fast is certainly true. In the three to four months between the final edits on the manuscript and the appearance of the first printed copies, some of the technologies mentioned will be on their way to obsolescence, and new and exciting ones will appear. But if the world of computing were really changing too fast to understand, the knowledge base of the people who provide the engines of innovation would be too small to sustain the rate of change. In fact, the number of people who really understand the full sweep of issues in computing and telecommunications is very small. If you read this book, you will have a breadth of knowledge that is very rare.
And, whatever the critics say, it is possible to catch the train of technology and climb aboard. If you view this book as a reference to all that is new and current, you will be disappointed; for that, you need the Web, newspapers, and magazines. The real question is how you get the foundation of knowledge that allows you to understand what the media are saying about technology—not just comprehend it, but put it in the perspective needed for employment, education, or investing. The average person's situation with technology today is analogous to that of an untrained person suddenly placed in a football game as a coach (the first and last sports analogy in the book, I promise). This coach doesn't know the rules, much less have any sense of how to develop a strategy, and things are changing so fast that he can't infer them from watching the game. To maintain the analogy, then, the purpose of this book is to put you in the stands and give you a rule book and a TV for instant replay (if I could choose, I would like the comparison to be with John Madden's analysis). This experience should give you the knowledge and perspective you need to be a coach, a referee, or even a player.
Companion Web Site
There are a variety of paths you can pursue when you complete the EGC. If you simply want to remain far better informed than all but a handful of people, you need only use the Web, newspapers, and magazines to keep current. To make this easier, we've provided a Web site, www.prenhall.com/walters, that both provides direct information and offers links to some of the best sources for breaking news (the Web site also includes answers to the questions provided at the end of each chapter). Alternatively, if you want to go deeper, this book is a foundation for more focused study. To learn in depth, you really (still) need books, and the Suggestions for Further Reading includes an array of choices. The EGC Web site updates these on a regular basis. Needless to say, there are a variety of other directions you could choose. Whatever your decision, I hope this volume launches you in a productive and pleasant direction.
The Essential Guide to Computing: The Story of Information Technology (Essential Guide Series)
E. Garrison Walters
Prentice Hall PTR
Perfect for anyone who needs a basic understanding of how computers work, this introductory guide gives friendly, accessible, up-to-date explanations of computer hardware, software, networks, and the Internet. Coverage also includes microprocessors, operating systems, programming languages, applications, and e-commerce.
The complete, easy-to-understand guide to IT—now and in the future!
Computers, networks, and pervasive computing
Hardware, operating systems, and software
How networks work: LANs, WANs, and the Internet
E-business, the Web, and security
The guide for ANYONE who needs to understand the key technologies driving today's economy and high tech industries!
You can't afford not to understand the information revolution that's sweeping the world, but who's got time for all the acronyms and hype most technology books give you? The Essential Guide to Computing demystifies the digital society we live in with an intelligent, thorough, and up-to-date explanation of computer, networking, and Internet technologies. It's perfect for smart professionals who want to get up to speed but don't have computer science or engineering degrees! You'll find up-to-the-minute coverage of all of today's hottest technologies, including:
The evolution of computing: from the room-sized "monoliths" of the 1950s to today's global Internet
Preview of the next revolution: "pervasive computing"
Computer hardware: microprocessors, memory, storage, I/O, displays, and architecture
Windows, Macintosh, UNIX/Linux, DOS, NetWare, Palm: what operating systems do, and how they compare
Programming languages: from machine language to advanced object-oriented technologies
Key software applications: databases, spreadsheets, word processing, voice recognition, and beyond
Microsoft and the software industry: where they stand, where they're headed
How networks work: LANs, WANs, packet switching, hardware, media, and more
The Internet, e-commerce, and security
Enterprise applications: data warehousing, Web-centered development, and groupware
Whether you're a consumer, investor, marketer, or executive, this is your start-to-finish briefing on the information technologies that have changed the world, and the coming technologies that will transform it yet again!