When I was in graduate school at Berkeley I was offered a prestigious fellowship to study for a year in Germany, but I decided it would be a disruption, so I wrote a short note declining the offer. As, letter in hand, I stepped to the mailbox, I bumped into a woman from the scholarship office. "Don't mail that yet," she told me. "At least talk to your advisor first."
I did talk to my advisor, and he convinced me to accept. That year abroad led to a faculty position at Caltech, and had an enormous effect on my attitudes, and my personal life.
My walk to campus that morning had taken 20 minutes. Had I encountered one more stoplight or one fewer, had I taken a few more sips of coffee, or a few fewer, my path would not have crossed that woman's, and I would be a different person than I am today.
One of the themes of my book The Drunkard's Walk was that we can all find seemingly minor events in our past that greatly altered the trajectory of our lives. In my new book, The Upright Thinkers, I talk about the relentless march of science and how, over time, our insatiable curiosity as a species made progress inevitable. But as I did research for the book, I was also struck by the role that randomness played in the history of science. I realized, in fact, that the path to the greatest scientific breakthroughs, and hence to the technological and social changes they wrought, was as sensitive to the chance constellation of events as was my encounter at the mailbox that day.
Take the science of Isaac Newton. His book Principia remains the most influential work in the history of science, a book that reshaped human thought, and was the culmination of work he had done over many decades. But in 1684, at age 41, Newton was working principally on alchemy and religion, and all he had to show for his life's work was a pile of disorganized notes and essays on those subjects, an office littered with unfinished mathematical treatises, and a theory of motion that was still confused and incomplete. What altered that?
Astronomer Edmond Halley — of comet fame — found himself in Cambridge, and decided to look up the solitary Professor Newton to ask him a technical mathematical question about the motion of the planets: What kind of force would cause their orbits to be ellipses?
Newton was not one to relish visitors, but he met with Halley, and soon afterward prepared a nine-page paper answering the question. In the process he became obsessed with the laws of motion, and for the next 18 months he hardly slept or ate while he worked feverishly on the issue.
That work is what culminated in the Principia. One cannot predict what course science and technology would have taken had Newton's theories remained unfinished and unpublished, but it is safe to say that had Halley and Newton not met on that summer day in Cambridge, the world today would look far different.
The development of quantum theory, too, depended on a series of chance events. Quantum theory has led to the enormous and still accelerating changes that we are experiencing in today's society — computers, cell phones, televisions, lasers, the Internet, medical imaging, genetic mapping, and most of the new technologies that have revolutionized modern life. It is a theory that was developed by many scientists over several decades, but the key breakthrough was due to Werner Heisenberg (Austrian Erwin Schrödinger also formulated a version of quantum theory, but his work had been inspired by Heisenberg's).
As a young student, Heisenberg had decided to pursue a doctorate in mathematics. To be accepted into a program required convincing a faculty member to sponsor you, and the only interview Heisenberg managed to obtain was with a well-known mathematician named Ferdinand von Lindemann, at the University of Munich. It was only after Lindemann turned him down that Heisenberg decided to go into physics.
Before Heisenberg, quantum physics was in a confused state, and its practitioners were so frustrated that at least one of them publicly stated regret about having become a physicist. Without Heisenberg's insights, research in the area would have continued full force, but not for long: a few years later Hitler came to power and issued decrees that threw German science, the heart of research into quantum theory, into disarray. And so, had Lindemann been more impressed with Heisenberg, the development of quantum theory might have been much delayed, and our world today might not have progressed much beyond what you see in an episode of Mad Men.
I've mentioned two examples from physics, but the fragility of grand discoveries can be seen everywhere. Charles Darwin, for example, was not Captain Robert FitzRoy's first choice for the position of gentleman-naturalist aboard the Beagle, but his first choice declined. Even then, FitzRoy had grave doubts about accepting Darwin, because he believed that Darwin's nose indicated bad character. But it was not a paying position, so applicants were difficult to find, and Darwin got the job. Had he not, Darwin would not have created his theory of evolution. And though Alfred Russel Wallace developed a similar theory, he collected relatively little evidence to support it, and would have had far less impact on science and society than Darwin did.
Chemistry's greatest advance, the periodic table, also came when it did because of a series of fortunate events. The key one was that Dmitri Mendeleev, assigned to teach a course on the elements, could not find a textbook he deemed satisfactory. And so he did what professors everywhere still do today. He decided to write his own. That raised in his mind an interesting question. How do you organize a text on the elements? That depends on how the elements are related, and the rest (after a lot of hard work) is history.
I don't mean to say that, had these chance events not occurred, the periodic table — or the other advances I've mentioned — would not have eventually been made by others. They probably would have been — though probably in some different form. My point is that, had that happened, science, and hence civilization, would have proceeded along a quite different course and timetable.
We think of the process of practicing science as the ultimate systematization of human thought and reason. But the image of a crowd of individuals groping around in the dark, pushed by random collision into wholly new directions, is not a bad metaphor, either. It's something useful to keep in mind as we try to plan and control our personal lives, to beat the stock market, or, say, to channel product innovation and build business plans. Science teaches us a lot about the order of the universe, but it also teaches us that, despite our best efforts, it is the small chance occurrences that often count the most.