Everything and More: A Compact History of Infinity
by David Foster Wallace
A review by Polly Shulman
The greatest thrill I remember from my girlhood — better than my first kiss,
first airplane flight, first taste of mango, first circuit around the ice rink
without clinging to a grownup's sleeve — was the heart-lifting moment when I
first understood Georg Cantor's Diagonal Proof of the nondenumerability of the
real numbers. This proof, the Mona Lisa of set theory (to my mind, the most satisfying
branch of mathematics), changed the way mathematicians thought about infinity.
If you've ever thought much about numbers or talked with a preschooler learning
to count, you've probably encountered some of the questions that led to Cantor's
discovery a century ago. How many natural numbers are there? (Naturals are just
the numbers we count with: 1, 2, 3, 4 and so on up forever.) And what about
the even naturals: 2, 4, 6, 8 and so on? Infinitely many in both cases, right?
OK, but are there more naturals than evens? Clearly every even natural number
is a natural number, but there are plenty of naturals that aren't even — namely
the odds: 3, 5, 7, 9 and so on. Does that mean that the set of naturals is bigger
than the set of even naturals?
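One way to probe that question is to try to pair the two sets off, element by element. Here is a minimal Python sketch of the matching (my illustration, not the review's): every natural n is matched with the even natural 2n.

```python
# Match each natural number n with the even natural 2n.
# Every natural gets exactly one even partner, and every even
# natural 2n is claimed by exactly one natural n -- a perfect pairing.
def partner(n):
    return 2 * n

pairs = [(n, partner(n)) for n in range(1, 6)]
print(pairs)  # [(1, 2), (2, 4), (3, 6), (4, 8), (5, 10)]
```

If such a pairing misses nothing on either side, the two sets count, in Cantor's sense, as the same size.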
What about the positive rational numbers — fractions like 1/2, 17/23, 15/3?
Are there more of those than the naturals? After all, the rationals include
the naturals, since any natural number can be written as a fraction in lots
of different ways (for example, 2 is 2/1, 4/2, 6/3 and so on). And what about
the real numbers, which include the integers and the rationals, but also weird
numbers like pi and the square root of 2, which can't be written as fractions
— are there more of those? Are there more points on a circle than there are
seconds in eternity? How would you even begin to think about answering such
questions?
Using reasoning that I've always thought of as his Squiggly Argument, Cantor
proved that despite all the apparently extra fractions, there are just as many
natural numbers as there are rational numbers. But don't jump to the conclusion
that all infinities are the same size — the gorgeous Diagonal Proof shows that
there are more real numbers than naturals. A lot more — infinitely many more,
mind-bogglingly many more. And Cantor used the same method of argument to prove
that there are not just two sizes of infinity, but mind-bogglingly infinitely
many. His work on these infinitely large numbers, called cardinals and ordinals,
raised questions that ultimately shook math to its foundations.
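The diagonal construction itself is short enough to mimic in code. Given any list of decimal-digit expansions (a finite stand-in for Cantor's infinite list), build a new string whose nth digit differs from the nth digit of the nth entry, so the result cannot equal anything on the list. A Python sketch of the idea (my illustration, not the book's):

```python
def diagonal_escape(listed):
    """Given decimal-digit strings, return a digit string that differs
    from the n-th string at its n-th position, so it cannot equal any
    entry in the list."""
    new_digits = []
    for n, digits in enumerate(listed):
        d = int(digits[n])
        # Swap in a digit different from the one on the diagonal
        # (avoiding 0s and 9s sidesteps the 0.1999... = 0.2000... wrinkle).
        new_digits.append('5' if d != 5 else '6')
    return ''.join(new_digits)

listing = ['141592', '718281', '414213', '302585', '577215', '693147']
escaped = diagonal_escape(listing)
# escaped differs from listing[n] at position n, for every n
```

However the original list was produced, the escaped number was not on it, which is the heart of why the reals cannot be enumerated.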
Yet Cantor's diagonal argument, in its essence, is so beautifully simple that
even someone who hasn't yet entirely mastered trigonometry can understand it.
I know, because I did, and so can you, whoever you are. I've often wanted to
share the thrill with my intelligent but mathematically innocent friends and
family — English teachers, textile designers, photo editors, Internet journalists,
soccer moms, wedding guests — and I've succeeded, too, whenever I can get them
to stay put. Being stuck in an elevator together helps. But elevators don't
stall that often, so I was delighted to learn that the gloriously articulate
novelist and essayist David Foster Wallace had written a history of infinity,
with Cantor's Diagonal Proof as its climax. Now at last, I thought, I'll be
able to spread the bliss without being treated like the Ancient Mariner.
Well, not really. Everything and More: A Compact History of Infinity,
though bristling with felicities, isn't for the mathematically timid. (Wallace
doesn't use the word "infinity," but rather the sideways-8 symbol
for it, which cannot be replicated in ASCII code.) In fact, it's hard to figure
out just who the book is for. I relished Wallace's passionately erudite tone
and the many exciting mathematical moments he helped me revisit, but there was
nothing in Everything that I hadn't learned as an undergraduate math
major. On the other hand, readers who haven't taken at least two semesters of
calculus will probably have a hard time keeping up, despite Wallace's many protestations
to the contrary. If you enjoyed math but quit after calculus because you didn't
have room in your schedule; if you've forgotten quite a bit over the past few
decades and want a stylish reminder; if you regret having focused on the discipline's
real-world applications and wish you'd paid more attention to its philosophical
issues; or if you're the captain of your middle school math team and have begun
working through your big sister's calculus book because you're bored, then you'll
love this book. Everyone else, try it anyway and prove me wrong. (Go ahead —
this elevator isn't going anywhere.)
Appropriately enough for a book based on paradoxes, Everything's chief
virtues double as drawbacks. Wallace doesn't try to hide the difficulty of his
subject behind real-world examples, as many math writers do — instead, he tackles
it straight on. Unlike science, math isn't really about the real world: It's
about itself, its own assumptions and the conclusions they lead to logically.
The sections of Everything in which Wallace discusses the philosophical
urges and ways of thinking that underlie math are as charming as they are insightful.
Like a true mathematician, he revels in abstraction, including proofs and mathematical
notation where another writer would cop out and use metaphors. Only very occasionally
does he make actual errors or simplify so much that the math looks wrong.
But readers familiar with Wallace may not be surprised to learn that he fetishizes
technical terms to the point of becoming irritating and inconsiderate. (Pretentious?
Lui?) For example, do you know what w/r/t stands for? Well, I'll tell you, since
Wallace doesn't: "with respect to," a phrase that comes up quite a
bit in math classes, and that Wallace sticks into ordinary sentences from time
to time in its abbreviated form. You might be able to guess its meaning from
the context if you didn't already know; but then again you might not, and Wallace
has already tired your brain with so many necessary abbreviations that adding
gratuitous extras seems rude.
Then there's his choice to structure the book as a history of infinity, rather
than a less chronological treatment. Wallace begins with paradoxes that troubled
the ancient Greeks, such as Zeno's brain twisters, including the one about how
you can't cross the street because you first have to go halfway across, then
halfway across the remaining distance, then halfway across the remainder of
that, and so on forever. (You may have encountered the version in which a hare
loses a race to a slower tortoise who had a head start.) Centuries of mathematicians
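Hidden inside the halving paradox is a convergent geometric series: the distances 1/2 + 1/4 + 1/8 + ... sum to exactly 1, so the infinitely many steps cover a finite street. A quick numerical check (my sketch, not Wallace's) shows the partial sums closing in on 1:

```python
# Zeno's street crossing: half the distance, then half the rest, and so on.
# The partial sums of 1/2 + 1/4 + 1/8 + ... approach 1 but never exceed it.
total = 0.0
for step in range(1, 31):
    total += 1.0 / 2**step
print(total)  # within a billionth of 1 after 30 steps
```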
found infinitely large and infinitely small quantities tempting but problematical
because of these and similar problems. If you know the math but not the history,
it's fascinating to watch whole branches of math (analysis, set theory) grow
out of attempts to avoid or justify using these dubious infinities and infinitesimals
in calculations. But the chronological structure seems to make it hard for Wallace
to leave out any of the background. He's like a hungry man in a grocery store,
piling more and more and more into his 319-page book (or "booklet,"
as he wishfully calls it). His appetite is inspiring, but I could have done
with a lot less of the partial differential equations — sticky formulas from
second-year calculus, sort of like algebra equations that use further, harder
equations instead of the familiar x's and y's — and a lot more of the actual
theory of infinite numbers that begins with Cantor's cardinals and ordinals.
Oddly, Wallace stops with Cantor in the early 20th century. The next exciting
advances in set theory come a few decades later with Gödel's incompleteness
theorems. They don't have much to do with infinity per se, and the novelist Rebecca
Goldstein will cover them in another book in the same series anyway (I can't
wait — she's terrific), so I can see why Wallace skimmed lightly over them.
But infinity didn't stop with Cantor. In the 1960s, John Horton Conway, a mathematician
at Princeton, invented (or as he would say, discovered) a new number system,
the "surreal numbers," which takes the study of the infinite far beyond
Cantor's cardinals and ordinals. You can use Cantor's numbers to measure things
and, more or less, to count — but that's about it. But with Conway's surreals,
calculations involving infinitely large and small quantities make precise sense
for the first time. You can add, subtract, multiply, divide and even differentiate
(an operation of calculus) with the surreals — you can use them in algebra
and come up with meaningful answers. And the beauty of it is, they're relatively
easy to understand and explain. Wallace does mention a precursor of the surreals,
the hyperreals, but I was disappointed not to find the surreals themselves in
Wallace's book.
On the whole, though, Wallace does an admirable job unwinding what he calls
"the Story of Infinity's overall dynamic, whereby certain paradoxes give
rise to conceptual advances that can handle those original paradoxes but in
turn give rise to new paradoxes, which then generate further conceptual advances,
and so on." Who needs boy-meets-girl when you can have mind-meets-mind-meets-truth?
Polly Shulman edits news articles for the journal Science.

