Synopses & Reviews
It’s easy to forget that, for most of its existence, the English word ‘computer’ referred not to machines, but to people who performed calculations. First used in the seventeenth century, the term arrived via French from the Latin computare, meaning to count or add up. Computare itself derived from the combination of com, meaning ‘with’, and putare, which originally meant ‘to prune’ in the sense of trimming something down to size, and which came to imply ‘reckoning’ by analogy with mentally pruning something down to a manageable estimate.
Long before eminent Victorians like Charles Babbage had even dreamed of calculating machines, human computing had been vital to such feats as the ancient Egyptians’ understanding of the motions of the stars and planets, with mathematicians like Ptolemy laboriously determining their paths (he also managed to calculate pi accurately to the equivalent of three decimal places: no mean feat for the second century AD).
As mathematics developed, the opportunities for elaborate and useful calculations increased – not least through the development of tables of logarithms, the first base-ten tables of which were compiled by English mathematician Henry Briggs in 1617. Such tables immensely simplified the complex calculations vital to tasks like navigation and astronomy by providing pre-calculated values that turned laborious multiplication and division into simple addition and subtraction – but their construction required immense feats of human calculation, both by mathematicians and by the increasingly necessary groups of trained assistants they employed.
Even as recently as the Second World War, when Alan Turing and his fellows were establishing the revolutionary foundations of modern computing, the word ‘computers’ still referred to dedicated teams of human experts – like those working around Turing at Bletchley Park in England.
According to the Oxford English Dictionary, it wasn’t until 1946 that the word ‘computer’ itself was used to refer to an ‘automatic electronic device’. This was, of course, only the beginning; and since then both the senses and the compound forms of the word have multiplied vastly. From ‘microcomputers’ to ‘personal computers’ and, more recently, ‘tablet computers’, we live in an age defined by Turing’s digital children.
It’s important to remember, though, just how recently machines surpassed men and women in the computation stakes. As late as the 1960s, teams of hundreds of trained human computers, housed in dedicated offices, were still being used to produce tables of numbers: a procedure honed to a fine art during the first half of the twentieth century, with leading mathematicians specializing in breaking down complex problems into easily repeatable steps.
It’s a sign of how fast and how completely times have changed since then that human computation is almost forgotten. And yet, in different forms, its principles remain alive in the twenty-first century – not least under the young banner of ‘crowdsourcing’, a word coined in 2006 in an article for Wired magazine by writer Jeff Howe to describe the outsourcing of a task to a large, scattered group of people.
From identifying the contents of complex photographs to answering fuzzy questions or deciphering poorly printed words, there are still plenty of tasks in the digital age at which people outperform electronic computers. We may not call it ‘human computation’ any more, but the tactical deployment of massed brainpower to solve certain problems remains more potent than ever.
Review
Slimmer and far more engaging than the average dictionary, this volume contains 100 short essays on the lingo, abbreviations and turns of phrase that have proliferated in the online world. British technology writer Chatfield catalogs the evolving English language with entries on words such as LOLZ, pwn, twitterverse, netiquette and even the venerable byte. Chatfield's expansive knowledge of pop culture and technology lends intrigue to even an utterance of indifference, meh, popularized perhaps by its use on the television show The Simpsons in the 1990s before it became a staple of online chat conversations. Moreover, he incorporates the linguistic roots of many terms: discussing the phenomenon of a digital artifact going viral, he begins with the 1890s classification of the biological analogue. Obscure anecdotes and a good sense of humor ensure this volume will charm and inform both tech buffs and those who navigate the digital world with trepidation. The essays are only a few pages long, so reading the book straight through is a staccato experience, but still a pleasurable one. (Aug.) Publishers Weekly, Copyright PWxyz, LLC. All rights reserved.