Computing: A Concise History
by Paul E. Ceruzzi
The MIT Press, Cambridge, MA, 2012
175 pp., illus. 14 b/w. e-Book, $11.95, £9.95
ISBN-10: 0-262-51767-1; ISBN-13: 978-0-262-51767-6.
Reviewed by Brian Reffin Smith
Collège de ’Pataphysique
There are two descriptions of this book, one on the cover—the title—and one in the introduction. That there is a difference, almost a conflict, between them indicates that there is more than one way of analysing what has been going on, how, and why, in roughly the last 70 years (a lifetime!) of computation. The title is 'Computing: A Concise History'. But on page xvi of the introduction the book is defined as 'a summary of the development of the digital information age'. Is the history of computing the development of the digital information age, or vice versa? It is not at all certain. One is content, the other context. Technological determinism haunts such questions.
The words we use in a history of computing are a minefield of uncertainty. The author asserts that the terms 'analog' (US English) and 'digital' were unknown before the late 1930s. Yet the former word, even in its American spelling, appears more frequently in the English corpus from 1900 to 1910 than at any point until nearly 1945; the latter was in use to indicate a number under 10 by about 1450 and, interestingly, as a noun referring to discrete keys on a piano by 1878. Of course the author means their use in the context of his subject, but a history, even such a concise one, perhaps needs to be a little more open to what might be important semantic underpinnings.
Why would anyone want a history of computing? To what problem or question is this book, largely written in lay-persons' terms, a solution? Well, I doubt many people directly involved in computing will read it, apart perhaps from a few students, but for many of us the book will provide an interesting and timely overview of the historical context in which changes, and particularly today's changes, have occurred. Interesting, because of the coverage of the uses to which computers have been put across the ages: today they are seen as data storage and routing machines, but in the early 1980s they were creative tools, whether for business, education, or the arts. Timely, because there will be very few more histories of computing: almost everyone now thinks that computing means the social uses of computation and would find the very word 'computation' bizarre in the context of Facebook.
There are occasional errors and typos, a particularly comical one on page 4 where a 'not' should surely be a 'now', but on the whole this is a useful little book, let down by a suicidally dour design and an absence of that sine qua non of computer texts, jokes. There is little, too, about MIT: I remember Nicholas Negroponte coming to the Royal College of Art in 1970s London with a huge Laserdisc under his arm, showing interactive bicycle mending (of course it was really militarily funded: missiles, not bicycles). There might have been room for the work of his Architecture Machine Group and, later, the Media Lab at MIT. Joseph Weizenbaum, too, is absent. Still, at least Ted Nelson gets a line or two.
Finally, it is rather ironic that MIT, whose Press is the publisher of this book, is currently at the centre of a row about its possible role in the suicide of Aaron Swartz, who was investigated for allegedly trying to access academic papers. The internet, in its initial form as the Arpanet, as this book shows, was never remotely intended to have anything to do with freedom of information. A history of computing is a history of the embodiments of our dreams, and our limitations.