Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence

Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence
by Kate Crawford

Yale University Press, New Haven, CT, 2021
336 pp., illus. 31 b/w. Trade, $28.00
ISBN: 0300209576.

Reviewed by: 
Molly Hankwitz
December 2021

“When…interconnected movements for justice inform how we understand artificial intelligence, different conceptions of planetary politics become possible.” — Kate Crawford.

Atlas of AI is a book that strikes profound chords in the reader. It is both a well-researched example of scholarship from the artificial intelligence field and an instructional guide on how to think “smartly” about the subject. Kate Crawford’s adroit use of a geographical, planetary map of techno-industrial wastelands situates and problematizes the Big Tech boom. She starts with San Francisco, where gold mining in the 19th century created a rush of robber barons and a boom era of “extraction” that coincides with the excessive milking of everyone’s data through our current interface, the technological “device.” Throughout Crawford’s text, techno-mechanisms that claim a heroic stature are viewed as near exploitation of an underclass of “users,” and AI’s seamless efficiency is seen as a heavily privatized, if not secreted, backdrop of unseen power: first, in her brief visit to a once-lush, living lake sucked dry by the lust for lithium, a metal required by Tesla to make car batteries, and then in the use of third-world countries as sources of rare earth materials increasingly in demand by tech companies (16). She discusses not only lithium but also coal and oil, with the true costs of extraction taking their toll on Malaysian rubber trees and a “giant artificial lake of toxic residues in Inner Mongolia.”

These histories of destruction that energy and trade corporations have wreaked upon many environments are framed within a context of direct blame upon the fairy tale of endless “economic growth,” arguably a position on environmentalism, industry, and society increasingly at odds with populations who are not benefiting and may certainly be harmed by climate crisis and poverty.

The book is laid out as a research journey beginning in the heart of the western US desert and extending to industrial history in Britain and the ideas of Charles Babbage; to the scientific management practices of Frederick W. Taylor and Henry Ford; to the archives of the National Institute of Standards and Technology (NIST); and to Google’s TrueTime project, an effort to control all time by virtue of individual quantification. Each chapter is a powerful component in an integrated and informative argument about the unseen effects of AI upon people and the shared planet. Crawford indicts the “fourth industrial revolution” from the inside out, undermining its re-virtualization via commercial spin around the technology’s “value and benefits,” a first-world problem. Instead, her concerns speak for the invisibilized spaces, peoples, and cultures that are inadvertently destroyed in the path of first-world industrial and consumer society.

The chapters are in-depth: Earth, Labor, Data, Classification, Affect, State, Power, and Space. Going all the way back to Charles Babbage, Crawford likens earlier moments in technological “development” to the current interest in AI for their similar appeals toward automation, the streamlining of workplaces, and the building of efficiency systems within a regime of social control over human labor. From this perspective, she points to one of the most significant features of AI: “the labor content of the finished product” is rendered “largely invisible” by the process as a whole.

From this critique of the human-versus-machine inter-communication upon which AI production rests, Crawford lays out an incisive groundwork for problematizing the role of the technology and its presentation to culture. Citing Astra Taylor’s commentary that “the kind of efficiency to which techno-evangelists aspire emphasizes standardization, simplification, and speed, not diversity, complexity, and interdependence” (Taylor in Crawford, 71), Crawford writes,

“[W]e are living the result of a system [capitalism] in which companies must extract as much value as possible.” These views take us to a detailed section on Google’s TrueTime, the “process of time coordination at the heart of workplace management,” which involves a controlling of bodies. This critique of workplace management is deftly combined with the critique of “extraction” through the lens of data harvesting. The enormous datasets upon which AI has until recently rested, and which were developed by computer scientists in research capacities, are full of people’s selfies, of hand gestures, of people driving cars, of babies crying, of newsgroup conversations from the 1990s. All of these images and bits of data (where anything can be a piece of data) amount to free material ripped off from “users” to improve the performance of algorithms and their functions in “facial recognition, language prediction, and object detection.” She notes that once these datasets become baked into AI databases, the image itself is rendered meaningless. Gigantic collections of criminal faces, without name or identity, are being used in this way for facial recognition of “active” criminals.

No backdoor analysis of AI systems would be complete without a discussion of AI’s military past and present, which have shaped practices of surveillance.

Deep connections between the tech sector and military cybersecurity abound, but not necessarily in terms of entire computational and analytical systems. Satellites, once a novelty of widespread telecommunications broadcast power, are now nodes in developed webs coating and controlling the globe, whether for “mapping,” spying, or transmission. Within the satellite world of GPS, coupled with huge requirements for “security,” prediction software is being widely developed to comprehend the possibility in every act, from missile launches to local crime to shopping desires. “Contemporary systems use labels to predict human identity, commonly using binary gender, essentialized racial categories, and problematic assessments of character and credit worthiness.” Facial expressions, read by AI, can reveal a person’s inner emotional state according to psychologist Paul Ekman, whose model of universal emotional states “read directly from the face” is being utilized by tech companies in “affect recognition systems” as part of an “industry predicted to be worth more than seventeen billion dollars.” Crawford then takes on Ekman and his premise.

Thus we arrive at what we have suspected all along: current AI systems are more a tool of state power than anything else. The long arm of capital has reached once again into our personal lives, treating each of us like soft-flesh data controlled in a kind of nightmarish future where morsels of human-produced data (from living our lives) are fed continuously into a “diverse” and “widespread” apparatus as spectacular as ever in its “capability” to render the entire mirror world and its matrix. Presenting capital and military collusion in the context of a strong “nationalistic” agenda, Crawford returns to San Francisco, once a hub for gold extraction and now a central location for Big Tech companies (Facebook, Twitter, Google, Salesforce) where the dystopia of “work management” and “government control” is all too obvious. Military systems are now the working apparatus of municipal governments, she argues, further blurring the lines between states and subjects. As overlapping sectors share the same goals in the national interest, AI systems widen existing asymmetries of power. As long as AI remains the “new” tool of the military-corporate-industrial complex, one which protects capital and property while demanding exploitation of the third world, there is little hope for social change. The technology is too pervasive, too fast, and too privatized.

Yet Crawford politely leaves the reader on a positive note. She believes that as conditions on Earth change, presumably for the worse due to climate change and worker exploitation by Big Data capital, “calls for data protection, labor rights, climate justice and racial equity” will be heard, and heard together. They will become more organized and more overlapping; the mush of disfigured non-shapes that AI once roughly created as “art” will rise up to assume their most powerful figures yet, and the system will change.

A good read and a must-have on the shelf of anyone interested in AI, labor, and the environment: the most important crises of our day.