From an article in The New York Review of Books on AI's costs and shortcomings:
As Kate Crawford’s trenchant Atlas of AI demonstrates again and again, artificial intelligence does not come to us as a deus ex machina but, rather, through a number of dehumanizing extractive practices, of which most of us are unaware. Crawford, a senior researcher at Microsoft and a cofounder of the AI Now Institute at NYU, begins her tour of the AI universe in Silver Peak, Nevada, looking at the “open, iridescent green ponds” of brine pumped out of North America’s largest lithium mine. Lithium—the “li” in “li-ion” batteries—is an essential ingredient in our digital lives. Without it there are no laptop computers, no smartwatches, no cell phones.
“The term ‘artificial intelligence’ may invoke ideas of algorithms, data, and cloud architectures,” Crawford writes, “but none of that can function without the minerals and resources that build computing’s core components.” She adds:
Many aspects of modern life have been moved to “the cloud” with little consideration of these material costs. Our work and personal lives, our medical histories, our leisure time, our entertainment, our political interests—all of this takes place in the world of networked computing architectures that we tap into from devices we hold in one hand, with lithium at their core.
Calling those networked computers “the cloud” is a perfect example of what Crawford sees as “the strategic amnesia that accompanies stories of technological progress.” While the metaphor evokes an image of data floating weightlessly in the sky, the reality is that the cloud takes up hundreds of thousands of acres of terrestrial real estate, typically located where electricity is cheap. (As of 2018, the world’s largest data center, in Langfang, China, covered 6.3 million square feet, the equivalent of roughly 110 football fields.) Cheap, of course, is a relative term. A study by researchers at McMaster University found that, if left unchecked, the computing industry as a whole could account for 14 percent of all greenhouse gas emissions by 2040—“about half of the entire transportation sector worldwide.”
Some of this carbon intensity has been driven by the belief that ever-bigger datasets are essential to training the machine learning algorithms behind workable AI systems. (Machine learning is a kind of artificial intelligence in which algorithms sort through enormous amounts of data, using statistical methods to make classifications and predictions; the assumption is that more data delivers more accurate outcomes.) When researchers from the University of Massachusetts Amherst calculated the carbon emissions required to build and train a single natural language processing system—which teaches computers to interpret and use everyday language—they arrived at a figure around five times the lifetime emissions of the average American car.
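(For a rough sense of the arithmetic behind that comparison: the underlying study, Strubell et al. 2019, estimated the most expensive model it measured, a large Transformer trained with neural architecture search, at roughly 626,000 lbs of CO2-equivalent, against about 126,000 lbs for the average American car over its lifetime, fuel included; 626,000 ÷ 126,000 ≈ 5. These specific figures come from the study itself, not from the review.)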