It is tempting when writing a book review to treat the book as a moment for reflection: on the author, the topic, the cultural moment. But when the central claim of the book in question is that we are all going to die horrible deaths if tech companies succeed at their current plans, the only question that matters is whether the book is correct about that claim.
“If Anyone Builds It, Everyone Dies,” a new book by Eliezer Yudkowsky and Nate Soares, argues that OpenAI and other major AI companies are, right now, trying to build AI that is smarter than humans at everything — and if they succeed, it will mean the end of life on Earth.
Yudkowsky, a onetime artificial intelligence researcher who is credited as an inspiration by many of his friends — and, surprisingly, by many of his enemies — has been arguing this point for almost two decades now. Soares, the president of the Machine Intelligence Research Institute, which Yudkowsky founded, did about half the work of turning Yudkowsky’s polarizing style and extraordinary verbosity into a readable work for a popular audience.1
There is a lot you can say about this book, to put it mildly.2 There is a lot you can say about Eliezer Yudkowsky as a person.3 There is even more you can say about the AI-world politics that led to CEOs claiming that they’re trying to build artificial superintelligence in the first place, and the changing AI-world politics that led them to recently shut up about it (without much changing their research programs).
But when we’re talking about whether or not we’re hurtling toward mass human extinction, I don’t think any of these topics are really worth my — or your — time.
The thing that a number of CEOs have told us they intend to do — build a “superintelligence” that surpasses humanity in every way — is, in fact, ludicrously dangerous. They are mostly escaping accountability precisely because the plan is so ludicrous that people don’t take it seriously enough to process it as dangerous.