“You have 18 months”
The real deadline isn’t when AI outsmarts us — it’s when we stop using our own minds.
In fitness, there is a concept called “time under tension.” Take a simple squat, where you hold a weight and lower your hips from a standing position. With the same weight, a person can do a squat in two seconds or 10 seconds. The latter is harder, but it also builds more muscle. More time is more tension; more pain is more gain.
Thinking benefits from a similar principle of “time under tension.” It is the ability to sit patiently with a group of barely connected or disconnected ideas that allows a thinker to braid them together into something that is combinatorially new. It’s very difficult to defend this idea by describing other people’s thought processes, so I’ll describe my own.
A few weeks ago, The Argument Editor-in-Chief Jerusalem Demsas asked me to write an essay about the claim that AI systems would take all of our jobs within 18 months. My initial reaction was … no?
The prediction is so stupendously aggressive, and so almost certainly wrong, that my instinct was that there was really nothing more to say on the subject. Certainly not 1,799 words more. But as I sat with the prompt, several pieces of a puzzle began to slide together: a Financial Times essay I’d read, an Atlantic article I liked, a National Assessment of Educational Progress study I’d saved in a tab, an interview with Cal Newport I’d recorded, a Walter Ong book I was encouraged to read, and a stray thought about how time multiplies both pain and gain that I’d had in the gym recently while trying eccentric pullups for the first time. The contours of a framework came into view.
The problem of the next 18 months isn’t AI disemploying all workers, or students losing competition after competition to nonhuman agents. The problem is whether we will degrade our own capabilities in the presence of new machines. We are so fixated on how technology will outskill us that we miss the many ways that we can deskill ourselves.
You have 18 months.
That’s the message from several leading AI executives and thinkers about how long people will retain their advantage over artificial intelligence in the workforce. By the summer of 2027, the story goes, AI’s explosion in capabilities will leave carbon-based life forms in the dust. Up to “half of all entry-level white-collar jobs” will be wiped out, and even Nobel Prize-worthy minds will cower in fear that AI’s architects will have built a “country of geniuses in a datacenter.”
This doomsday clock evidently seems plausible to many people, because the question I’ve fielded more than any other from parents in the last few months is some version of: “If AI is about to be better than us at everything, what should my children do?” If generative AI is better at coding, diagnosing, and problem-solving than any software programmer, radiologist, or mathematician, then even the traditionally “safe” majors like computer science, medicine, and math could be anything but safe.
I understand the anxiety behind the question, but rather than try to forecast the future as it might turn out, I’d prefer to describe reality as it already exists. While we have no idea how AI might make working people obsolete at some imaginary date, we can already see how technology is affecting our capacity to think deeply right now. And I am much more concerned about the decline of thinking people than I am about the rise of thinking machines.
The end of writing, the end of reading
In March, New York Magazine published the sort of cover story that goes instantly viral, not because of its shock value, but, quite the opposite, because it loudly proclaimed what most people were already thinking: Everybody is using AI to cheat in school.
By allowing high-school and college students to summon into existence any essay on any topic, large language models have created an existential crisis for teachers trying to evaluate their students’ ability to actually write. “College is just how well I can use ChatGPT at this point,” one student told New York Magazine. “Massive numbers of students are going to emerge from university with degrees, and into the workforce, who are essentially illiterate,” a professor echoed.
The demise of writing matters because writing is not a second thing that happens after thinking. The act of writing is an act of thinking. This is as true for professionals as it is for students. In “Writing is thinking,” an editorial in Nature, the authors argued that “outsourcing the entire writing process to LLMs” deprives scientists of the important work of understanding what they’ve discovered and why it matters.
Students, scientists, and anyone else who lets AI do the writing for them will find their screens full of words and their minds emptied of thought.
As writing skills have declined, reading has declined even more. “Most of our students are functionally illiterate,” a college professor writing under the pseudonym Hilarius Bookbinder observed in a March Substack essay on the state of college campuses. “This is not a joke.” Nor is it hyperbole.
Achievement scores in literacy and numeracy are declining across the West for the first time in decades, leading the Financial Times reporter John Burn-Murdoch to wonder if humans have “passed peak brain power” at the very moment that we are building machines to think for us. In the U.S., the National Assessment of Educational Progress, better known as the Nation’s Report Card, recently found that average reading scores hit a 32-year low in 2024 — which is troubling, since the data series only goes back 32 years.
Of course, Americans are reading words all the time: email, texts, social media newsfeeds, subtitles on Netflix shows. But these words live in fragments that hardly require the kind of sustained focus necessary to make sense of a larger text. Indeed, Americans in the digital age don’t seem interested in, or capable of, sitting with anything longer than a tweet. The share of Americans who say they read books for leisure has declined by nearly 40% since the 2000s.
Even America’s highest-performing students have essentially stopped reading anything longer than a paragraph. Last year, The Atlantic’s Rose Horowitch reported that students are matriculating at America’s most elite colleges without ever having read a full book for school. “Daniel Shore, the chair of Georgetown’s English department, told me that his students have trouble staying focused on even a sonnet,” Horowitch wrote.
Nat Malkus, an education researcher at the American Enterprise Institute, suggested to me that high schools have chunkified books to prepare students for the reading-comprehension sections of standardized exams. By optimizing for the assessment of reading skills, the U.S. education system appears to have accidentally killed book reading.
The decline of writing and reading matters because writing and reading are the twin pillars of deep thinking, according to Cal Newport, a computer science professor and the author of several bestselling books, including Deep Work. The modern economy prizes the sort of symbolic logic and systems thinking for which deep reading and writing are the best practice.
AI is “the latest in multiple heavyweight entrances into the prize fight against our ability to actually think,” Newport said. The rise of TV corresponded with the decline in per capita newspaper subscriptions and a slow demise of reading for pleasure. Then along came the internet, followed by social media, the smartphone, and streaming TV.
“The one-two punch of reading and writing is like the serum we have to take in a superhero comic book to gain the superpower of deep symbolic thinking,” Newport said. “And so I have been ringing this alarm bell that we have to keep taking the serum.”
Newport’s warning echoes an observation made by the scholar Walter Ong in his book “Orality and Literacy.” According to Ong, literacy is no passing skill; it is a means of restructuring human thought and knowledge to create space for complex ideas.
Stories can be memorized by people who cannot read or write. But nothing as advanced as, say, Newton’s “Principia” could be passed down from generation to generation without the ability to write down calculus formulas. Oral dialects commonly have only a few thousand words, while “the grapholect known as standard English has … at least a million and a half words,” Ong wrote. If reading and writing “rewired” the logic engine of the human brain, the decline of reading and writing is unwiring our cognitive superpower at the very moment that a greater machine appears to be on the horizon.
So what should our children study in an age of thinking machines? While I don’t know what field any particular student should major in, I do feel strongly about what skill they should value: It’s the very same skill that I see in decline. It’s the patience to read long and complex texts; to hold conflicting ideas in our heads and enjoy their dissonance; to engage in hand-to-hand combat at the sentence level within a piece of writing — and to value these things at a time when valuing them is a choice, because video entertainment is replacing reading and ChatGPT essays are replacing writing. As AI becomes abundant, there is a clear and present threat that deep human thinking will become scarce.