In his bracing essay “Colleges Are Preparing to Self-Lobotomize,” Michael Clune accuses higher education of handling AI with the institutional equivalent of a drunk wielding a chainsaw. The subtitle gives away the game: the skills that students will need in an age of automation are precisely the ones being eroded by inserting AI into the educational process. Colleges, Clune argues, spent the first three years of generative AI sitting on their hands. Now they are overcorrecting in a frenzy, embedding AI everywhere as if saturation were the same thing as competence. It isn’t. It’s panic dressed up as innovation.
The prevailing assumption seems to be that if AI is everywhere, mastery will somehow emerge by osmosis. But what’s actually happening is the opposite. Colleges are training students to rely on frictionless services while neglecting the very capacities that make AI usable in any meaningful way: critical thinking, the ability to learn new things, and flexible modes of analysis. The tools are getting smarter; the users are getting thinner.
Clune faces a genuine rhetorical problem. He keeps insisting that we need abstractions—critical thinking, intellectual flexibility, judgment—but we live in a culture that has been trained to distrust anything abstract. Telling a screen-saturated society to imagine thinking outside screens is like telling a fish to imagine life outside water. The first task, then, is not instruction but translation: What is critical thinking, and how do you sell it to people addicted to immediate, AI-generated results?
The fish analogy holds. A fish is aquatic; water is not a preference but a prison. A young person raised entirely on screens, prompts, and optimization tools treats that ecosystem as reality itself. Like the fish, they are confined to a single environment. We can name this condition precisely. They are cognitively outsourced, trained to delegate thinking to machines as if this were normal or healthy. They are algovorous, endlessly stimulated by algorithms that quietly erode attention and resilience. They are digitally obligate, unable to function without mediation. By definition, these orientations exclude critical thinking. They produce people who are functional inside digital systems and dysfunctional everywhere else.
Drop such a person into a college that recklessly embeds AI into every course in the name of being “future-proof,” and you send them into the job market as a fragile, narrow organism. In some fields, they will be unemployable. Clune points to a telling statistic: history majors currently have an unemployment rate roughly half that of recent computer science graduates. The implication is brutal. Liberal arts training produces adaptability. Coding alone does not. As the New York Times put it in a headline Clune cites, “Goodbye, $165,000 Tech Jobs. Student Coders Seek Work at Chipotle.” AI is replacing coders. A life spent inside a tiny digital ecosystem does not prepare you for a world that mutates.
Is AI the cause of this dysfunction? No. The damage was done long before ChatGPT arrived. I use AI constantly, and I enjoy it. It sharpens my curiosity. It helps me test ideas. It makes me smarter because I am not trapped inside it. I have a life beyond screens. I have read thousands of books. I can zoom in and out—trees and forest—without panic. I have language for my inner life, which means I can diagnose myself when I become maudlin, entropic, dissolute, misanthropic, lugubrious, or vainglorious. I have history, philosophy, and religion as reference points. All of this adds up to what we lazily call “critical thinking,” but what it really means is being fully human.
Someone who has outsourced thought and imagination from childhood cannot suddenly use AI well. They are neither liberated nor empowered. They are brittle, dependent, and easily replaced.
Because I am a lifelong weightlifter, I’ll offer a more concrete analogy. AI is a massive, state-of-the-art gym: barbells, dumbbells, Smith machines, hack squats, leg presses, lat pulldowns, pec decks, cable rows, preacher curls—the works. Now imagine you’ve never trained before. You’re twenty-eight, inspired by Instagram physiques, and vaguely determined to “get in shape.” You walk into this cathedral of iron with no plan, no understanding of hypertrophy, recovery, protein intake, progressive overload, or long-term discipline. You are surrounded by equipment, but you are lost. Within a month, you will quit. You’ll join the annual migration of January optimists who vanish by February, leaving the gym once again to the regulars.
AI is that gym. It will eject most users. Not because it is hostile, but because it demands capacities they never developed. Some people will learn isolated tasks—prompting here, automating there—but only in the way someone learns to push a toaster lever. These tasks should not define a human being. When they do, the result is a Non-Player Character, an NPC: reactive, scripted, interchangeable.
Young people already understand what an NPC is. That’s why they fear becoming one.
If colleges recklessly embed AI into every corner of the curriculum, they are not educating thinkers. They are manufacturing NPCs. And for that, they deserve public shame.