As you watch your classmates use AI for every corner of their lives—summarizing, annotating, drafting, thinking—you may feel a specific kind of demoralization set in. A sinking question forms: What does anything matter anymore? Is life now just a game of cheating the system efficiently? Is this where all the breathless hype about “the future” has landed us—an economy of shortcuts and plausible fraud?
High school student Ashanty Rosario feels this acutely. She gives voice to the heartbreak in her essay “I’m a High Schooler. AI Is Demolishing My Education,” a lament not about laziness but about loss. She doesn’t want to cheat. But the tools are everywhere, glowing like emergency exit signs in a burning building. Some temptations, she understands, are structural.
Her Exhibit A is devastating. A classmate uses ChatGPT to annotate Narrative of the Life of Frederick Douglass. These annotations—supposed proof of engaged reading—are nothing more than copy-paste edu-lard: high in calories, low in nutrition, and utterly empty of struggle. The form is there. The thinking is not.
Rosario’s frustration echoes a moment from my own classroom. On the last day of the semester, one of my brightest students sat in my office and casually admitted that he uses ChatGPT to summarize all his reading. His father is a professor. He wakes up at five for soccer practice. He takes business calculus for fun. This is not a slacker. This is a time-management pragmatist surviving the twenty-first century. He reads the summaries, synthesizes the ideas, and writes excellent essays. Of course I wish he spent slow hours wrestling with books—but he is not living in 1954. He is living in a culture where time is scarce and AI functions as an oxygen mask.
My daughters and their classmates face the same dilemma with Macbeth. Shakespeare’s language might as well be Martian for a generation raised on TikTok compression and dopamine drip-feeds. They watch film adaptations. They use AI to decode plot points so they can answer study questions without sounding like they slept through the Renaissance. Purists howl that this is cheating. But as a writing instructor, I suspect teachers benefit from students who at least know what’s happening—even if the knowledge arrives via chatbot. Expecting a fifteen-year-old to read Macbeth cold is like assigning tensor calculus to a preschooler. They haven’t built the priors. So AI becomes a prosthetic. A flotation device. A translation machine dropped into classrooms years overdue.
Blaming AI for educational decline is tempting—but it’s also lazy. We live in a society where reading is a luxury good and the leisure class quietly guards the gates.
In the 1970s, I graduated from a public high school with literacy skills so thin you could read the room through them. I took remedial English my freshman year of college. If I were a student today, dropped into 2025 with those same deficits, I would absolutely lean on AI just to keep my head above water. The difference now is scale. Today’s students aren’t just supplementing—they’re optimizing. They tell me this openly. Over ninety percent of my students use AI because their skills don’t match the workload and because everyone else is doing it. This isn’t a moral collapse. It’s an arms race of survival.
Still, Rosario is right about the aftermath. “AI has softened the consequences of procrastination,” she writes, “and led many students to avoid doing any work at all. There is little intensity anymore.” When thinking becomes optional, students drift into algorithmic sleepwalking. They outsource cognition until they resemble NPCs in a glitching video game—avatars performing the motions of thought without the effort. My colleagues and I see it every semester: the fade-out, the disengagement, the slow zombification.
Colleges are scrambling. Should we police AI with plagiarism detectors? Ban laptops? Force students to write essays in blue books under watchful eyes like parolees in a literary halfway house? Should we pretend the flood can be held back with a beach towel?
Reading Rosario’s complaint about “cookie-cutter AI arguments,” I thought of my lone visit to Applebee’s in the early 2000s. The menu photos promised ambrosia. The food tasted like something engineered in a lab to be technically edible yet spiritually vacant. Applebee’s was AI before AI—an assembly line of flavorless simulacra. Humanity has always gravitated toward the easy, the prepackaged, the frictionless. AI didn’t invent mediocrity. It just handed it a megaphone.
Rosario is no Applebee’s soul. She’s Michelin-level in a world eager to microwave Hot Pockets. Of course her heart sinks when classmates settle for fast-food literacy. I want to tell her this: had she been in high school in the 1970s, she would have witnessed the same hunger for shortcuts. The tools would have been clumsier. The prose less polished. But the gravitational pull would have been identical. The urge to bypass difficulty is not technological—it’s ancestral.
What’s new is scale and speed. In the AI age, that ancient hunger is supercharged by what I call the Mediocrity Amplification Effect: the phenomenon by which AI accelerates and magnifies our long-standing temptation to dilute effort and settle for the minimally sufficient. Under this effect, tools meant to assist learning become accelerants of shortcut culture. Procrastination carries fewer consequences. Intensity drains away. Thinking becomes optional.
This is not a new moral failure. It is an old one, industrialized—private compromise transformed into public default, mediocrity polished, normalized, and broadcast at scale. AI doesn’t make us lazy. It makes laziness louder.
