Category: Education in the AI Age

  • Transactional Transformation Fallacy

    Transactional Transformation Fallacy

    noun

    The Transactional Transformation Fallacy is the belief that personal change can be purchased rather than practiced. It treats growth as a commercial exchange: pay the fee, swipe the card, enroll in the program, and improvement will arrive as a deliverable. Effort becomes optional, discipline a quaint accessory. In this logic, money substitutes for resolve, proximity replaces participation, and the hard interior work of becoming someone else is quietly delegated to a service provider. It is a comforting fantasy, and a profitable one, because it promises results without inconvenience.

    ***

    I once had a student who worked as a personal trainer. She earned decent money, but she disliked the job for reasons that had nothing to do with exercise science and everything to do with human nature. Her clients were not untrained so much as uncommitted. She gave them solid programs, explained the movements, laid out sensible menus, and checked in faithfully. Then she watched them vanish between sessions. They skipped the workouts she assigned for the days between appointments. They treated nutrition guidelines as aspirational literature. They arrived at the gym exhaling whiskey and nicotine, their pores broadcasting last night’s bad decisions like a public service announcement. They paid her, showed up once or twice a week, and mistook attendance for effort. Many were lonely. Others liked telling friends they “had a trainer,” as if that phrase itself conferred seriousness, discipline, or physical virtue. They believed that money applied to a problem was the same thing as resolve applied to a life.

    The analogy to college is unavoidable. If a student enters higher education with the same mindset—pay tuition, outsource thinking to AI, submit algorithmically polished assignments, and expect to emerge transformed—they are operating squarely within the Transactional Transformation Fallacy. They imagine education as a vending machine: insert payment, press degree, receive wisdom. Like the Scarecrow awaiting his brain from the Wizard of Oz, they expect character and intelligence to be bestowed rather than built. This fantasy has always haunted consumer culture, but AI supercharges it by making the illusion briefly convincing. The greatest challenge facing higher education in the years ahead will not be cheating per se, but this deeper delusion: the belief that knowledge, discipline, and selfhood can be bought wholesale, without friction, struggle, or sustained effort.

  • Gollumification

    Gollumification

    noun

    Gollumification names the slow moral and cognitive decay that occurs when a person repeatedly chooses convenience over effort and optimization over growth. It is what happens when tools designed to assist quietly replace the very capacities they were meant to strengthen. Like Tolkien’s Gollum, the subject does not collapse all at once; he withers incrementally, outsourcing judgment, agency, and struggle until what remains is a hunched creature guarding shortcuts and muttering justifications. Gollumification is not a story about evil intentions. It is a story about small evasions practiced daily until the self grows thin, brittle, and dependent.

    ***

    Washington Post writer Joanna Slater reports in “Professors Are Turning to This Old-School Method to Stop AI Use on Exams” that some instructors are abandoning written exams in favor of oral ones, forcing students to demonstrate what they actually know without the benefit of algorithmic ventriloquism. At the University of Wyoming, religious studies professor Catherine Hartmann now seats students in her office and questions them directly, Socratic-style, with no digital intermediaries to run interference. Her rationale is blunt and bracing. Using AI on exams, she tells students, is like bringing a forklift to the gym when your goal is to build muscle. “The classroom is a gymnasium,” she explains. “I am your personal trainer. I want you to lift the weights.” Hartmann is not being punitive; she is being realistic about human psychology. Given a way to cheat ourselves out of effort—or out of a meaningful life—we will take it, not because we are corrupt, but because we are wired to conserve energy. That instinct once helped us survive. Now it quietly betrays us. A cheated education becomes a squandered one, and a squandered life does not merely stagnate; it decays. This is how Gollumification begins: not with villainy, but with avoidance.

    I agree entirely with Hartmann’s impulse, even if my method would differ. I would require students to make a fifteen-minute YouTube video in which they deliver their argument as a formal speech. I know from experience that translating a written argument into an oral one exposes every hollow sentence and every borrowed idea. The mind has nowhere to hide when it must speak coherently, in sequence, under the pressure of time and presence. Oral essays force students to metabolize their thinking instead of laundering it through a machine. They are a way of banning forklifts from the gym—not out of nostalgia, but out of respect for the human organism. If education is meant to strengthen rather than simulate intelligence, then forcing students to lift their own cognitive weight is not cruelty. It is preventive medicine against the slow, tragic, and all-too-modern disease of Gollumification.

  • Discretionary Use Principle

    Discretionary Use Principle

    The Discretionary Use Principle begins with a simple but demanding claim: tools are not inherently good or bad, but they become harmful when used without judgment, proportion, or purpose. Whether we are talking about food, technology, or media, the decisive factor is not purity but discretion—our ability to choose deliberately rather than reflexively. The principle rejects both absolutism and indulgence. It argues instead for a calibrated life, one that privileges nourishment over stimulation, depth over convenience, while still recognizing that modern life occasionally requires shortcuts. This framework is especially useful when thinking about analog versus digital living, where moralized categories often replace careful thinking.

    It is wise to carve out a large, non-negotiable block of each day in which machines are politely but firmly excluded—no screens glowing like anxious faces, no notifications tugging at your sleeve, no algorithm whispering what to want next. Go hike where the trail refuses to optimize itself. Lift weights in a garage with nothing but an AM radio crackling like a distant campfire. Write dreams and grievances by hand in a clothbound notebook while Bach or Coltrane keeps time. This is the analog world, and it feeds parts of the nervous system that silicon cannot reach. In this sense, analog living resembles whole foods: salmon that still tastes like water and muscle, almonds that require chewing, blueberries that stain your fingers. The more time you spend here, the less bloated your spirit becomes. Digital life, by contrast, often behaves like ultra-processed food: frictionless, hyper-palatable, engineered for compulsive return, and strangely unsatisfying no matter how much you consume.

    That analogy works—until it doesn’t. Not all analog living is virtuous, just as not all “whole foods” are benign when eaten without restraint. A steady diet of eggs, clotted cream, or beef heart can quietly undo you. Likewise, not all digital experience is junk. There are serious conversations on social platforms, lucid Substack essays, and educational YouTube channels that sharpen rather than dull attention. The mistake comes when we moralize categories instead of exercising judgment. Ultra-processed food is not a single moral villain; “processed” names a method, not a fate. Steel-cut oats, frozen berries, tofu, canned beans, and whole-grain bread are processed and still nutritionally intact. Even within the ultra-processed aisle, a minimally sweetened protein bar is not the same organism as a fluorescent snack cake designed to bypass satiety. The real danger is not processing itself but the familiar cartel of refined starches, added sugars, industrial fats, flavor engineering, and low nutritional payoff.

    Seen through the Discretionary Use Principle, the lesson is neither to flee the digital world nor to surrender to it. Eat whole foods most of the time. Live analog for long, uninterrupted stretches. But do not shun all processed foods or digital tools out of misplaced virtue. Use them when discretion, efficiency, and purpose demand it. Health—nutritional or psychological—is not preserved by purity tests. It is preserved by attentiveness, proportion, and the ongoing discipline of choosing nourishment over convenience, again and again, without pretending that the choice will ever be automatic.

  • Hyper-Efficiency Intoxication Will Change Higher Learning Forever

    Hyper-Efficiency Intoxication Will Change Higher Learning Forever

    Hyper-Efficiency Intoxication

    noun

    The dopamine-laced rush that occurs when AI collapses hours of cognitive labor into seconds, training the brain to mistake speed for intelligence and output for understanding. Hyper-Efficiency Intoxication sets in when the immediate relief of reclaimed time—skipped readings, instant summaries, frictionless drafts—feels so rewarding that slow thinking begins to register as needless suffering. What hooks the user is not insight but velocity: the sense of winning back life from effort itself. Over time, this chemical high reshapes judgment, making sustained attention feel punitive, depth feel inefficient, and authorship feel optional. Under its influence, students do not stop working; they subtly downgrade their role—from thinker to coordinator, from writer to project manager—until thinking itself fades into oversight. Hyper-Efficiency Intoxication does not announce itself as decline; it arrives disguised as optimization, quietly hollowing out the very capacities education once existed to build.

    ***

    No sane college instructor assigns an essay anymore under the illusion that you’ll heroically wrestle with ideas while AI politely waits in the hallway. We all know what happens: a prompt goes in, a glossy corpse comes out. The charade has become so blatant that even professors who once treated AI like a passing fad are now rubbing their eyes and admitting the obvious. Hua Hsu names the moment plainly in his essay “What Happens After A.I. Destroys College Writing?”: the traditional take-home essay is circling the drain, and higher education is being forced to explain—perhaps for the first time in decades—what it’s actually for.

    The problem isn’t that students are morally bankrupt. It’s that they’re brutally rational. The real difference between “doing the assignment” and “using AI” isn’t ethics; it’s time. Time is the most honest currency in your life. Ten hours grinding through a biography means ten hours you’re not at a party, a game, a date, or a job. Ten minutes with an AI summary buys you your evening back. Faced with that math, almost everyone chooses the shortcut—not because they’re dishonest, but because they live in the real world. This isn’t cheating; it’s survival economics.

    Then there’s the arms race. Your classmates are using AI. All of them. Competing against them without AI is like entering a bodybuilding contest while everyone else is juiced to the gills and you’re proudly “all natural.” You won’t be virtuous; you’ll be humiliated. Fairness collapses the moment one side upgrades, and pretending otherwise is naïve at best.

    AI also hooks you. Hsu admits that after a few uses of ChatGPT, he felt the “intoxication of hyper-efficiency.” That’s not a metaphor—it’s a chemical event. When a machine collapses hours of effort into seconds, your brain lights up like it just won a small lottery. The rush isn’t insight; it’s velocity. And once you’ve tasted that speed, slowness starts to feel like punishment.

    Writing instructors, finally awake, are adapting. Take-home essays are being replaced by in-class writing, blue books, and passage identification exams—formats designed to drag thinking back into the room and away from the cloud. These methods reward students who’ve spent years reading and writing the hard way. But for students who entered high school in 2022 or later—students raised on AI scaffolding—this shift feels like being dropped into deep water without a life vest. Many respond rationally: they avoid instructors who demand in-class thinking.

    Over time, something subtle happens. You don’t stop working; you change roles. You become, in Hsu’s phrase, a project manager—someone who coordinates machines rather than generating ideas. You collaborate, prompt, tweak, and oversee. And at some point, no one—not you, not your professor—can say precisely when the thinking stopped being yours. There is no clean border crossing, only a gradual fade.

    Institutions are paralyzed by this reality. Do they accept the transformation and train students to be elite project managers of knowledge? Or do they try to resurrect an older model of literacy, pretending that time, incentives, and technology haven’t changed? Neither option is comfortable, and both expose how fragile the old justifications for college have become.

    From the educator’s chair, the nightmare scenario is obvious. If AI can train competent project managers for coding, nursing, physical therapy, or business, why not skip college altogether? Why not certify skills directly? Why not let employers handle training in-house? It would be faster, cheaper, and brutally efficient.

    And efficiency always wins. When speed, convenience, and cost savings line up, they don’t politely coexist with tradition—they bulldoze it. AI doesn’t argue with the old vision of education. It replaces it. The question is no longer whether college will change, but whether it can explain why learning should be slower, harder, and less efficient than the machines insist it needs to be.

  • Algorithmic Infotainment Drift

    Algorithmic Infotainment Drift

    noun

    Algorithmic Infotainment Drift refers to the contemporary condition in which narrative forms—films, shows, lectures, even ideas themselves—quietly abandon storytelling and inquiry in favor of algorithm-friendly spectacle and aspirational marketing, masquerading as content. Under this drift, plot becomes a thin pretext for visual bait, characters exist to model bodies, lifestyles, or attitudes, and scenes function like clickable thumbnails optimized to trigger envy, desire, or self-loathing rather than thought. What appears to be entertainment is in fact an influencer ecosystem in disguise, where the viewer is nudged not to reflect but to compare, consume, and Google diets mid-scene. The danger is not merely aesthetic but cognitive: audiences, especially students, are trained to expect meaning without effort, stimulation without depth, and authority without rigor—conditions that erode sustained attention, flatten intellectual struggle, and make higher learning feel obsolete next to the frictionless dopamine loop of the feed.

    ***

    I decided to relaunch my bodybuilding ambitions in my sixties the way all serious men do: by watching Road House. This reboot stars a Jake Gyllenhaal so aggressively sculpted he looks less like an actor and more like a marble warning label—Michelangelo with a protein sponsor. He plays a drifting barroom enforcer in Key West, a man whose résumé consists entirely of fists and moral clarity. His job is to protect a beachside dive and its plucky owner (Jessica Williams) from corrupt local heavies, which naturally culminates in a showdown with Conor McGregor, who appears to have been marinated in rage, creatine, and whatever substances were banned three agencies ago. McGregor doesn’t so much act as vibrate menacingly, like a loose chainsaw wrapped in tattoos.

    The plot, such as it exists, is thinner than dental floss. It’s a Western with tank tops: a stranger rides into town, punches everyone who deserves it, and restores order through upper-body hypertrophy. But the story is a courtesy gesture. The real point is flesh. The camera caresses delts, glides lovingly over abs, and pauses reverently on veins like a pilgrim at a shrine. This isn’t cinema; it’s a two-hour sizzle reel for protein powder, creatine, and injectable optimism. The fights feel less choreographed than sponsored. Somewhere, a supplement brand is climaxing.

    Midway through, I realized I wasn’t watching a movie—I was undergoing a comparison audit. I reached for my phone, not to check messages, but to Google “Conor McGregor diet,” as one does when confronted with the existential horror of your own carb intake. Road House doesn’t invite immersion; it invites self-loathing. It’s not entertainment so much as a glossy intervention: a reminder that you are one donut away from structural collapse while these men are carved from imported stone.

    When the credits rolled, something clarifying settled in. We no longer tell stories; we stage aspirations. Movies have become influencer decks with dialogue—Algorithmic Infotainment Drift in its purest form. Narrative is now a delivery system for vibes, bodies, and monetizable fantasy. We don’t object because we’re trained not to. The film doesn’t pretend to mean anything; it just wants to convert you—into admiration, envy, and eventually consumption.

    This drift matters, especially in higher education, because it retrains the mind. Students steeped in this culture come to expect knowledge the way Road House delivers plot: fast, polished, emotionally pre-optimized, and free of resistance. Sustained attention gets replaced by binge reflexes. Analysis gives way to vibes. Long arguments feel offensive. Ambiguity feels like a bug. And authority quietly shifts from expertise to whatever the algorithm spotlights this week. Knowledge becomes something you scroll past, not wrestle with. The result isn’t ignorance—it’s fragility. A generation trained to consume meaning, not make it, flexing hard in a world that requires endurance.

  • Screen Bilinguals and Screen Natives

    Screen Bilinguals and Screen Natives

    Screen Bilinguals

    noun

    Screen Bilinguals are those who remember Pre-Screen Life and Post-Screen Life and can mentally translate between the two. They know what it felt like to disappear into a book without notifications, to wander outdoors without documenting the evidence, and to experience friendship without performance. They may use screens constantly now, but they retain an embodied memory of undistracted attention and uncurated presence. That memory gives them perspective—and often a quiet grief.

    Screen Natives

    noun

    Screen Natives are those who never lived outside the Attention Economy. They have no experiential baseline for pre-digital reading, boredom, or intimacy. For them, screens are not tools but atmosphere. Experience arrives already framed, shareable, and optimizable. Connection is inseparable from capture, and attention has always been contested territory. What Screen Bilinguals experience as loss, Screen Natives experience as reality itself—neither chosen nor questioned, simply inherited.

    ***

    I am reasonably sure that some of the best memories of my pre-screen adolescence would not survive contact with smartphones and social media. They required a kind of reckless presence that today’s technology quietly sabotages. Every summer from 1975 to 1979, my family—along with ten others—made a pilgrimage to Point Reyes Beach, where the Johnsons’ oyster farm supplied what appeared to be bottomless truck beds of shellfish. From noon until sunset, hundreds of us devoured obscene quantities of barbecued oysters dripping with garlic butter and Tabasco, flanked by thousands of loaves of garlic bread and slabs of chocolate cake so moist they bordered on indecent. Ignoring cheerful warnings about nearby great white sightings, we periodically sprinted into the Pacific, then staggered back to the picnic tables, pecs gleaming with saltwater, to resume eating like mythological beings. In the summer of ’78, I told my parents to leave without me and caught a ride home in the bed of a stranger’s truck. Stuffed beyond reason, convinced I was some minor sea god, I lay under the stars with a gang of people I’d met hours earlier, trading delirious stories and watching the universe spin. No one documented a thing. We didn’t track calories, curate moments, or worry about time. Life simply happened to us, and that was enough.

    Those memories now trouble me. Were they the accidental privilege of being screen-bilingual—raised before devices trained us to perform our lives in public? Does being a screen native quietly thin experience itself by insisting everything be captured, filtered, and offered up for consumption? Free from the reflex to mediate, I could disappear into the moment without irony or self-surveillance. Had I grown up with screens, the day would have demanded angles, captions, and metrics. The magic would have curdled under the pressure to perform. The idea that every experience must double as content strikes me as a curse—a low-grade exile from real life, where spontaneity dies not from malice but from documentation.

  • Academic Anhedonia: A Tale in 3 Parts

    Academic Anhedonia: A Tale in 3 Parts

    Academic Anhedonia

    noun

    Academic Anhedonia is the condition in which students retain the ability to do school but lose the capacity to feel anything about it. Assignments are completed, boxes are checked, credentials are pursued, yet curiosity never lights up and satisfaction never arrives. Learning no longer produces pleasure, pride, or even frustration—just a flat neurological neutrality. These students aren’t rebellious or disengaged; they’re compliant and hollow, moving through coursework like factory testers pressing buttons to confirm the machine still turns on. Years of algorithmic overstimulation, pandemic detachment, and frictionless AI assistance have numbed the internal reward system that once made discovery feel electric. The result is a classroom full of quiet efficiency and emotional frost: cognition without appetite, performance without investment, education stripped of its pulse.

    ***

    I started teaching college writing in the 80s under the delusion that I was destined to be the David Letterman of higher education—a twenty-five-year-old ham with a chalkboard, half-professor and half–late-night stand-up. For a while, the act actually worked. A well-timed deadpan joke could mesmerize a room of eighteen-year-olds and soften their outrage when I saddled them with catastrophically ill-chosen books (Ron Rosenbaum’s Explaining Hitler—a misfire so spectacular it deserves its own apology tour). My stories carried the class, and for decades I thought the laughter was evidence of learning. If I could entertain them, I told myself, I could teach them.

    Then 2012 hit like a change in atmospheric pressure. Engagement thinned. Phones glowed. Students behaved as though they were starring in their own prestige drama, and my classroom was merely a poorly lit set. I was no longer battling boredom—I was competing with the algorithm. This was the era of screen-mediated youth, the 2010–2021 cohort raised on the oxygen of performance. Their identities were curated in Instagram grids, maintained through Snapstreaks, and measured in TikTok microfame points. The students were not apathetic; they were overstimulated. Their emotional bandwidth was spent on self-presentation, comparison loops, and the endless scoreboard of online life. They were exhausted but wired, longing for authenticity yet addicted to applause. I felt my own attention-capture lose potency, but I still recognized those students. They were distracted, yes, but still alive.

    But in 2025, we face a darker beast: the academically anhedonic student. The screen-mediated generation ran hot; this one runs cold. Around 2022, a new condition surfaced—a collapse of the internal reward system that makes learning feel good, or at least worthwhile. Years of over-curation, pandemic detachment, frictionless AI answers, and dopamine-dense apps hollowed out the very circuits that spark curiosity. This isn’t laziness; it’s a neurological shrug. These students can perform the motions—fill in a template, complete a scaffold, assemble an essay like a flat-pack bookshelf—but they move through the work like sleepwalkers. Their curiosity is muted. Their persistence is brittle. Their critical thinking arrives pre-flattened. 

    My colleagues tell me their classrooms are filled with compliant but joyless learners checking boxes on their march toward a credential. The Before-Times students wrestled with ideas. The After-Times students drift through them without contact. It breaks our hearts because the contrast is stark: what was once noisy and performative has gone silent. Academic anhedonia names that silence—a crisis not of ability, but of feeling.

  • Algovorous

    Algovorous

    Algovorous

    adjective

    Characterized by habitual consumption of algorithmically curated stimuli that prioritize engagement over nourishment. An algovorous person feeds continuously on feeds, prompts, and recommendations, mistaking stimulation for insight. Attention erodes, resilience weakens, and depth is displaced by endless, low-friction intake.

    ***

    You don’t know any other world because you were born inside the Attention Economy. There was no “before” for you—no baseline against which to compare the glow of screens to a quieter, unmonetized mental life. So let me tell you something grim about the system you’ve inherited: it runs on engagement at all costs. Not truth. Not wisdom. Not even pleasure in any deep sense. Just engagement. As Jaron Lanier warns in Ten Arguments for Deleting Your Social Media Accounts Right Now, the economy works best when it bypasses your higher faculties and plugs directly into the brain’s most primitive circuitry. This is not the part of you that reasons, imagines, or aspires. It’s the reptile brain—the swampy basement where jealousy, envy, FOMO, and schadenfreude slosh around, waiting to be poked with a stick. Stimulate that region long enough and you don’t become thoughtful or fulfilled. You become reactive, agitated, and strangely hungry for more of the same poison.

    The platforms know this. A successful YouTuber doesn’t need insight; he needs targets. Hence the booming genre of downfall porn: endless autopsies of other people’s collapses. Take bodybuilding YouTube, a carnival of oiled torsos and moral rot. Greg Doucette, with his two-and-a-half million subscribers, has perfected the form. His brand is not training wisdom so much as public execution. He thrives on predicting the imminent demise of rival influencers, especially Mike Israetel, whose Renaissance Periodization channel—approaching four million subscribers—shows no interest in collapsing on schedule. That hasn’t stopped Doucette from announcing the funeral. He does it in a tank top, veins bulging, traps flared, voice pitched to a squeaky fury, filleting his subjects like a caffeinated fishmonger. The performance is manic, theatrical, and wildly successful. Rage, it turns out, scales beautifully.

    I’m not a psychiatrist, but you don’t need a medical degree to recognize a toxic loop when you see one. Mental health professionals would likely agree: this is dopamine farming. The audience gets a chemical jolt from watching others stumble while doing nothing to improve their own lives. It’s adrenaline for the bored, envy with a subscription button. In the Attention Economy, toxicity isn’t a bug—it’s the feature. The viewer doesn’t flourish; the algorithm does. You sit there, immobilized, a butterfly pinned to corkboard entertainment, while someone else’s revenue graph climbs. That is the deal on offer: your attention in exchange for distraction from the harder work of becoming a person.

  • Humanification

    Humanification

    It is not my job to indoctrinate you into a political party, a philosophical sect, or a religious creed. I am not here to recruit. But it is my job to indoctrinate you about something—namely, how to think, why thinking matters, and what happens when you decide it doesn’t. I have an obligation to give you a language for understanding critical thinking and the dangers of surrendering it, a framework for recognizing the difference between a meaningful life and a comfortable one, and the warning signs that appear when convenience, short-term gratification, and ego begin quietly eating away at the soul. Some of you believe life is a high-stakes struggle over who you become. Others suspect the stakes are lower. A few—regrettably—flirt with nihilism and conclude there are no stakes at all. But whether you dramatize it or dismiss it, the “battle of the soul” is unavoidable. I teach it because I am not a vocational trainer turning you into a product. I am a teacher in the full, unfashionable sense of the word—even if many would prefer I weren’t.

    This battle became impossible to ignore when I returned to the classroom after the pandemic and met ChatGPT. On one side stood Ozempification: the seductive shortcut. It promises results without struggle, achievement without formation, output without growth. Why wrestle with ideas when a machine can spit out something passable in seconds? It’s academic fast food—calorie-dense, spiritually empty, and aggressively marketed. Excellence becomes optional. Effort becomes suspicious. Netflix beckons. On the other side stood Humanification: the old, brutal path that Frederick Douglass knew by heart. Literacy as liberation. Difficulty as transformation. Meaning earned the hard way. Cal Newport calls it deep work. Jordan Peele gives it a name—the escape from the Sunken Place. Humanification doesn’t chase comfort; it chases depth. The reward isn’t ease. It’s becoming someone.

    Tyler Austin Harper’s essay “ChatGPT Doesn’t Have to Ruin College” captures this split perfectly. Wandering Haverford’s manicured campus, he encounters English majors who treat ChatGPT not as a convenience but as a moral hazard. They recoil from it. “I prefer not to,” Bartleby-style. Their refusal is not naïveté; it’s identity. Writing, for them, is not a means to a credential but an act of fidelity—to language, to craft, to selfhood. But Harper doesn’t let this romanticism off the hook. He reminds us, sharply, that honor and curiosity are not evenly distributed virtues. They are nurtured—or crushed—by circumstance.

    That line stopped me cold. Was I guilty of preaching Humanification without acknowledging its price tag? Douglass pursued literacy under threat of death, but he is a hero precisely because he is rare. We cannot build an educational system that assumes heroic resistance as the norm. Especially not when the very architects of our digital dystopia send their own children to screen-free Waldorf schools, where cursive handwriting and root vegetables are treated like endangered species. The tech elite protect their children from the technologies they profit from. Everyone else gets dopamine.

    I often tell students this uncomfortable truth: it is easier to be an intellectual if you are rich. Wealth buys time, safety, and the freedom to fail beautifully. You can disappear to a cabin, read Dostoevsky, learn Schubert, and return enlightened. Most students don’t have that option. Harper is right—institutions like Haverford make Humanification easier. Small classes. Ample support. Unhurried faculty. But most students live elsewhere. My wife teaches in public schools where buildings leak, teachers sleep in cars, and safety is not guaranteed. Asking students in survival mode to honor an abstract code of intellectual purity borders on insult.

    Maslow understood this long ago. Self-actualization comes after food, shelter, and security. It’s hard to care about literary integrity when you’re exhausted, underpaid, and anxious. Which is why the Ozempic analogy matters. Just as expensive GLP-1 drugs make discipline easier for some bodies, elite educational environments make intellectual virtue easier for some minds. Character still matters—but it is never the whole story.

    Harper complicates things further by comparing Haverford to Stanford. At Stanford, honor codes collapse under scale; proctoring becomes necessary. Intimacy, not virtue alone, sustains integrity. Haverford begins to look less like a model and more like a museum—beautiful, instructive, and increasingly inaccessible. The humanities survive there behind velvet ropes.

    I teach at a community college. My students are training for nursing, engineering, business. They work multiple jobs. They sleep six hours if they’re lucky. They don’t have the luxury to marinate in ideas. Humanification gets respectful nods in class discussions, but Ozempification pays the rent. And pretending otherwise helps no one.

    This is the reckoning. We cannot shame students for using AI when AI is triage, not indulgence. But we also cannot pretend that a life optimized for convenience leads anywhere worth going. The challenge ahead is not to canonize the Humanified or condemn the Ozempified. It is to build an educational culture where aspiration is not a luxury good—where depth is possible without privilege, and where using AI does not require selling your soul for efficiency.

    That is the real battle. And it’s one we can’t afford to fight dishonestly.

  • Ozempification: A Cautionary Tale

    Ozempification: A Cautionary Tale

The 2025 Los Angeles wildfires, blazing with apocalyptic fury, prompted me to do something I hadn’t done in years: dust off one of my radios and tune into live local news. The live broadcast brought with it not just updates but an epiphany. Two epiphanies, in fact. First, I realized that deep down, I despise my streaming devices—their algorithm-driven content is like an endless conveyor belt of lukewarm leftovers, a numbing backdrop of music and chatter that feels canned, impersonal, and incurably distant. Worst of all, these devices have pushed me into a solipsistic bubble, a navel-gazing universe where I am the sole inhabitant. Streaming had turned my listening into an isolating, insidious form of solitary confinement, and I hadn’t even noticed.

The second epiphany arrived when I flipped on the radio in my kitchen: the warmth of its live immediacy hit me like a long-lost friend. My heart ached as memories of radio’s golden touch from my youth came flooding back. As a nine-year-old, after watching Diahann Carroll in Julia and Sally Field in The Flying Nun, I’d crawl into bed, armed with my trusty transistor radio and earbuds, ready for the night to truly begin. Tuned to KFRC 610 AM, I’d be transported into the shimmering world of Sly and the Family Stone’s “Hot Fun in the Summertime,” Tommy James and the Shondells’ “Crystal Blue Persuasion,” and The Friends of Distinction’s “Grazing in the Grass.” The knowledge that thousands of others in my community were swaying to the same beats made the experience electric, communal, alive—so unlike the deadening isolation of my curated streaming playlists.

    The fires didn’t just torch the city—they laid bare the fault lines in my craving for connection. Nostalgia hit like a sucker punch, sending me down an online rabbit hole in search of a high-performance radio, convinced it could resurrect the magic of my youth. Deep down, a sardonic voice heckled me: was this really about better reception, or just another pitiful attempt by a sixty-something man trying to outrun mortality? Did I honestly believe a turbo-charged radio could beam me back to those transistor nights and warm kitchen conversations, or was I just tuning into the static of my own existential despair?

    Streaming had wrecked my relationship with music, plain and simple. The irony wasn’t lost on me either. While I warned my college students not to let ChatGPT lull them into embracing mediocre writing, I had let technology seduce me into a lazy, soulless listening experience. Hypocrisy alert: I had become the very cautionary tale I preached against.

Enter what I now call “Ozempification,” inspired by that magical little injection, Ozempic, which promises a sleek body with zero effort. It’s the tech-age fantasy in full force: the belief that convenience can deliver instant gratification without any downside. Spoiler alert—it doesn’t. The price of that fantasy is steep: convenience kills effort, and with it, the things that actually make life rich and rewarding. Bit by bit, it hollows you out like a bad remix, leaving only a shell of passive consumption.

    Over time, you become an emotionally numb, passive tech junkie—a glorified NPC on autopilot, scrolling endlessly through algorithms that decide your taste for you. The worst part? You stop noticing. The soundtrack to your life is reduced to background noise, and you can’t even remember when you lost control of the plot.

    But not all Ozempification is a one-way ticket to spiritual bankruptcy. Sometimes, it’s a lifeline. GLP-1 drugs like Ozempic can literally save lives, keeping people with severe diabetes from joining the ranks of organ donors earlier than planned. Meanwhile, overworked doctors are using AI to diagnose patients with an accuracy that beats the pre-AI days of frantic guesswork and “Let’s Google that rash.” That’s Necessary Ozempification—the kind that keeps you alive or at least keeps your doctor from prescribing antidepressants instead of antibiotics.

The true menace isn’t just technology—it’s Mindless Ozempification, where convenience turns into a full-blown addiction. Everything—your work, your relationships, even your emotional life—gets flattened into a cheap, prepackaged blur of instant gratification and hollow accomplishment. Suddenly, you’re reduced to a bit player in your own narrative, endlessly scrolling for a dopamine hit like a lab rat in a particularly bleak Skinner box.

    As the fires in L.A. fizzled out, I had a few weeks to prep my writing courses. While crafting my syllabus and essay prompts, Mindless Ozempification loomed large in my mind. Why? Because I was facing the greatest challenge of my teaching career: staying relevant when my students had a genie—otherwise known as ChatGPT—at their beck and call, ready to crank out essays faster than you can nuke a frozen burrito.

    After four years of wrestling with AI-assisted essays and thirty-five years in the classroom, I’ve learned something unflattering about human nature—especially my own. We are exquisitely vulnerable to comfort, shortcuts, and the soft seduction of the path of least resistance. Given enough convenience, we don’t just cut corners; we slowly anesthetize ourselves. That quiet slide—where effort feels offensive and difficulty feels unnecessary—is the endgame of Ozempification: not improvement, but a gentle, smiling drift toward spiritual atrophy.