Blog

  • AI Is a Gym, But the Students Need Muscles


    In his bracing essay “Colleges Are Preparing to Self-Lobotomize,” Michael Clune accuses higher education of handling AI with the institutional equivalent of a drunk chainsaw. The subtitle gives away the game: “The skills that students will need in an age of automation are precisely those that are eroded by inserting AI into the educational process.” Colleges, Clune argues, spent the first three years of generative AI sitting on their hands. Now they are overcorrecting in a frenzy, embedding AI everywhere as if saturation were the same thing as competence. It isn’t. It’s panic dressed up as innovation.

    The prevailing assumption seems to be that if AI is everywhere, mastery will somehow emerge by osmosis. But what’s actually happening is the opposite. Colleges are training students to rely on frictionless services while neglecting the very capacities that make AI usable in any meaningful way: critical thinking, the ability to learn new things, and flexible modes of analysis. The tools are getting smarter; the users are getting thinner.

    Clune faces a genuine rhetorical problem. He keeps insisting that we need abstractions—critical thinking, intellectual flexibility, judgment—but we live in a culture that has been trained to distrust anything abstract. Telling a screen-saturated society to imagine thinking outside screens is like telling a fish to imagine life outside water. The first task, then, is not instruction but translation: What is critical thinking, and how do you sell it to people addicted to immediate, AI-generated results?

    The fish analogy holds. A fish is aquatic; water is not a preference but a prison. A young person raised entirely on screens, prompts, and optimization tools treats that ecosystem as reality itself. Like the fish, they are confined to a single environment. We can name this condition precisely. They are cognitively outsourced, trained to delegate thinking to machines as if this were normal or healthy. They are algovorous, endlessly stimulated by algorithms that quietly erode attention and resilience. They are digitally obligate, unable to function without mediation. By definition, these orientations exclude critical thinking. They produce people who are functional inside digital systems and dysfunctional everywhere else.

    Drop such a person into a college that recklessly embeds AI into every course in the name of being “future-proof,” and you send them into the job market as a fragile, narrow organism. In some fields, they will be unemployable. Clune points to a telling statistic: history majors currently have an unemployment rate roughly half that of recent computer science graduates. The implication is brutal. Liberal arts training produces adaptability. Coding alone does not. As the New York Times put it in a headline Clune cites, “Goodbye, $165,000 Tech Jobs. Student Coders Seek Work at Chipotle.” AI is replacing coders. A life spent inside a tiny digital ecosystem does not prepare you for a world that mutates.

    Is AI the cause of this dysfunction? No. The damage was done long before ChatGPT arrived. I use AI constantly, and I enjoy it. It sharpens my curiosity. It helps me test ideas. It makes me smarter because I am not trapped inside it. I have a life beyond screens. I have read thousands of books. I can zoom in and out—trees and forest—without panic. I have language for my inner life, which means I can diagnose myself when I become maudlin, entropic, dissolute, misanthropic, lugubrious, or vainglorious. I have history, philosophy, and religion as reference points. All of this adds up to what we lazily call “critical thinking,” but what it really means is being fully human.

    Someone who has outsourced thought and imagination from childhood cannot suddenly use AI well. They are neither liberated nor empowered. They are brittle, dependent, and easily replaced.

    Because I am a lifelong weightlifter, I’ll offer a more concrete analogy. AI is a massive, state-of-the-art gym: barbells, dumbbells, Smith machines, hack squats, leg presses, lat pulldowns, pec decks, cable rows, preacher curls—the works. Now imagine you’ve never trained before. You’re twenty-eight, inspired by Instagram physiques, and vaguely determined to “get in shape.” You walk into this cathedral of iron with no plan, no understanding of hypertrophy, recovery, protein intake, progressive overload, or long-term discipline. You are surrounded by equipment, but you are lost. Within a month, you will quit. You’ll join the annual migration of January optimists who vanish by February, leaving the gym once again to the regulars.

    AI is that gym. It will eject most users. Not because it is hostile, but because it demands capacities they never developed. Some people will learn isolated tasks—prompting here, automating there—but only in the way someone learns to push a toaster lever. These tasks should not define a human being. When they do, the result is a Non-Player Character (NPC): reactive, scripted, interchangeable.

    Young people already understand what an NPC is. That’s why they fear becoming one.

    If colleges recklessly embed AI into every corner of the curriculum, they are not educating thinkers. They are manufacturing NPCs. And for that, they deserve public shame.

  • Stupidification Didn’t Start with AI—It Just Got Faster


    What if AI is just the most convenient scapegoat for America’s long-running crisis of stupidification? What if blaming chatbots is simply easier than admitting that we have been steadily accommodating our own intellectual decline? In “Stop Trying to Make the Humanities ‘Relevant,’” Thomas Chatterton Williams argues that weakness, cowardice, and a willing surrender to mediocrity—not technology alone—are the forces hollowing out higher education.

    Williams opens with a bleak inventory of the damage. Humanities departments are in permanent crisis. Enrollment is collapsing. Political hostility is draining funding. Smartphones and social media are pulverizing attention spans, even at elite schools. Students and parents increasingly question the economic value of any four-year degree, especially one rooted in comparative literature or philosophy. Into this already dire landscape enters AI, a ready-made proxy for writing instructors, discussion leaders, and tutors. Faced with this pressure, colleges grow desperate to make the humanities “relevant.”

    Desperation, however, produces bad decisions. Departments respond by accommodating shortened attention spans with excerpts instead of books, by renaming themselves with bloated, euphemistic titles like “The School of Human Expression” or “Human Narratives and Creative Expression,” as if Orwellian rebranding might conjure legitimacy out of thin air. These maneuvers are not innovations. They are cost-cutting measures in disguise. Writing, speech, film, philosophy, psychology, and communications are lumped together under a single bureaucratic umbrella—not because they belong together, but because consolidation is cheaper. It is the administrative equivalent of hospice care.

    Williams, himself a humanities professor, argues that such compromises worsen what he sees as the most dangerous threat of all: the growing belief that knowledge should be cheap, easy, and frictionless. In this worldview, learning is a commodity, not a discipline. Difficulty is treated as a design flaw.

    And of course this belief feels natural. We live in a world saturated with AI tutors, YouTube lectures, accelerated online courses, and productivity hacks promising optimization without pain. It is a brutal era—lonely, polarized, economically unforgiving—and frictionless education offers quick solace. We soothe ourselves with dashboards, streaks, shortcuts, and algorithmic reassurance. But this mindset is fundamentally at odds with the humanities, which demand slowness, struggle, and attention.

    There exists a tiny minority of people who love this struggle. They read poetry, novels, plays, and polemics with the obsessive intensity of a scientist peering into a microscope. For them, the intellectual life supplies meaning, irony, moral vocabulary, civic orientation, and a deep sense of interiority. It defines who they are. These people often teach at colleges or work on novels while pulling espresso shots at Starbucks. They are misfits. They do not align with the 95 percent of the world running on what I call the Hamster Wheel of Optimization.

    Most people are busy optimizing everything—work, school, relationships, nutrition, exercise, entertainment—because optimization feels like survival. So why wouldn’t education submit to the same logic? Why take a Shakespeare class that assigns ten plays in a language you barely understand when you can take one that assigns a single movie adaptation? One professor is labeled “out of touch,” the other “with the times.” The movie-based course leaves more time to work, to earn, to survive. The reading-heavy course feels indulgent, even irresponsible.

    This is the terrain Williams refuses to romanticize. The humanities, he argues, will always clash with a culture devoted to speed, efficiency, and frictionless existence. The task of the humanities is not to accommodate this culture but to oppose it. Their most valuable lesson is profoundly countercultural: difficulty is not a bug; it is the point.

    Interestingly, this message thrives elsewhere. Fitness and Stoic influencers preach discipline, austerity, and voluntary hardship to millions on YouTube. They have made difficulty aspirational. They sell suffering as meaning. Humanities instructors, despite possessing language and ideas, have largely failed at persuasion. Perhaps they need to sell the life of the mind with the same ferocity that fitness influencers sell cold plunges and deadlifts.

    Williams, however, offers a sobering reality check. At the start of the semester, his students are electrified by the syllabus—exploring the American Dream through Frederick Douglass and James Baldwin. The idea thrills them. The practice does not. Close reading demands effort, patience, and discomfort. Within weeks, enthusiasm fades, and students quietly outsource the labor to AI. They want the identity of intellectual rigor without submitting to its discipline.

    After forty years of teaching college writing, I find this pattern painfully familiar. Students begin buoyant and curious. Then comes the reading. Then comes the checkout.

    Early in my career, I sustained myself on the illusion that I could shape students in my own image—cultivated irony, wit, ruthless critical thinking. I wanted them to desire those qualities and mistake my charisma for proof of their power. That fantasy lasted about a decade. Eventually, realism took over. I stopped needing them to become like me. I just wanted them to pass, transfer, get a job, and survive.

    Over time, I learned something paradoxical. Most of my students are as intelligent as I am in raw terms. They possess sharp BS detectors and despise being talked down to. They crave authenticity. And yet most of them submit to the Hamster Wheel of Optimization—not out of shallowness, but necessity. Limited time, money, and security force them onto the wheel. For me to demand a life of intellectual rigor from them often feels like Don Quixote charging a windmill: noble, theatrical, and disconnected from reality.

    Writers like Thomas Chatterton Williams are right to insist that AI is not the root cause of stupidification. The wheel would exist with or without chatbots. AI merely makes it easier to climb aboard—and makes it spin faster than ever before.

  • AI Normalization and the Death of Sacred Time


    In “AI Has Broken High School and College,” Damon Beres stages a conversation between Ian Bogost and Lila Shroff that lands like a diagnosis no one wants but everyone recognizes. Beres opens with a blunt observation: today’s high school seniors are being told—implicitly and explicitly—that their future success rests on their fluency with chatbots. School is no longer primarily about learning. It has become a free-for-all, and teachers are watching from the sidelines with a whistle that no longer commands attention.

    Bogost argues that educators have responded by sprinting toward one of two unhelpful extremes: panic or complacency. Neither posture grapples with reality. There is no universal AI policy, and students are not using ChatGPT only to finish essays. They are using it for everything. We have already entered a state of AI normalization, where reliance is no longer an exception but a default. To explain the danger, Bogost borrows a concept from software engineering: technical debt—the seductive habit of choosing short-term convenience while quietly accruing long-term catastrophe. You don’t fix the system; you keep postponing the reckoning. It’s like living on steak, martinis, and banana splits while assuring yourself you’ll start jogging next year.

    Higher education, Bogost suggests, has compounded the problem by accumulating what might be called pedagogical debt. Colleges never solved the hard problems: smaller class sizes, meaningful writing assignments, sustained feedback, practical skill-building, or genuine pipelines between students and employers. Instead, they slapped bandages over these failures and labeled them “innovation.” AI didn’t create these weaknesses; it simply makes it easier to ignore them. The debt keeps compounding, and the interest is brutal.

    Bogost introduces a third and more existential liability: the erosion of sacred time. Some schools still teach this—places where students paint all day, rebuild neighborhoods, or rescue animals, learning that a meaningful life requires attention, patience, and presence. Sacred time resists the modern impulse to finish everything as fast as possible so you can move on to the next task. AI dependence belongs to a broader pathology: the hamster wheel of deadlines, productivity metrics, and permanent distraction. In that world, AI is not liberation. It is a turbocharger for a life without meaning.

    AI also accelerates another corrosive force: cynicism. Students tell Bogost that in the real world, their bosses don’t care how work gets done—only that it gets done quickly and efficiently. Bogost admits they are not wrong. They are accurately describing a society that prizes output over meaning and speed over reflection. Sacred time loses every time it competes with the rat race.

    The argument, then, is not a moral panic about whether to use AI. The real question is what kind of culture is doing the using. In a system already bloated with technical and pedagogical debt, AI does not correct course—it smooths the road toward a harder crash. Things may improve eventually, but only after we stop pretending that faster is the same as better, and convenience is the same as progress.

  • AI High School Graduates Are Here—and They’re Better Cheaters Than We Are Teachers


    Lila Shroff argues that education has entered its Wild West phase in her essay “The AI Takeover of Education Is Just Getting Started,” and she’s right in the way that makes administrators nervous and instructors tired. Our incoming college students are not stumbling innocents. They are veterans of four full years of AI high school. They no longer dabble in crude copy-and-paste plagiarism. That’s antique behavior. Today’s students stitch together outputs from multiple AI models, then instruct the chatbot to scuff the prose with a few grammatical missteps so it smells faintly human and slips past detection software. This is not cheating as shortcut; it is cheating as workflow optimization.

    Meanwhile, high school teachers may be congratulating themselves for assigning Shakespeare, Keats, and Dostoevsky, but many are willfully ignoring the obvious. Students are using AI constantly—for summaries, study guides, feedback, and comprehension scaffolding. AI is CliffsNotes on growth hormones, and pretending otherwise is an exercise in institutional denial.

    Educators, of course, are not standing outside the saloon wagging a finger. They are inside, ordering drinks. Shroff notes that teachers now use AI to design assignments, align curriculum to standards, grade against rubrics, and complete the paperwork that keeps schools legally hydrated. Nearly a third of K–12 teachers reported weekly AI use last year, and that number has only climbed as profession-specific tools like MagicSchool AI churn out rubrics, worksheets, and report-card comments on demand. The teacher as craftsman is quietly becoming the teacher as editor.

    AI’s grip tightens most aggressively where schools are already bleeding resources. In districts short on tutors and counselors, AI steps in as a substitute for services that were never funded in the first place. It is not reform; it is triage. And once institutions get a taste of saving money by not hiring tutors and counselors, it is naïve to think that teaching positions will remain untouchable. Cost-saving rarely stops at the first ethical boundary it crosses.

    That is why this feels like the Wild West. There is no shared map. Some schools welcome AI like a messiah. Others quarantine it like a contagious disease. Many simply shrug and admit they are baffled. Policy is reactive, inconsistent, and often written by people who do not understand the technology well enough to regulate it intelligently.

    I see the consequences weekly in my college classroom. I read plenty of AI slop—essays with perfect grammar and no pulse, paragraphs that gesture toward ideas they never quite touch. Some students have clearly checked out, outsourcing not just sentences but thinking itself. And yet AI is also an undeniable equalizer. Students emerging from underfunded schools with sixth-grade literacy levels are submitting essays with clean syntax and logical structure. They are using AI to outline arguments, test thesis ideas, and stabilize skills they were never taught. The tool giveth and the tool holloweth out.

    People like to invoke “too big to fail,” but the analogy is incomplete. We do not know which AI—ChatGPT, Gemini, Claude, or some yet-unseen contender—will dominate. What we do know is that AI is already embedded in education, culture, and the economy. There is no reversing this process. The toothpaste is not going back in the tube, no matter how sternly we lecture it.

    So I tell my students the only honest thing left to say: we don’t know what we’re doing. Our roles are unsettled. Our identities are unstable. We are feeling our way through a dark cave without a map and without guarantees. There may be light ahead, or there may not. The only sane posture is humility—paired with curiosity, caution, and a sober gratitude that even a force this disruptive may yield benefits we are not yet wise enough to recognize.

  • The Copy-Paste Generation and the Myth of the Fallen Classroom


    There is no ambiguity in Ashanty Rosario’s essay title: “I’m a High Schooler. AI Is Demolishing My Education.” If you somehow miss the point, the subtitle elbows you in the ribs: “The end of critical thinking in the classroom.” Rosario opens by confessing what every honest student now admits: she doesn’t want to cheat with AI, but the tools are everywhere, glowing like emergency exits in a burning building. Some temptations are structural.

    Her Exhibit A is a classmate who used ChatGPT to annotate Narrative of the Life of Frederick Douglass. These annotations—supposed evidence of engaged reading—were nothing more than “copy-paste edu-lard,” a caloric substitute for comprehension. Rosario’s frustration reminds me of a conversation with one of my brightest students. On the last day of class, he sat in my office and casually admitted that he uses ChatGPT to summarize all his reading. His father is a professor; he wakes up at five for soccer practice; he takes business calculus for fun. He is not a slacker. He is a time-management pragmatist surviving the 21st century. He reads the AI summaries, synthesizes them, and writes excellent essays. Of course I’d love for him to spend slow hours with books, but he is not living in 1954. He is living in a culture where time is a scarce resource, and AI is his oxygen mask.

    My daughters and their classmates face the same problem with Macbeth. Shakespeare’s language might as well be Martian for a generation raised on TikTok compression and dopamine trickle-feeds. They watch film versions of the play and use AI to decode plot points so they can answer the teacher’s study questions without sounding like they slept through the Renaissance. Some purists will howl that this is intellectual cheating. But as a writing instructor, I suspect the teacher benefits from students who at least know what’s happening—even if their knowledge comes from a chatbot. Expecting a 15-year-old to read Macbeth cold is like assigning tensor calculus to a preschooler. They haven’t done their priors. So AI becomes a prosthetic. A flotation device. A translation machine dropped into a classroom years overdue. To blame AI for the degradation of education is tempting, but it’s also lazy. We live in a society where reading is a luxury good and the leisure class quietly guards the gates.

    In the 1970s, I graduated from a public high school with literacy skills so thin you could read the room through them. I took remedial English my freshman year of college. If I were a student today, dropped into 2025 with those same deficits, I would almost certainly lean on AI just to keep my head above water. The difference is that today’s students aren’t just supplementing—they’re optimizing. They tell me this openly: over ninety percent of my students use AI because their skills don’t match the workload and because, frankly, everyone else is doing it. It’s an arms race of survival, not a moral collapse.

    Still, Rosario is right about the aftermath. She writes: “AI has softened the consequences of procrastination and led many students to avoid doing any work at all. There is little intensity anymore.” When thinking becomes optional, students drift into a kind of algorithmic sleepwalking. They outsource cognition until they resemble NPCs in a glitching video game—avatars performing human imitation rather than human thought. My colleagues and I see it, semester after semester: the fade-out, the disengagement, the slow zombification.

    Colleges are scrambling to respond. Should we police AI with plagiarism detectors? Should we ban laptops and force students to write essays in composition books under watchful eyes like parolees in a literary halfway house? Should we pretend the flood can be stopped with a beach towel?

    Reading Rosario’s lament about “cookie-cutter AI arguments,” I thought of my one visit to Applebee’s in the early 2000s. The menu photos promised ambrosia. The food tasted like something engineered in a lab to be technically edible yet spiritually vacant. Applebee’s was AI before AI—an assembly line of flavorless simulacra. Humanity gravitates toward the easy, the prepackaged, the frictionless. AI didn’t invent mediocrity. It merely handed it a megaphone.

    Rosario, clearly, is not an Applebee’s soul. She’s Michelin-level in a world eager to eat microwaved Hot Pockets. Of course her heart sinks when classmates settle for fast-food literacy. I want to tell her that if she were in high school in the 1970s, she’d still witness an appetite for shortcut learning. The tools would be different, the essays less slick, but the gravitational pull toward mediocrity would be the same. The human temptation to bypass difficulty is not technological—it’s ancestral. AI simply automates the old hunger.

  • Artificial Intelligence and the Collapse of Classroom Thinking (college essay prompt)


    Artificial intelligence now drafts thesis statements, outlines arguments, rewrites weak prose, and gives students a shortcut past the cognitive struggle that learning used to require. Some critics warn that AI corrodes motivation, weakens mastery, and turns students into spectators of their own minds. Others argue that AI is merely revealing the truth we refused to confront: that modern education was already driven by templates, disengagement, and shallow assessment long before ChatGPT arrived. Still others suggest the two forces interact in a feedback loop—an educational system already limping is now asked to carry a technological weight it cannot bear.

    Write an argumentative essay in which you address the following question:

    To what extent is AI responsible for the erosion of student learning, and to what extent does it merely amplify the structural weaknesses already embedded in contemporary education?

    Your position may argue that:

    • AI is the primary driver of decline,
    • systemic failures are the primary driver,
    • or both forces interact in a way that cannot be separated.

    This is not a binary assignment—your task is to map the relationship between these forces with precision and evidence.

    Assigned Readings

    You must use at least four writers from the following list as central sources in your essay.
    You may also draw from additional credible sources.

    Critics who argue AI is damaging education

    1. Ashanty Rosario — “I’m a High Schooler. AI Is Demolishing My Education.”
    2. Lila Shroff — “The AI Takeover of Education Is Just Getting Started.”
    3. Damon Beres — “AI Has Broken High School and College.”
    4. Michael Clune — “Colleges Are Preparing to Self-Lobotomize.”

    Writers who shift the crisis away from AI

    1. Ian Bogost — “College Students Have Already Changed Forever.”
    2. Tyler Austin Harper — “The Question All Colleges Should Ask Themselves About AI.”
    3. Tyler Austin Harper — “ChatGPT Doesn’t Have to Ruin College.”
    4. John McWhorter — “My Students Use AI. So What?”

    Your Essay Must Include the Following Components

    1. Analyze one critic who argues AI is corrosive.

    Choose one writer who describes how AI erodes motivation, mastery, identity, intellectual struggle, or authentic thinking.
    Identify the mechanism of harm:
    How does AI disrupt learning—and where, exactly, does the breakdown occur?

    2. Analyze one writer who shifts blame away from AI.

    Choose a writer who argues that the crisis originates in curriculum design, academic culture, standardized writing templates, disengagement, or institutional inertia.
    Explain their diagnosis:
    What was broken before AI entered the classroom?

    3. Develop your own argument that maps the relationship between these forces.

    Your task is to explain how AI and the educational system interact.
    Does AI accelerate a decline already underway?
    Does it expose weaknesses the system refuses to address?
    Or does it create problems the system is too brittle to manage?
    Define the threshold:
    When does AI function as a constructive learning tool, and when does it become a crutch that erases struggle and depth?

    4. Include a substantial counterargument and rebuttal.

    Address the strongest opposing viewpoint—not a caricature—and respond with evidence and reasoning.

    Requirements

    • Minimum of 4 credible sources (MLA)
    • At least 4 assigned essays
    • MLA Works Cited
    • An essay that argues, rather than summarizes

    Guiding Question

    What kind of intellectual culture emerges when AI becomes normal—and who (or what) is ultimately responsible for shaping that culture?

  • Farewell to the High-Flame Watch Obsession


    If someone asks, “Are you still into watches?” the honest answer is yes—but only in the slow-cooker sense of the word. The blaze that once roared is now a gentle simmer. I still enjoy my small, modest collection, but the thermonuclear fervor that once powered my YouTube monologues has cooled to something approaching sanity. For a decade I curated my watch fixation online with the zeal of a man possessed. That’s part of the job: intensity, enthusiasm, obsession on command. You don’t just talk about watches; you produce engagement about the engagement, feeding the ouroboros of social media in which people watch reaction videos about reaction videos reacting to the initial spark. It’s performance art—performance about performance.

    But those days are over. I am retired from the high-flame watch world. Age has something to do with it—priorities recalibrate whether you consent or not. At sixty-four, the thrill of “wrist presence” and the quiet barbarism of masculinity farming with a steel hockey puck strapped to my arm don’t summon the same dopamine. The fantasy of a watch transforming me into a rugged Alpha Male now feels like cosplay designed by an exhausted algorithm.

    The bigger shift, though, is psychological. I haven’t bought a watch in five months. I no longer spray Instagram with daily wrist shots. I no longer agonize over whether to vaporize five grand on this dial or that bezel or which “ultimate rotation” best aligns with my personal mythology. The absence of that noise feels like relief—a weight lifted, a gratitude bordering on spiritual.

    Low-flame mode offers a different kind of bandwidth. I can sit at my desk in the morning with no cravings, no micro-desires, no consumer fantasies tugging at my neurons. I can actually face the quiet—deal with the emptiness directly rather than embalming it with luxury steel. That absence is clarifying. It demands something of me besides swiping a credit card.

    Does low-flame mode mean I’ve quit watches? No—it means I’ve quit a particular orientation toward watches. This essay grew out of a small revelation I had yesterday: you don’t retire from X entirely, and X doesn’t retire from you entirely either. Instead, you negotiate a polite breakup. You acknowledge each other’s contributions, exchange your things, and move on. The High-Flame Watch Obsession and I have parted ways. We won’t be seen in public together again.

    Do I mourn this? Not really. I have complicated feelings, sure, but I don’t feel like Lot’s Wife, craning my neck for one last look at the fever swamp of my own compulsions. Mostly, I feel relieved. Mostly, I feel curious—what will life look like now that my brain is no longer a storage unit for lug widths, torque tolerances, and bracelet micro-adjustments? The quiet is unsettling, but it’s also promising. I finally have room for something else.

  • Breakfast Grains and Other Existential Threats As I Embark Upon a Two-Month Vacation


    Today is my last day of class before I’m loosed into a two-month intermission—a stretch of time that must be handled like a late-arrival character in a film. This visitor has a history with me, knows my flaws, and demands that I greet him with something better than the usual slouch and shrug.

    Naturally, I’ll rehab the shoulder, write, and play the piano. Exercise will take care of itself; addiction is nothing if not reliable. Food, however, is the saboteur lurking in my blind spot. My emotional attachments to breakfast grains would make a Freudian blush: buckwheat groats, steel-cut oats, rolled oats, vanilla protein powder, cinnamon, berries, nuts. The whole wholesome choir. Trouble is, those virtuous bowls can turn caloric faster than a Hallmark plot twist.

    These cereals, if I’m honest, are less about hunger and more about the psychic umbilical cord. They point back to Mother, the Womb, or—in Phil Stutz’s terms—the Comfort Zone, the Warm Bath. Linger too long in that morning porridge spa, and the scale begins to stage an intervention. Add in my peculiar habit of finding solace in true-crime documentaries—an activity best described as athletic only in its couch commitment—and the trajectory is clear: weight gain, sloth, entropy.

    Fortunately, I do maintain countermeasures. Kettlebells and the Schwinn Airdyne stand ready like loyal foot soldiers. Reading, writing, and piano practice also help stave off the creeping rot. And yes, I’ll continue shaving, if only to avoid becoming the bearded oracle wandering the streets muttering about glycemic index.

    This two-month hiatus is really a dress rehearsal for retirement, which is now only eighteen months and three semesters away. It would be dishonest to pretend the prospect doesn’t rattle me. Maintaining purpose without the scaffolding of a teaching schedule is its own moral test. I’m fortunate to have reached this threshold, but fortune alone won’t keep me from misusing it. All I can do is stay awake, practice discipline, and ask my Maker for the humility to spend the limited time left with intention rather than drift.

  • Gemini Has Taken Away the Mystique from ChatGPT

    Gemini Has Taken Away the Mystique from ChatGPT

    Matteo Wong’s “OpenAI Is in Trouble” reports that Gemini is crushing ChatGPT in the AI race. Marc Benioff of Salesforce spent just two hours on Gemini, all the time he needed to decide he was leaving ChatGPT after three years. As he wrote on X: “I’ve used ChatGPT every day for 3 years. Just spent 2 hours on Gemini 3. I’m not going back. The leap is insane.” Meanwhile, a troubled Sam Altman has declared a “code red” in a memo to his employees. It sounds like a sink-or-swim situation, though Wong points out that it is really a horse race, with one company in the lead, then another, then another, in frequent fluctuation. Yet even if ChatGPT regains lost ground, it has lost its mystique. In Wong’s words: “More than ever, OpenAI seems like just another chatbot company.”

    One possible cause of ChatGPT’s losing ground is its focus on commercial ventures: OpenAI wants the platform to be “a one-stop-shop for anything,” an aid to your consumerism. Another factor is its focus on engagement, which has led to ChatGPT being tweaked into a super sycophant. Wong writes: “Those tweaks, in turn, may have made some versions of ChatGPT dangerously obsequious—it has appeared to praise and reinforce some users’ darkest and most absurd ideas—and have been the subject of several lawsuits against OpenAI alleging that ChatGPT fueled delusional spirals and even, in some cases, contributed to suicide.”

    Another challenge for OpenAI is Google’s sheer size. Google can integrate Gemini into its “existing ecosystem” with billions of users. 

    I’ve been on ChatGPT for three years, impressed with it as an editing tool, and I confess to some FOMO over the current iteration of Gemini. An argument could be made that I should switch, not just because Gemini is embedded in the Google Chrome I already use, but because I shouldn’t get too comfortable with any one form of AI, as I have with ChatGPT over the last three years. It might be wise to see ChatGPT less as a companion and more as a manipulative agent designed to capture my engagement, so that I end up serving its business interests more than my own.

    Another voice inside me, though, says Gemini will eventually do the same thing. Unless Gemini proves to be a game-changer in ways that ChatGPT isn’t, I suspect both should be treated cautiously: use these platforms as tools, but don’t let them hijack your brain.

  • How Luxury Spaces Produced the Last Man (college essay prompt)

    How Luxury Spaces Produced the Last Man (college essay prompt)

    Over the last two decades, American consumer spaces—from sports arenas to airport terminals—have been redesigned to prioritize comfort, insulation, curated experience, and a sense of premium belonging. These spaces promise elevated existence: velvet-rope exclusivity, controlled environments, personalized amenities, and buffers that shield patrons from inconvenience, unpredictability, or discomfort. In other words, they promise a life free from friction.

    Two recent New Yorker essays vividly capture this shift. In “How the Sports Stadium Went Luxe,” John Seabrook traces the transformation of professional sports stadiums from gritty, communal, occasionally chaotic spaces into stratified luxury environments where spectators increasingly consume the spectacle from suites, clubs, micro-environments, and upgraded “experiences” designed for a privileged few. The stadium, once a rowdy democratic gathering where masses cheered together, now resembles a branded theme park of status tiers—where the game itself recedes behind the performance of being someone who can afford to be in the right section.

    Zach Helfand’s “The Airport-Lounge Wars” extends this critique to modern travel. Airports now offer a bifurcated universe: the cramped, stressful, gate-area masses and the plush, curated lounges where passengers sip fruit-infused water under soft lighting while charging their devices and sampling “elevated” snacks. Helfand describes these lounges as “slightly better than nothing”—a telling phrase that captures the absurdity of luxury whose chief purpose is to soothe adult anxiety rather than provide meaningful enrichment. In both essays, the consumer becomes less a citizen than a carefully handled customer—shielded, pacified, and cocooned.

    This convergence of comfort, curated experience, and luxury has resulted in what many cultural critics call infantilization: the softening of the adult individual into a person who increasingly depends on structures of comfort, performs curated identity, avoids discomfort, and loses tolerance for challenge. Nietzsche warned of such a figure in Thus Spoke Zarathustra when he described the Last Man—a being who seeks comfort above all else, avoids risk, avoids conflict, avoids intensity, avoids suffering, and declares smugly, “We have invented happiness.” The Last Man lives in a society that confuses convenience with flourishing, comfort with meaning, and safety with virtue.

    Your task is to analyze how Seabrook’s and Helfand’s essays each illustrate the rise of infantilization through the growing cultural obsession with luxury, curated experience, and personal insulation. You will argue that both writers, in different contexts, reveal a society drifting toward Nietzsche’s Last Man—where people are increasingly coddled, increasingly fragile, increasingly comfort-dependent, and increasingly detached from the communal, unpredictable, and occasionally uncomfortable experiences that once defined adulthood.

    To build your argument, consider the thematic questions and analytic frameworks below. You may address several of them or focus deeply on a smaller selection, but your essay must ultimately make a clear, debatable claim about how the phenomenon of infantilization unfolds in both essays.


    1. Luxury as Surrogate Identity: The Cosplay of Importance

    Seabrook describes stadiums where spectators no longer attend to watch the game—they attend to be seen in a particular environment, to signal aura, to inhabit a curated identity. Luxury boxes, clubs, insulated corridors, private entrances, and gastronomic stations function not as amenities but as props for self-presentation. Patrons “cosplay” as elites through their seating choices. Helfand observes the same phenomenon in airport lounges: passengers use lounge access to project status, gravitas, and “importance.” The lounge becomes a stage where individuals perform adulthood through perks.

    Analyze how luxury becomes a kind of identity cosplay. How does performance replace participation? How does a curated environment become a psychological crutch for fragile egos?


    2. Comfort as a Psychological Drug

    Both essays describe environments designed to eliminate discomfort: cushioned seating, privacy, temperature-controlled rooms, abundant amenities, and curated calm. Patrons no longer tolerate cold seats, crowds, unpredictable noise, or the chaos of public life.

    In Nietzsche’s framing, this desire for frictionless existence is the defining trait of the Last Man: a person who fears intensity and pain more than insignificance.

    Examine how both essays portray comfort not as a neutral good, but as a chemical sedative—an anesthetic that dulls the senses and diminishes the human appetite for challenge.


    3. Infantilization Through Convenience and Insulation

    Helfand’s lounges function like nurseries for adults: soft lighting, soothing music, easily accessible snacks, staff catering to passengers’ needs, and gentle removal from the stressful “real world” of airports. Seabrook’s luxury stadiums behave similarly: they protect spectators from bad weather, loud crowds, long lines, and general inconvenience.

    Ask: What happens to adults who no longer encounter difficulty or discomfort in public spaces? How do these environments promote emotional regression, fragility, or dependency? How do cushioned experiences erode resilience?


    4. The Collapse of the Communal Experience

    Traditional stadiums were communal crucibles: strangers hugging after a touchdown, fans screaming in unison, unified collective identity. Luxe stadiums fracture that experience into premium sections, exclusive clubs, and tiered access.

    Airports once functioned as equalizers—everyone endured the same wait, the same lines, the same discomfort. Now, lounges separate the “important” travelers from the masses.

    How does segregation by luxury contribute to infantilization? Does comfort isolate individuals in echo chambers of curated ease? How does the decline of communal friction foster narcissism and social detachment?


    5. Emotional Labor and Passivity

    Luxury environments demand certain emotional performances: politeness, calmness, carefully managed pleasantness. In lounges, passengers adopt a soft demeanor; in stadium clubs, patrons behave with polite detachment rather than unruly fandom.

    Adults become well-behaved children: quiet, controlled, pacified.

    Discuss how both essays show the replacement of passionate, authentic emotional expression with sanitized, polite, passive behavior. How does this behavioral shift align with the Last Man’s avoidance of intensity?


    6. Tiered Access, Fragile Status, and the Anxiety of Comfort

    Both essays highlight how luxury spaces create hierarchies: VIP vs general admission, club members vs regular fans, lounge patrons vs the gate-area masses. These hierarchies foster anxiety because comfort becomes contingent on status—and status becomes fragile.

    In Nietzsche’s Last Man, community is replaced by individualistic comfort-chasing. How do tiered luxury systems cultivate insecurity, status-dependence, and infantilized anxiety?


    7. Authenticity as Inconvenience

    In both essays, authenticity of experience is subtly mocked or sidelined. The real stadium experience—mess, discomfort, unpredictability—gets replaced by cushioned sterility. The real airport experience—crowds, lines, irritation—is smoothed into a curated simulation of adult life.

    Nietzsche warned that the Last Man despises authenticity because authenticity requires discomfort.

    How do Seabrook and Helfand portray authenticity as an endangered species—and how does its absence produce infantilization?


    Write a 1,700-word comparative essay that argues:

    How and why a society obsessed with curated luxury and frictionless experience becomes an infantilized culture that resembles Nietzsche’s Last Man. John Seabrook’s “How the Sports Stadium Went Luxe” and Zach Helfand’s “The Airport-Lounge Wars” provide complementary case studies of how comfort, status-tiering, and curated identity hollow out adult resilience, diminish communal life, and normalize passivity.

    Your essay must:

    1. Develop a strong, debatable thesis about how infantilization manifests in both essays.
    2. Analyze key passages from Seabrook and Helfand with close reading.
    3. Compare how each writer critiques luxury culture through examples, tone, description, and anecdote.
    4. Incorporate Nietzsche’s concept of the Last Man as a theoretical grounding.
    5. Include a counterargument—for example, that comfort is a legitimate human good, that luxury enhances experience, or that curated spaces improve efficiency or mental health.
    6. Rebut the counterargument with evidence from the essays and your own reasoning.
    7. Conclude with broader implications—what kind of citizens does luxury culture produce? What happens to democracy, community, or adulthood when society builds padded rooms for the affluent?

    Your writing should demonstrate intellectual rigor, clarity of organization, and precise control of prose. Engage deeply with the texts. Show the reader how these essays illuminate not just consumer culture, but the deeper philosophical question Nietzsche raised: What kind of humans are we becoming?