Category: Literary Dispatches

  • Stupidification Didn’t Start with AI—It Just Got Faster

    What if AI is just the most convenient scapegoat for America’s long-running crisis of stupidification? What if blaming chatbots is simply easier than admitting that we have been steadily accommodating our own intellectual decline? In “Stop Trying to Make the Humanities ‘Relevant,’” Thomas Chatterton Williams argues that weakness, cowardice, and a willing surrender to mediocrity—not technology alone—are the forces hollowing out higher education.

    Williams opens with a bleak inventory of the damage. Humanities departments are in permanent crisis. Enrollment is collapsing. Political hostility is draining funding. Smartphones and social media are pulverizing attention spans, even at elite schools. Students and parents increasingly question the economic value of any four-year degree, especially one rooted in comparative literature or philosophy. Into this already dire landscape enters AI, a ready-made proxy for writing instructors, discussion leaders, and tutors. Faced with this pressure, colleges grow desperate to make the humanities “relevant.”

    Desperation, however, produces bad decisions. Departments respond by accommodating shortened attention spans with excerpts instead of books, and by renaming themselves with bloated, euphemistic titles like “The School of Human Expression” or “Human Narratives and Creative Expression,” as if Orwellian rebranding might conjure legitimacy out of thin air. These maneuvers are not innovations. They are cost-cutting measures in disguise. Writing, speech, film, philosophy, psychology, and communications are lumped together under a single bureaucratic umbrella—not because they belong together, but because consolidation is cheaper. It is the administrative equivalent of hospice care.

    Williams, himself a humanities professor, argues that such compromises worsen what he sees as the most dangerous threat of all: the growing belief that knowledge should be cheap, easy, and frictionless. In this worldview, learning is a commodity, not a discipline. Difficulty is treated as a design flaw.

    And of course this belief feels natural. We live in a world saturated with AI tutors, YouTube lectures, accelerated online courses, and productivity hacks promising optimization without pain. It is a brutal era—lonely, polarized, economically unforgiving—and frictionless education offers quick solace. We soothe ourselves with dashboards, streaks, shortcuts, and algorithmic reassurance. But this mindset is fundamentally at odds with the humanities, which demand slowness, struggle, and attention.

    There exists a tiny minority of people who love this struggle. They read poetry, novels, plays, and polemics with the obsessive intensity of a scientist peering into a microscope. For them, the intellectual life supplies meaning, irony, moral vocabulary, civic orientation, and a deep sense of interiority. It defines who they are. These people often teach at colleges or work on novels while pulling espresso shots at Starbucks. They are misfits. They do not align with the 95 percent of the world running on what I call the Hamster Wheel of Optimization.

    Most people are busy optimizing everything—work, school, relationships, nutrition, exercise, entertainment—because optimization feels like survival. So why wouldn’t education submit to the same logic? Why take a Shakespeare class that assigns ten plays in a language you barely understand when you can take one that assigns a single movie adaptation? One professor is labeled “out of touch,” the other “with the times.” The movie-based course leaves more time to work, to earn, to survive. The reading-heavy course feels indulgent, even irresponsible.

    This is the terrain Williams refuses to romanticize. The humanities, he argues, will always clash with a culture devoted to speed, efficiency, and frictionless existence. The task of the humanities is not to accommodate this culture but to oppose it. Their most valuable lesson is profoundly countercultural: difficulty is not a bug; it is the point.

    Interestingly, this message thrives elsewhere. Fitness and Stoic influencers preach discipline, austerity, and voluntary hardship to millions on YouTube. They have made difficulty aspirational. They sell suffering as meaning. Humanities instructors, despite possessing language and ideas, have largely failed at persuasion. Perhaps they need to sell the life of the mind with the same ferocity that fitness influencers sell cold plunges and deadlifts.

    Williams, however, offers a sobering reality check. At the start of the semester, his students are electrified by the syllabus—exploring the American Dream through Frederick Douglass and James Baldwin. The idea thrills them. The practice does not. Close reading demands effort, patience, and discomfort. Within weeks, enthusiasm fades, and students quietly outsource the labor to AI. They want the identity of intellectual rigor without submitting to its discipline.

    After forty years of teaching college writing, I find this pattern painfully familiar. Students begin buoyant and curious. Then comes the reading. Then comes the checkout.

    Early in my career, I sustained myself on the illusion that I could shape students in my own image—cultivated irony, wit, ruthless critical thinking. I wanted them to desire those qualities and to mistake my charisma for proof of their power. That fantasy lasted about a decade. Eventually, realism took over. I stopped needing them to become like me. I just wanted them to pass, transfer, get a job, and survive.

    Over time, I learned something paradoxical. Most of my students are as intelligent as I am in raw terms. They possess sharp BS detectors and despise being talked down to. They crave authenticity. And yet most of them submit to the Hamster Wheel of Optimization—not out of shallowness, but necessity. Limited time, money, and security force them onto the wheel. For me to demand a life of intellectual rigor from them often feels like Don Quixote charging a windmill: noble, theatrical, and disconnected from reality.

    Writers like Thomas Chatterton Williams are right to insist that AI is not the root cause of stupidification. The wheel would exist with or without chatbots. AI merely makes it easier to climb aboard—and makes it spin faster than ever before.

  • The Copy-Paste Generation and the Myth of the Fallen Classroom

    There is no ambiguity in Ashanty Rosario’s essay title: “I’m a High Schooler. AI Is Demolishing My Education.” If you somehow miss the point, the subtitle elbows you in the ribs: “The end of critical thinking in the classroom.” Rosario opens by confessing what every honest student now admits: she doesn’t want to cheat with AI, but the tools are everywhere, glowing like emergency exits in a burning building. Some temptations are structural.

    Her Exhibit A is a classmate who used ChatGPT to annotate Narrative of the Life of Frederick Douglass. These annotations—supposed evidence of engaged reading—were nothing more than “copy-paste edu-lard,” a caloric substitute for comprehension. Rosario’s frustration reminds me of a conversation with one of my brightest students. On the last day of class, he sat in my office and casually admitted that he uses ChatGPT to summarize all his reading. His father is a professor; he wakes up at five for soccer practice; he takes business calculus for fun. He is not a slacker. He is a time-management pragmatist surviving the 21st century. He reads the AI summaries, synthesizes them, and writes excellent essays. Of course I’d love for him to spend slow hours with books, but he is not living in 1954. He is living in a culture where time is a scarce resource, and AI is his oxygen mask.

    My daughters and their classmates face the same problem with Macbeth. Shakespeare’s language might as well be Martian for a generation raised on TikTok compression and dopamine trickle-feeds. They watch film versions of the play and use AI to decode plot points so they can answer the teacher’s study questions without sounding like they slept through the Renaissance. Some purists will howl that this is intellectual cheating. But as a writing instructor, I suspect the teacher benefits from students who at least know what’s happening—even if their knowledge comes from a chatbot. Expecting a 15-year-old to read Macbeth cold is like assigning tensor calculus to a preschooler. They haven’t done their priors. So AI becomes a prosthetic. A flotation device. A translation machine dropped into a classroom years overdue. To blame AI for the degradation of education is tempting, but it’s also lazy. We live in a society where reading is a luxury good and the leisure class quietly guards the gates.

    In the 1970s, I graduated from a public high school with literacy skills so thin you could read the room through them. I took remedial English my freshman year of college. If I were a student today, dropped into 2025 with those same deficits, I would almost certainly lean on AI just to keep my head above water. The difference is that today’s students aren’t just supplementing—they’re optimizing. They tell me this openly: over ninety percent of my students use AI because their skills don’t match the workload and because, frankly, everyone else is doing it. It’s an arms race of survival, not a moral collapse.

    Still, Rosario is right about the aftermath. She writes: “AI has softened the consequences of procrastination and led many students to avoid doing any work at all. There is little intensity anymore.” When thinking becomes optional, students drift into a kind of algorithmic sleepwalking. They outsource cognition until they resemble NPCs in a glitching video game—avatars performing human imitation rather than human thought. My colleagues and I see it, semester after semester: the fade-out, the disengagement, the slow zombification.

    Colleges are scrambling to respond. Should we police AI with plagiarism detectors? Should we ban laptops and force students to write essays in composition books under watchful eyes like parolees in a literary halfway house? Should we pretend the flood can be stopped with a beach towel?

    Reading Rosario’s lament about “cookie-cutter AI arguments,” I thought of my one visit to Applebee’s in the early 2000s. The menu photos promised ambrosia. The food tasted like something engineered in a lab to be technically edible yet spiritually vacant. Applebee’s was AI before AI—an assembly line of flavorless simulacra. Humanity gravitates toward the easy, the prepackaged, the frictionless. AI didn’t invent mediocrity. It merely handed it a megaphone.

    Rosario, clearly, is not an Applebee’s soul. She’s Michelin-level in a world eager to eat microwaved Hot Pockets. Of course her heart sinks when classmates settle for fast-food literacy. I want to tell her that if she were in high school in the 1970s, she’d still witness an appetite for shortcut learning. The tools would be different, the essays less slick, but the gravitational pull toward mediocrity would be the same. The human temptation to bypass difficulty is not technological—it’s ancestral. AI simply automates the old hunger.

  • A Slow-Motion Collapse: Reading The Emergency

    George Packer’s The Emergency has been marketed as a dystopian novel. I tried to resist reading it, but after hearing Packer discuss it with Andrew Sullivan—especially the idea that democracies die not from foreign invasion but from self-inflicted wounds—I felt compelled to give it a go. The book declares its thesis on page one: The Emergency is a fading empire that decays slowly at first and then all at once. The world people once recognized disintegrates into something unthinkable. A population that once shared a common reality through the Evening Verity now lives in fractured, dopamine-soaked silos dominated by tribal influencers. The country divides into two warring classes: the educated Burghers in the cities and their resentful counterparts, the Yeomen in the hinterlands.

    In the opening chapter, this polarization erupts into “street fighting,” looting, the disappearance of law enforcement, and the flight of the ruling elite from the capital. Dr. Rustin delivers this bleak news to his family over dinner. His daughter Selva’s first concern is whether the unrest will interrupt her academic trajectory. She has worked relentlessly to climb to the top of her class, and the thought of a civil conflict jeopardizing her college prospects strikes her as the height of unfairness. In a single scene, Packer exposes the insularity of the laptop class—how they can read about national collapse yet continue to focus unblinkingly on résumé-building.

    Rustin shares his daughter’s blind spot. He believes his rationality and status shield him from whatever chaos brews outside their comfortable home, so he heads to the Imperial College Hospital as if nothing has changed. But when he arrives, he finds a skeleton staff, no leadership, and a pack of teenage looters closing in on the building, shouting about reclaiming a city stolen from them by Burghers. Their anger echoes the real-world contempt for Boomers—our generation’s hoarding of wealth, property, and opportunity, and the young’s belief that the American Dream was stolen and the ladder kicked away. The looters are led by Iver, a young man who once sat beside Selva in school. Rustin learns Iver is desperate to get medicine for his mother, who can no longer access care in the collapsing system. The gang consists of young men who failed in school and have no future—Hoffer’s True Believers in the flesh, clinging to nihilism because it’s the only story left to them.

    Their attempted looting is half-hearted; they’re too exhausted to fully ransack the hospital. Rustin placates them by promising free medical care for Iver’s mother. The moment marks a turning point for him. He once believed Burghers and Yeomen could coexist if they simply treated each other with decency, a kind of soft humanism. But Chapter One hints that civility may be dead—that the Burghers have grown complacent, valuing comfort more than democracy, drifting toward Nietzsche’s Last Man: a class so lulled by ease that it failed to maintain the institutions holding the nation together.

    It’s a bruising first chapter. As Andrew Sullivan noted, the novel “hits too close to home.” The subject matter is painful, but its resonance is undeniable. Though I haven’t been a diligent novel reader for over a decade, this one has enough voltage to keep me turning pages.

  • The Word of the Year Points to the Collective Loss of Our Minds

    The Word of the Year is supposed to capture the moment we’re living in—our collective mood, our shared madness. As Amogh Dimri explains in “Rage Bait Is a Brilliant Word of the Year,” we’re no longer defined by reason or restraint but by whatever emotion the attention economy yanks out of us. Dimri reminds us that 2023 gave us rizz and 2024 bestowed brain rot. In other words, when our brains aren’t decomposing from endless scrolling, we’re wide awake and quivering with unhinged outrage. This may explain why I now hate driving more than folding laundry or going to the dentist. The roads are filled with people whose minds seem equal parts rotted and enraged—and the algorithms aren’t helping.

    Dimri cites the Oxford English Dictionary’s definition of rage bait as “online content deliberately designed to elicit anger” in order to goose traffic and juice engagement. An elegant description for something as crude as poking humanity’s collective bruise.

    Critics complain that Oxford’s online voting process indulges the very brain rot it warns us about, but I’m with Dimri. Oxford is right to acknowledge how digital speech shapes culture. Ignoring these terms would be like pretending smog doesn’t count as weather. Rage bait is influential because it packs the whole human condition—weakness, manipulation, and political dysfunction—into two syllables. And, as I’d add, it also produces drivers who treat the road like a demolition derby.

    As for predecessors, rage bait didn’t appear out of thin air. Vince McMahon practically drafted its blueprint decades ago. His wrestling empire ran on kayfabe, where performers wore the mask of rage so long they eventually believed it. Something similar has infected our online discourse. The performance swallowed the performer, and here we are—furious, fragmented, and algorithmically herded into traffic.

  • The Rotator Cuff, the Honda Dealership, and the Human Soul

    Life has a way of mocking our plans. You stride in with a neat blueprint, and the universe responds by flinging marbles under your feet. My shoulder rehab, for instance, was supposed to be a disciplined, daily ritual: the holy grail of recovering from a torn rotator cuff. Instead, after one enthusiastic session, both shoulders flared with the kind of throbbing soreness reserved for muscles resurrected from the dead (though after walking home from Honda, it occurred to me that my right shoulder soreness is probably the result of a tetanus shot). So much for the doctor’s handouts of broomstick rotations and wall flexions. Today, the new fitness plan is modest: drop off the Honda for service, walk two miles home, and declare that my workout. Tomorrow: to be determined by the whims of my tendons and sore muscles.

    Teaching is no different. I’ve written my entire Spring 2026 curriculum, but then I read about humanities professor Alan Jacobs—our pedagogical monk—who has ditched computers entirely. Students handwrite every assignment in composition books; they read photocopied essays with wide margins, scribbling annotations in ink. According to Jacobs, with screens removed and the “LLM demons” exorcised, students rediscover themselves as human beings. They think again. They care again. I can see the appeal. They’re no longer NPCs feeding essays into the AI maw.

    But then I remembered who I am. I’m not a parchment-and-fountain-pen professor any more than I’m a pure vegan. I am a creature of convenience, pragmatism, and modern constraints. My students live in a world of laptops, apps, and algorithms; teaching them only quills and notebooks would be like handing a medieval knight a lightsaber and insisting he fight with a broomstick. I will honor authenticity another way—through the power of my prompts, the relevance of my themes, and the personal narratives that force students to confront their own thoughts rather than outsource them. My job is to balance the human soul with the tools of the age, not to bury myself—and my students—in nostalgia cosplay.

  • Why Willpower Can’t Save You from the Snack Aisle

    After hearing some thoughtful interviews with journalist Julia Belluz and scientist Kevin Hall about their new book Food Intelligence: The Science of How Food Both Nourishes and Harms Us, and after hearing KCRW food expert Evan Kleiman praise the book, I broke down and decided to see whether the authors had any new insights into what I call humans’ mismanagement of eating. The book begins on a promising note: the authors observe that, like the rest of the animal kingdom, we are hard-wired with “food intelligence,” a natural-born instinct to regulate the quantity of what we eat and to target the foods our bodies crave for optimal nutrition. That instinctive connection with food went haywire in the twentieth century: “Many of us started to eat too much, and the wrong things, even when we didn’t want to. Obesity rates began rising, first in rich, Western, industrialized countries such as the United States, then elsewhere.” Between 1980 and today, the obesity rate has doubled in several countries. Seventy percent of American adults and a third of U.S. children are classified as overweight or obese. Obesity-related diseases such as type 2 diabetes kill over half a million Americans a year. Obesity-related health costs run into the trillions.

    One of the major reasons for this breakdown in our instinctive hardwiring to naturally eat well is our disconnection from food: how it’s grown, produced, and cooked. We are now addicted to factory-produced fat, sugar, and salt. 

    Shaming and the gospel of self-discipline don’t help, even though, as the authors point out, the wellness industry blames our failures at weight management on our own moral shortcomings: lack of willpower, gluttony, and sloth. The diet industry, the authors claim, asks the wrong questions when it asks what the best diet is and how people can lose weight. There are influencers who insist low-carb is best, for example, but the authors cite studies that contradict the claim: low-carb diets are no better than low-fat ones in the long term. Championing a so-called ultimate diet, the authors argue, misses the point. The more helpful question is this: “Why do we eat what we eat?” Their obsession with answering that question is what propelled them to write the book.

    The authors also explain why calories-in, calories-out fails as a surefire model for weight loss. The model is complicated, and eventually sabotaged, by the way the body reacts when we cut calories: the metabolism slows down, we burn fewer calories doing the same exercise than we did at first, and our hunger signals rebel and scream “Eat more!” Contrary to the cheery claims of the wellness industry, eating less and exercising more usually fails within a year.

    A more promising approach to weight management is avoiding ultra-processed foods. The more of these foods we eat, the less we are able to regulate our appetite, resulting in “a calorie glut” and weight-gain hell. But becoming food literate, replacing processed foods with whole foods, and learning to enjoy the exchange require time and resources that many people lack. Convenience and cost drive many Americans to processed food. “The root causes” of obesity, therefore, are structural. In the words of the authors: “It was never about us as individuals. Our food environment is wrecking us.” Our food environment is rewiring our brains to make us consume a calorie glut. It is making us overweight, sick, and unhappy. It is killing us.

    Don’t consult Food Intelligence for the simple call to eat like your great-grandmother did. Even that sentiment is based on myth, the authors point out. Your great-grandmother may have spent endless hours in the kitchen exhausted while struggling “with hunger and nutrient shortfalls.” 

    One of the book’s objectives is to show how “old, unproven ideas and outdated policies continue to guide our current thinking and approaches to food.” The authors make it clear early on that they won’t be pushing this or that diet or even promoting “clean eating.” If you’re looking for food puritanism, look elsewhere. Kevin Hall admits to eating ultra-processed food; Julia Belluz admits to eating too much sugar. The book is less about rigid prescriptions than about helping you change from a mindless eater into an intelligent one.

  • The Flim-Flam Man of Higher Ed

    In the summer of 2025, the English Chair—Steve, a mild-mannered, hyper-competent saint of a man—sent me an email that sounded innocuous enough. Would I, he asked, teach a freshman writing course for student-athletes? It would meet two mornings a week, two hours a session. The rest of my load would stay online. I should have known from the soft tone of his message that this was no ordinary assignment. This was a CoLab, an experimental hybrid of academic optimism and administrative wishcasting.

    The idea was elegant on paper: gather athletes into one class, surround them with counselors and coaches, raise retention rates, and call it innovation. Morale would soar. Grades would climb. The athletes would have a “safe space,” a phrase that always sounds like a promise from someone who’s never had reality punch their teeth in. Through the magic of cross-departmental communication, we’d form a “deep network of student support.” It all sounded like a TED Talk waiting to happen.

    Morning classes weren’t my preference. I usually reserved that time for my kettlebell ritual—my secular liturgy of iron and sweat—but I said yes without hesitation. Steve had earned my respect long ago. A decade earlier, we’d bonded over Dale Allison’s Night Comes, marveling at its lucidity on the afterlife. You don’t forget someone who reads eschatology with humility and enthusiasm. So when Steve asked, it felt less like a request than a summons.

    And yes, I’ll admit it: the offer flattered me. Steve knew my past as an Olympic weightlifter; the remnant coach swagger in my stride was visible even at sixty-three. I imagined myself the perfect fit—a grizzled academic with gym cred, able to command respect from linemen and linebackers. I said yes with gusto, convinced I was not just teaching a class but leading a mission.

    Soon enough, the flattery metastasized into full-blown delusion. I stalked the campus like a self-appointed messiah of pedagogy, convinced destiny had personally cc’d me on its latest memo. To anyone within earshot, I announced my divine assignment: to pilot a revolutionary experiment that would fuse intellect and biceps into one enlightened organism. I fancied myself the missing link between Socrates and Schwarzenegger—a professor forged in iron, sent to rescue education from the sterile clutches of the AI Age. My “muscular, roll-up-your-sleeves” teaching style, I told myself, would be a sweaty rebuke to all that was algorithmic, bloodless, and bland.

    The problem with self-congratulation is that it only boosts performance in the imagination. It blunts the discipline of preparation and tricks you into confusing adrenaline for authority. I wasn’t an educational pioneer—I was a man on a dopamine binge, inhaling the exhaust of my own hype. Beneath the swagger, there was no scholarship, no rigor, no plan—just the hollow hum of self-belief. I hadn’t earned a thing. Until I actually taught the class and produced results, my so-called innovation was vaporware. I was a loudmouth in faculty khakis, mistaking vanity for vocation. Until I delivered the goods, I wasn’t a trailblazer—I was the Flim-flam Man of Higher Ed, peddling inspiration on credit.

    Forgive me for being so hard on myself, but after thirty-eight years of full-time college teaching, I’ve earned the right to doubt my own effectiveness. I’ve sat in the back of other instructors’ classrooms during evaluations, watching them conduct symphonies of group discussions and peer-review sessions with the grace of social alchemists. Their students collaborate, laugh, and somehow stay on task. Mine? The moment I try anything resembling a workshop, it devolves into chatter about weekend plans, fantasy football, or the ethics of tipping baristas. A few students slink out early as if the assignment violated parole. I sit there afterward, deflated, convinced I’m the pedagogical equivalent of a restaurant that can’t get anyone to stay for dessert.

    I’ve been to professional development seminars. I’ve heard the gospel of “increasing engagement” and “active learning.” I even take notes—real ones, not the doodles of a man pretending to care. Yet I never manage to replicate their magic. Perhaps it’s because I’ve leaned too heavily on my teaching persona, the wisecracking moralist who turns outrage into a stand-up routine. My students laugh; I bask in the glow of my own wit. Then I drive home replaying the greatest hits—those sarcastic riffs that landed just right—while avoiding the inconvenient truth: humor is a sugar high. It keeps the crowd awake, but it doesn’t build muscle. Even if I’m half as funny as I think I am, comedy can easily become a sedative—a way to distract myself from the harder work of improvement.

    Measuring effectiveness in teaching is its own farce. If I sold cars, I’d know by the end of the quarter whether I was good at it. If I ran a business, profit margins would tell the story. But academia? It’s all smoke and mirrors. We talk about “retention” and “Student Learning Outcomes,” but everyone knows the game is rigged. The easiest graders pull the highest retention numbers. And when “learning outcomes” are massaged to ensure success, the data becomes a self-congratulatory illusion—a bureaucratic circle jerk masquerading as accountability.

    The current fetish is “engagement,” a buzzword that’s supposed to fix everything. We’re told to gamify, scaffold, diversify, digitize—anything to keep students from drifting into their screens. But engagement itself has become impossible to measure; it’s a ghost we chase through PowerPoint slides. My colleagues, battle-scarred veterans of equal or greater tenure, tell me engagement has fallen off a cliff. Screens have rewired attention spans, and a culture that prizes self-esteem over rigor has made deep learning feel oppressive. Asking students to revise an essay is now a microaggression.

    So yes, I question my value as an instructor. I prepare obsessively, dive deep into my essay topics, and let my passion show—because I know that if I don’t care, the students won’t either. But too often, my enthusiasm earns me smirks. To many of my students, I’m just an eccentric, goofy man who takes this writing thing way too seriously. Their goal is simple: pass the class with minimal friction. The more I push them to care, the more resistance I meet, until the whole enterprise starts to feel like an arm-wrestling match.

    Until I find a cure for this malaise—a magic wand, a new pedagogy, or divine intervention—I remain skeptical of my own worth in the classroom. I do my best, but some days that feels like shouting into a void lined with smartphones. So yes, I’ll say it again for the record: I am the Flim-Flam Man of Higher Ed, hawking sincerity in an age that rewards performance.

  • The Martyrdom Industrial Complex: When Suffering Became a Writing Prompt

    In On Writing and Failure, Stephen Marche takes a blowtorch to the cult of performative victimhood that now masquerades as literary courage. He sneers at our age of “creative content,” where grievance is flexed like a gym muscle and every paper cut becomes an epic of resilience. “The writing store,” he notes mordantly, “always has victims in stock—a pile of mangled corpses, testicles and tongues slivered onto blood floors, shot at dawn for no reason.” It’s a grotesque bazaar of self-pity, and business is booming.

    Marche insists that the only genuine heroism for a writer is aesthetic integrity—the lonely grind of showing up, enduring rejection, and battering down the door through sheer persistence. He dispels the romantic myth that suffering ennobles or sanctifies creativity. There is, he reminds us, “no dignity in poverty,” and martyrdom doesn’t confer artistic credibility. Yet even he concedes that “some of the greatest works rise out of the worst horrors. Glories stroll out of burning buildings.” The difference, one suspects, lies in what the artist does with the fire, not how loudly they scream about being scorched.

    Grandiosity, not suffering, is the real contagion. Turning agony into performance is vanity with a halo. The more honest path is quieter: stoic endurance, the disciplined refusal to narrate your misery until the work itself transforms it. True writers don’t audition for sainthood—they keep typing while the room burns.

  • The Quiet Treason of Thinking Too Deeply

    In his book On Writing and Failure, Stephen Marche doesn’t sugarcoat it: “The world does not particularly like writers.” In his view, writers are not charming dinner-party ornaments; they are historical irritants — heretics with ink-stained fingers, routinely exiled, jailed, ignored, or simply brushed into the dustbin of obscurity. And this contempt isn’t reserved for the hapless mid-lister with a Substack and a dream. It extends to the luminous few — the geniuses who write lines that shimmer like heat on desert asphalt — only to find the world shrugging, distracted by cat videos and whatever spectacle the algorithm serves next.

    Marche’s argument often reaches back to the days before Gutenberg’s press — when papyrus was a luxury and literacy a secret handshake — which suggests his “writers” are really thinkers: people who take the world apart in their heads, then rearrange it into something truer, sharper, and inevitably more threatening. Writing, he implies, isn’t just putting words on paper; it’s a habit that thickens thought until it becomes dangerous. If your writing never agitates the complacent, unnerves the powerful, or at least startles your own reflection, then you’re not writing — you’re performing a soothing ritual, like watching ASMR whispers until you drool into your pillow. That isn’t literature; it’s anesthesia.

  • The Trilemma, the Mythmaker, and the Mad Apostle

    C.S. Lewis is famous for the “trilemma” he poses to frame the true nature of Jesus. He argues you have three choices: Jesus claims to be God because he is insane; Jesus claims to be God knowing the claim is untrue, asserting it with malevolent intent because he is devilishly dishonest; or Jesus’ claim to be God is true. Lewis argues that the common fourth scenario is not permitted by the trilemma: you can’t say Jesus is merely a nice guy whose wisdom encourages us all to be wise and to love one another. I call this the “Hippy Jesus” scenario.

    While I see Lewis’ insight and honesty in refusing a patronizing view of Jesus and the high-stakes claims he makes about salvation and living an abundant life, I’m not so sure the trilemma is that unique or groundbreaking. It applies to all competing religions, each of which claims to be different from its rivals and the “best” of them all. Either these religions and their advocates are crazy, cynical, or telling the truth.

    The same goes for St. Paul. Either he was a madman, a lying cynic, or a truth-teller. 

    Reading Hyam Maccoby’s The Mythmaker: Paul and the Invention of Christianity, one quickly sees that Maccoby regards Paul as both mad and cynical, a conniving narcissist with grand ambitions to head a religious movement regardless of how many people he has to step on. Much of Maccoby’s book is speculation and personal interpretation: Paul was not really a Pharisee. Paul remade Jesus from a champion of political liberation into an otherworldly figure. Jesus, a Pharisee himself, would have been offended by Paul’s notion of a divine Christ, since Jesus understood the Christ title as a royal one, a “god-king,” that defined his Jewishness. Paul absolved the Romans of all blame for Jesus’ execution and placed it entirely on the Jews. The Pharisees had sympathy for Jesus and the Nazarenes in general and would not have persecuted them; that persecution is an antisemitic myth in the New Testament, designed to create a new religion based on misrepresentation. And Paul’s rhetoric is so flawed that he comes off as a hack whose epistles lack the trademark style of Pharisee training.

    The unity between Paul and the early Jerusalem church portrayed in Acts is a “sham.” The New Testament was made by authors who had given up on the Jews and were writing for a new audience, gentiles; the writings are therefore aimed at “the anathematization of the Jews.” Maccoby also argues that there is solid evidence of a competing Christianity in the first few centuries, that of the Ebionites, a theology free of the poison of Paul.

    Maccoby’s critics have pointed out that much of the book is speculation and lacks conventional scholarly credibility. They also observe that Maccoby, ironically writing in a Pauline persona, harbors such acrimony toward Paul that he builds a villain, then contorts and cherry-picks evidence and speculation to put flesh and bone on a character who is more literary creation than historical figure. In Maccoby’s view, Paul is not a truth-teller. In the context of the trilemma, he is a mix of madman, conniving liar, and mythmaker.

    I have mixed feelings about Maccoby’s book. Part of me sees the evidence-free speculation and the fictive elements in Maccoby’s writing, but one claim remains convincing: Christianity is a supersessionist religion. Because it claims to replace Judaism, Christianity must be examined in terms of the trilemma: either its writers are sincere albeit mad, or they are fibbing and fabricating with a grand ambition in mind, or they are telling the truth.

    Examining Paul in the context of the trilemma becomes most compelling in Maccoby’s final chapter, “The Mythmaker.” Maccoby writes that Paul is not so much a thinker whose writings give us definitive notions of free will, predestination, original sin, and the Trinity; rather, Paul “had a religious imagination of the highest order” and is less a theologian than a “mythologist.” Consumed by his religious imagination, Paul was surely sincere in many of his writings. But of course the unconscious can play games on all of us. It has its own agenda: to unfold wish fulfillment and to satisfy deeply rooted needs for validation, love, and even power.

    Whereas Maccoby sees Jesus as someone who wanted to fulfill his role in the Jewish religion, Paul saw Jesus differently: someone who conformed to the new religion that spun from Paul’s frenzied, often brilliant imagination. Just as Hamlet is a creation of Shakespeare, Jesus is a creation of Paul.  

    Paul wrote a new story the world had never seen, a Pauline myth of “the descent of the divine saviour.” Maccoby writes: “Everything in the so-called theology stems from this: for since salvation or rescue comes from above, no efficacy can be ascribed to the action or initiative of man.” We must abandon all other hope for the salvation of mankind and look only to the saviour who has descended to rescue us.

    The Descending Saviour myth contains “narrative elements.” We live in a binary world of Above and Below, Light and Darkness. We live in a dark hellscape and must be rescued. The human condition is depraved. We are prisoners to sin and darkness and must be saved from the powers of Evil. We cannot, like Sam Harris, meditate and live a life of contemplation because such contemplation will cause us to surrender more to the evil inside of us. Harris’ solitary meditations may be a road to divinity for him, but for Paul, they pave a road to hell. 

    According to Maccoby, Paul’s myth turns the story of Adam and Eve’s expulsion from Paradise into an extreme, binary view of sin that deviates from “its traditional Jewish exegesis.”

    Paul’s extreme views lead him to see sex as a morbid affliction. Incapable of celebrating sex as part of a fulfilling and healthy life, he can see it only through a prism of pinch-faced hostility and skepticism.

    In Paul’s myth, Paul himself is a conduit for divine messages and visions, and his writings are presented to us with the imprimatur of God. In contrast, the Old Testament is a downgrade: not written by God but curated by angels. In this comparison, Paul is superior to Judaism. In Maccoby’s view, Paul’s self-aggrandizement amounts “to wholesale usurpation of the Jewish religio-historical scheme.”

    The Jewish way to salvation was for all of humanity to work on expunging “the evil inclination” discussed by the prophet Ezekiel. In the Pauline way, only a rescuer from above can remove this evil inclination. But Maccoby writes that the solution to sin and evil is more sophisticated and subtle than Paul can understand, perhaps because he is so absorbed by his own religious imagination. What Paul cannot understand is this: the rabbis say in the Mishnah, “Better is one hour of repentance and good works in this world than the whole life of the world to come; and better is one hour of bliss in the world to come than the whole life of this world.” Such a view requires a balanced sense of the human condition, but Paul, in Maccoby’s eyes, is too consumed by “adolescent despair and impatience for perfection” (Paul sounds an awful lot like me in this regard). The rabbis argue that the point of life is to struggle, and that the struggle matters more than the reward. Paul is not in this camp: “For Paul, the reward has become the indispensable substitute for the struggle, which he regards as hopeless and, therefore, pointless.”

    Maccoby rejects Paul’s salvation-by-faith model. You don’t just become a believer and enjoy instant salvation like Tang mixed with water. Maccoby writes: “People who are supposed to be ‘saved’ behave, unaccountably, just as badly as before they were saved, so that law has to be reintroduced to restrain them. Also, there are always logically minded people to say that if they are ‘saved,’ [they may indulge in] all behaviour that happens to appeal to them (such as sexual orgies or murder) in the confidence that nothing they do can be wrong. In other words, by being ‘saved,’ people may behave worse instead of better.”

    According to Maccoby, Paul’s mythmaking was born of “adolescent despair and impatience.” In his scramble to construct a religion that would satisfy his psychological needs, Paul combined Gnosticism, the mystery religions of human sacrifice known as blood cults, and Judaism. These were the three major tools in his religious toolbox, from which he jerry-rigged a new religion that would dominate the world. In borrowing from Judaism, Paul took the idea of promises made to a chosen people and transferred the election from the Jews to the gentiles. This brilliant maneuver made Christianity more appealing and marketable.

    The most damning criticism Maccoby levels at Paul’s new religion is the accusation that Paul is the chief author of antisemitism, “which eventually produced the medieval diabolization of the Jews, evinced in the stories of the ‘blood libel’ and the alleged desecration of the Host.” In Maccoby’s phrase, Paul’s myth casts the Jews as the “sacred executioner.” Paul himself writes that the Jews “are treated as enemies for your sake.”

    Antisemitism is integral to Paul’s greatest “fantasist” element of mythmaking: deifying Jesus and making his death “into a cosmic sacrifice in which the powers of evil sought to overwhelm the power of good, but, against their will, only succeeded in bringing about a salvific event. This also transforms the Jews, as Paul’s writings indicate, into the unwitting agents of salvation, whose malice in bringing about the death of Jesus is turned to good because this death is the very thing needed for the salvation of sinful mankind.” In creating his new religion, Maccoby argues, Paul’s mythmaking contained “an incentive to blacken the Jewish record in order to justify the Christian take-over of the Abrahamic ‘promises.’”

    Maccoby argues that Paul’s new religion has been a mixed bag: “The myth created by Paul was thus launched on its career in the world: a story that has brought mankind comfort in its despair, but has also produced plentiful evil.” 

    In this view, how do we assess the trilemma in evaluating Paul? Maccoby says Paul produced his religion out of “despair and agony,” which is to say from the torment of his inner being, a contrast to the Christian belief that Paul was animated by divine messages and visions. Paul’s “character was much more colourful than Christian piety portrays it; his real life was more like a picaresque novel than the conventional life of a saint. But out of the religious influences that jostled in his mind, he created an imaginative synthesis that, for good or ill, became the basis of Western culture.” Therefore, Paul is partly mad, a man consumed by his religious despair, and partly power-hungry, a man who seeks to create a new religion to assuage his torment and to universalize his sense of despair and salvation so the rest of the world can share in it. 

    Is Maccoby’s portrait of Paul convincing? My current take is this: we have to take some of Maccoby’s judgments more seriously than others. Some of his narratives and psychological portrayals of Paul seem like mythmaking on Maccoby’s part, and perhaps Christianity is more complex, more mysterious, and less conspiratorial than he wants us to believe. But perhaps there were conflicting agendas in the making of Christianity, and the Jews were unfairly portrayed. In that regard, Maccoby may be on to something, and he has contributed much to how we understand the way religions are made and the way antisemitism was born.