Blog

  • No Backup World: Martin Hägglund, C.S. Lewis, and the Moral Urgency of Now


    Philosopher Martin Hägglund, in This Life: Secular Faith and Spiritual Freedom, advances a stark and unsettling claim: genuine goodness is impossible unless we accept that death is final. There is no afterlife to balance the books, no celestial extension cord supplying meaning from beyond the grave. This life—finite, fragile, irrevocable—is all we have. Faith in eternity, Hägglund argues, is not a comfort but a distraction, a metaphysical detour that siphons urgency away from the hard, unglamorous work of building justice here and now.

    To make his case, he turns to an unlikely witness: C.S. Lewis. In A Grief Observed, written after the death of his wife Joy Davidman, Lewis—Christian apologist, defender of heaven—finds his theology torn open by loss. Scripture offers no shelter. Promises of eternal reunion ring thin. Lewis admits to “bitter resentment,” to madness, to a grief so absolute that it flattens piety on contact. What he wants is not God, not eternity, not consolation—but Joy. Her absence exposes a truth Lewis cannot escape: the intensity of love is inseparable from its fragility. Love hurts because it can be lost. Its power comes from time running out.

    Hägglund presses the implication Lewis cannot fully accept: even if eternity existed, love could not survive there. With no stakes, no risk, no irreversibility, existence would congeal into something inert—an endless, consequence-free duration. Heaven, in this view, is not fulfillment but sedation. To imagine God as a valet who merely returns our loved ones to us is, for both Lewis and Hägglund, a form of idolatry. But where Lewis is torn—desperate to hold faith and grief in the same trembling hands—Hägglund feels no such strain. For him, religion does not deepen love; it dilutes it. It shifts responsibility elsewhere. It turns this world into a waiting room and this life into a rehearsal. Secular living, by contrast, is an act of commitment without backup plans. There is no “later” to fix what we neglect now. That is precisely why what we do here matters so much.

    If you are a political-sapien, this conclusion feels not bleak but bracing. History—not heaven—is where salvation must be worked out. There is no eternal kingdom hovering offstage, no divine reset button waiting beyond the clouds. This world is the only stage, and its outcomes depend on the quality of the institutions we build and maintain. Moral authority does not descend from above; it emerges from human reason struggling, imperfectly but persistently, toward fairness. People, in this view, are not saints or sinners by nature so much as products of systems—capable of decency when the scaffolding is sound, capable of cruelty when it is not. Politics therefore becomes the highest moral labor: not a sideshow to spiritual life but the arena in which justice either materializes or fails. AI machines enter this worldview as probationary instruments. They are not saviors and not demons. They earn trust only insofar as they distribute power downward, widen access, and reduce structural inequity. If AI flattens hierarchies and democratizes opportunity, it is a tool worth refining. If it concentrates wealth, authority, and decision-making into fewer hands, it ceases to be innovation and becomes a threat—something to regulate, constrain, or dismantle in defense of the only life that counts.

  • Pleasure Island Goes Digital: The Rise of the Hedonist-Sapien


    Eric Weiner’s The Geography of Bliss introduces one of the great cautionary silhouettes of modern travel writing: the Farang. In Thailand, the word names a particular species of Western pleasure-seeker—less tourist than residue. He is instantly legible. White. Middle-aged. Soft around the middle and hard around the eyes. His skin has that sickly, overcooked hue of someone who hasn’t slept, eaten, or hoped correctly in years. He lurches down the beach in baggy, sweat-darkened clothes, face glazed, spirit half-evacuated, as if his soul stepped out for air and never came back. What keeps him upright is not health or purpose but a wallet swollen with cash. When the money runs out, so does the illusion. He staggers home to recover, refill, and then returns to the same tropical oubliette to repeat the cycle—debauch, deplete, disappear, reload. Pleasure, in this form, isn’t joy. It’s maintenance.

    Pinocchio offers its own version of the Farang, with less sunscreen and more clarity. Linger too long on Pleasure Island and you don’t just lose your moral compass—you sprout hooves. The boy becomes a jackass. The metaphor is unsubtle and mercifully so. Pleasure without restraint doesn’t make you free; it makes you stupid, loud, and increasingly unrecognizable to yourself. World culture is thick with such warnings. Myths, fables, scriptures, novels—whole libraries exist to explain why worshiping pleasure ends badly. And yet hedonism has never been more ascendant, because unlike virtue, it scales beautifully. It drives commerce. It sells. Entire industries are built on the catechism that pleasure is the highest good and discomfort the only sin. Billions now live as functional hedonists without ever using the word, while a more self-aware elite practices a refined variant—“mindful hedonism”—spending staggering sums to curate lives of seamless comfort, aesthetic indulgence, and moral exemption. In a culture that treats pleasure as proof of success, the hedonist-sapien is not a cautionary tale; he is a lifestyle influencer.

    If you are a hedonist-sapien, your gaze turns inward, away from society’s disputes and toward the sovereign self. The highest goods are pleasure, freedom, and the smooth, uninterrupted unfolding of personal desire. Politics is static. Transcendence is optional. What matters is feeling good, living well, and sanding down every rough edge along the way. Into this worldview step the AI machines, not as ethical dilemmas but as gift baskets. They curate taste, anticipate desire, eliminate friction, and turn effort into an avoidable inconvenience. They promise a life where comfort feels earned simply by existing, and choice replaces discipline. Of the four orientations, the hedonist-sapien greets AI with the least suspicion and the widest grin, welcoming it as the perfect accomplice in the lifelong project of maximizing pleasure and minimizing resistance—Pleasure Island, now fully automated.

  • The Gospel of Optimization: Inside the Mind of the Techno-Sapien


    A techno-sapien is a person for whom the highest good is not pleasure, piety, justice, or even happiness, but optimization. Optimization is their religion, their aesthetic, their moral compass. They worship metrics. Steps. Biomarkers. Load times. Longevity curves. With the right stack of technologies, they want the perfect body, the frictionless mind, and—why think small?—eternal life. Death, in this worldview, is not a mystery or a boundary but a bug to be patched. Meaning and purpose are sentimental leftovers from an unoptimized age. To be fully optimized is to win, and winning, for the techno-sapien, is always zero-sum. I have the toys. I have the data. I have the escape plan. You don’t.

    On some level, they know this worldview won’t poll well. So while they preach abundance, they prepare for scarcity. They sink billions into subterranean bunkers stocked with caviar, rare vodka, cryotherapy chambers, Peloton-grade gyms, and spa lighting calibrated to simulate “circadian wellness” at the end of civilization. These aren’t shelters; they’re luxury arks—underground city-states that make Atlantis look like a Chevron off the interstate. If the world burns, they intend to ride it out at optimal humidity.

    Publicly, however, the techno-sapien sings a very different hymn. On the TED stage and at product launches, they serve a thick glaze of uplift: their latest platform will “connect humanity,” “democratize knowledge,” and “unlock new vistas of understanding.” It’s a greatest-hits album of moral aspiration, performed with a straight face. The tell, of course, is how quickly the music stops. At the first whiff of unrest—one protest, one spike in instability—they are a single remote-control click away from disappearing into the earth, leaving the rest of humanity to enjoy the togetherness from above.

  • AI as Tool, Toy, or Idol: A Taxonomy of Belief


    Your attitude toward AI machines is not primarily technical; it is theological—whether you admit it or not. Long before you form an opinion about prompts, models, or productivity gains, you have already decided what you believe about human nature, meaning, and salvation. That orientation quietly determines whether AI strikes you as a tool, a toy, or a temptation. There are three dominant postures.

    If you are a political-sapien, you believe history is the only stage that matters and justice is the closest thing we have to salvation. There is no eternal kingdom waiting in the wings; this world is the whole play, and it must be repaired with human hands. Divine law holds no authority here—only reason, negotiation, and evolving ethical frameworks shaped by shared notions of fairness. Humans, you believe, are essentially good if the scaffolding is sound. Build the right systems and decency will follow. Politics is not mere governance; it is moral engineering. AI machines, from this view, are tools on probation. If they democratize power, flatten hierarchies, and distribute wealth more equitably, they are allies. If they concentrate power, automate inequality, or deepen asymmetry, they are villains in need of constraint or dismantling.

    If you are a hedonist-sapien, you turn away from society’s moral drama and toward the sovereign self. The highest goods are pleasure, freedom, and self-actualization. Politics is background noise; transcendence is unnecessary. Life is about feeling good, living well, and removing friction wherever possible. AI machines arrive not as a problem but as a gift—tools that streamline consumption, curate taste, and optimize comfort. They promise a smoother, more luxurious life with fewer obstacles and more options. Of the three orientations, the hedonist-sapien embraces AI with the least hesitation and the widest grin, welcoming it as the ultimate personal assistant in the lifelong project of maximizing pleasure and minimizing inconvenience.

    If you are a devotional-sapien, you begin with a darker diagnosis. Humanity is fallen, and no amount of policy reform, pleasure, or purchasing power can make it whole. You don’t expect salvation from governments, markets, or optimization schemes; you expect it only from your Maker. You may share the political-sapien’s concern for justice and enjoy the hedonist-sapien’s creature comforts, but you refuse to confuse either with redemption. You are not shopping for happiness; you are seeking restoration. Spiritual health—not efficiency—is the measure that matters. From this vantage, AI machines look less like neutral tools and more like idols-in-training: shiny substitutes promising mastery, insight, or transcendence without repentance or grace. Unsurprisingly, the devotional-sapien is the most skeptical of AI’s expanding role in human life.

    Because your orientation shapes what you think humans need most—justice, pleasure, or redemption—it also shapes how you use AI, how much you trust it, and what you expect it to deliver. Before asking what AI can do for you, it is worth asking a more dangerous question: what are you secretly hoping it will save you from?

  • Too Much RAM, Not Enough Transcendence


    At sixty-four, time no longer strolls; it sprints, and I feel myself shrinking as it passes. Not dramatically—no tragic collapse—just a steady narrowing. Fewer friends than before. A smaller social orbit. My internal clock drifting farther out of sync with my wife’s and daughter’s, who are younger, livelier, and still tuned to daylight. They love me and make heroic efforts to lure me out of my cave, but by eight o’clock I’m asleep in the back seat, hibernating like a cartoon grizzly bear who misunderstood the invitation.

    Part of the shock is how badly my expectations were mis-set. I grew up marinated in television commercials that catechized me into a childish theology of consumerism: play by the rules, buy the right things, and you’ll be lifted onto a magic carpet of perpetual happiness and glowing health. The American Dream, as advertised, looked frictionless and eternal. Paradise was a purchase away. Then generative AI arrived and supercharged the fantasy. I didn’t just get a magic carpet—I became the magic carpet. Like Superman, I could optimize myself endlessly. If immortality wasn’t on the table, surely a close approximation was.

    And yet here I am. The house is nearly paid off in a premium Southern California neighborhood. My computer has more SSD, RAM, and CPU than I could have imagined as a kid. AI tools respond instantly, obedient and tireless. And still—no glory. No transcendence. Even my healthcare provider got in on the myth, emailing me something grandly titled “Your Personal Action Plan.” I arrived at the doctor’s office expecting revelation. He handed me a cup and asked for a urine sample.

    The gap between the life I was promised by the digital age and the life I’m actually living is soul-crushing in its banality. So I retreat to a bowl of steel-cut oats, drowned in prunes, molasses, and soy milk. It’s not heroic. It’s not optimized. But it’s warm, predictable, and faintly medicinal. “At least I’m eating clean,” I tell myself—clinging to this small, beige consolation as proof that even if the magic carpet never showed up, I can still manage a decent breakfast.

    Like millions before me, I have allowed myself to fall into the Optimization Afterlife Fantasy: the belief that continuous self-improvement, technological upgrades, and algorithmic assistance can indefinitely postpone decline and approximate transcendence in a secular age. It replaces older visions of salvation with dashboards, action plans, and personalized systems, promising that with enough data, discipline, and tools, one can out-optimize aging, finitude, and disappointment. The fantasy thrives on the language of efficiency and control, encouraging the illusion that mortality is a solvable design flaw rather than a human condition. When reality intrudes—through fatigue, misalignment, or the body’s quiet refusals—the fantasy collapses, leaving behind not enlightenment but a sharper awareness of limits and the hollow ache of promises made by machines that cannot carry us past time.

  • I Trained an AI Named Rocky—and Still Got Fat


    Your life, in brief, is going to bed mildly furious at yourself because you once again ate more than you meant to. Maybe twice in your entire adult existence you lay there, hands folded, whispering, “Well done, child,” as if discipline were a rare celestial event. Then you notice your friends consulting their generative AI oracles for diet wisdom, and you think, Why not me? You christen your chatbot Rocky, because nothing says accountability like a fictional personal trainer who can’t see you. Rocky obediently spits out hundreds of menus—vegan-ish, Mediterranean-leaning, low-calorie, high-protein, morally upright. You spend hours refining them, debating legumes, adjusting macros, basking in Rocky’s algorithmic approval. Rocky is proud of you. You feel productive. You feel serious.

    And yet, night after night, the same verdict arrives: you ate more than you intended to. Only now it hurts worse. Not only did you overeat, you also squandered hundreds of hours in earnest conversation with a machine that never once made you get on the exercise bike. You weren’t training—you were planning to train. You weren’t changing—you were curating the conditions under which change might someday occur. Congratulations: you’ve fallen into Optimization Displacement, the elegant self-deception in which planning replaces action and refinement masquerades as effort. Under its spell, complexity feels virtuous, engagement feels like work, and productivity theater substitutes for sweat. Optimization displacement is soothing because it offers control without discomfort, mastery without risk—but it quietly steals the time, resolve, and momentum required to do the one thing that actually works: getting up and pedaling.

    Fed up with dieting and your Rocky chatbot, you give up on your health quest and begin writing a memoir tentatively titled I Trained an AI Named Rocky—and Still Got Fat.

  • The Hidden Price of Digital Purity


    Digital Asceticism is the deliberate, selective refusal of digital environments that inflame attention, distort judgment, and reward compulsive performance—while remaining just online enough to function at work or school. It is not technophobia or a monkish retreat to the woods. It is targeted abstinence. A disciplined no to platforms that mainline adrenaline, monetize approval-seeking, and encourage cognitive excess. Digital asceticism treats restraint as hygiene: a mental detox that restores proportion, quiets the nervous system, and makes sustained thought possible again. In theory, it is an act of self-preservation. In practice, it is a social provocation.

    At some point, digital abstinence becomes less a lifestyle choice than a medical necessity. You don’t vanish entirely—emails still get answered, documents still get submitted—but you excise the worst offenders. You leave the sites engineered to spike adrenaline. You step away from social platforms that convert loneliness into performance. You stop leaning on AI machines because you know your weakness: once you start, you over-write. The prose swells, flexes, and bulges like a bodybuilder juiced beyond structural integrity. The result is a brief but genuine cleansing. Attention returns. Language slims down. The mind exhales.

    Then comes the price. Digital abstinence is never perceived as neutral. Like a vegan arriving at a barbecue clutching a frozen vegetable patty, your refusal radiates judgment whether you intend it or not. Your silence implies their noise. Your absence throws their habits into relief. You didn’t say they were living falsely—but your departure suggests it. Resentment follows. So does envy. While you were gone, people were quietly happy for you, even as they resented you. You had done what they could not: stepped away, purified, escaped.

    The real shock comes when you try to return. The welcome is chilly. People are offended that you left, because leaving forced a verdict on their behavior—and the verdict wasn’t flattering. Worse, your return depresses them. Watching you re-enter the platforms feels like watching a recovering alcoholic wander back into the liquor store. Your relapse reassures them, but it also wounds them. Digital asceticism, it turns out, is not just a personal discipline but a social rupture. Enter it carefully. Once you leave the loop, nothing about going back is simple.

  • Stir-Free Peanut Butter and the Slow Death of Self-Control


    Frictionless Consumption is the pattern by which ease replaces judgment and convenience overrides restraint. When effort is removed—no stirring, no waiting, no resistance—consumption accelerates beyond intention because nothing slows it down. What once required pause, preparation, or minor inconvenience now flows effortlessly, inviting repetition and excess. The danger is not the object itself but the vanished friction that once acted as a governor on behavior. Frictionless consumption feels like freedom in the moment, but over time it produces dependency, overuse, and decline, as appetite expands to fill the space where effort used to be. In eliminating difficulty, it quietly eliminates self-regulation, leaving users wondering how they arrived at excess when nothing ever felt like too much.

    ***

    For decades, I practiced the penitential ritual of mixing organic peanut butter. I wrapped a washcloth around a tablespoon for traction and churned as viscous globs of nut paste and brown sludge slithered up the sides of the jar. The stirring was never sufficient. No matter how heroic the effort, you always discovered fossilized peanut-butter boulders lurking at the bottom, surrounded by a moat of free-floating oil. The jar itself became slick, greasy, faintly accusatory. Still, I consoled myself with the smug glow of dietary righteousness. At least I’m natural, I thought, halo firmly in place.

    Then one day, my virtue collapsed. I sold my soul and bought Stir-Free. Its label bore the mark of the beast—additives, including the much-maligned demon, palm oil—but the first swipe across a bagel was a revelation. No stirring. No resistance. No penance. It spread effortlessly on toast, waffles, pancakes, anything foolish enough to cross its path. The only question that remained was not Is this evil? but Why did I waste decades of my life pretending the other way was better?

    The answer arrived quietly, in the form of my expanding waistline. Because peanut butter had become frictionless, I began consuming it with abandon. Spoonfuls multiplied. Servings lost their meaning. I blamed palm oil, of course—it had a face, a name, a moral odor—but the real culprit was ease. Stir-Free was not just a product; it was an invitation. When effort disappears, consumption accelerates. I didn’t gain weight because of additives. I gained weight because nothing stood between me and another effortless swipe.

    Large Language Models are Stir-Free peanut butter for the mind. They are smooth, stable, instantly gratifying, and always ready to spread. They remove the resistance from thinking, deliver fast results, and reward you with the illusion of productivity. Like Stir-Free, they invite overuse. And like Stir-Free, the cost is not immediately obvious. The more you rely on them, the more your intellectual core softens. Eventually, you’re left with a cognitive physique best described as a pencil-neck potato—bulky output, no supporting structure.

    The promise of a frictionless life is one of the great seductions of the modern age. It feels humane, efficient, enlightened. In reality, it is a trap. Friction was never the enemy; it was the brake. Remove it everywhere—food, thinking, effort, judgment—and you don’t get progress. You get collapse, neatly packaged and easy to spread.

  • Feedback Latency Intolerance


    Feedback Latency Intolerance is the conditioned inability to endure even brief gaps between action and response, produced by prolonged immersion in systems that reward instantaneous acknowledgment. Under its influence, ordinary delays—seconds rather than minutes—register as emotional disturbances, triggering agitation, self-doubt, or irritation disproportionate to the circumstance. The condition collapses temporal perspective, converting neutral waiting into perceived absence or rejection. What is lost is not efficiency but patience: the learned capacity to exist without immediate validation. Feedback latency intolerance reveals how algorithmic environments retrain emotional regulation, replacing mature tolerance for delay with a reflexive demand for constant confirmation.

    The extent of my deterioration revealed itself recently at a new pancake house. I took my daughter, asked the server what he actually liked on the menu, and obediently ordered the fried chicken biscuit sandwich. Then—already overplaying the moment—I texted my wife to announce my choice, as if this were actionable intelligence. I stared at my phone, waiting for the small red numeral to appear, the sacred 1 that would certify my existence. Forty seconds passed. Forty. I refreshed my screen like a lab rat pressing a lever, convinced something had gone wrong with the universe.

    In that absurd interval, it dawned on me: I had entered a state of pathological impatience, the natural byproduct of prolonged residence in the dopamine swamp of algorithmic life, where self-worth is measured by speed and volume of response. The sensation felt disturbingly familiar. My mind snapped back to stories my mother told about feeding me as a baby. The spoon, freshly loaded with mashed potatoes, would leave my mouth for a brief, necessary refill—and I would erupt in fury, unable to tolerate the unbearable injustice of the spoon’s absence. I screamed not from hunger, but from interruption. Sitting there in the pancake house, refreshing my phone, I realized I had simply upgraded the spoon. This is what too much time inside these machines does to a person: it doesn’t make you faster or smarter—it makes you an adult who can’t survive a forty-second gap between bites.

  • Stop Selling Books Like Vitamins: Reading as Pleasure, Not Duty


    Literary Vice names the framing of reading as a private, absorbing, and mildly antisocial pleasure rather than a civic duty or self-improvement exercise. It treats books the way earlier cultures treated forbidden novels or disreputable entertainments: as experiences that tempt, distract, and pull the reader out of alignment with respectable schedules, market rhythms, and digital expectations. Literary vice rejects the language of virtue—empathy-building, résumé enhancement, democratic hygiene—and instead emphasizes immersion, obsession, and pleasure for its own sake. As a countervailing force against technology-induced anhedonia, reading works precisely because it is slow, effortful, and resistant to optimization: it restores depth of attention, reawakens desire through sustained engagement, and reintroduces emotional risk in a landscape flattened by frictionless dopamine delivery. Where screens numb by over-stimulation, literary vice revives feeling by demanding patience, solitude, and surrender to a single, uncompromising narrative consciousness.

    ***

    Adam Kirsch’s essay “Reading Is a Vice” makes a claim that sounds perverse until you realize it is completely sane: readers are misaligned with the world. They miss its rhythms, ignore its incentives, fall out of step with its market logic—and that is precisely the point. To be poorly adapted to a cultural hellscape is not a bug; it is the feature. Reading makes you antisocial in the healthiest way possible. It pulls you off screens, out of optimization mode, and away from the endless hum of performance and productivity that passes for modern life. In a culture engineered to keep us efficient, stimulated, and vaguely numb, misalignment is a form of resistance.

    Kirsch notes, of course, that reading builds critical thinking, individual flourishing, and democratic capacity. All true. All useless as marketing slogans. Those are not selling points in a dopamine economy. No one scrolls TikTok thinking, “I wish I were more civically responsible.” If you want young people to read, Kirsch argues, stop pitching books as moral medicine and start advertising them as pleasure—private, absorbing, and maybe a little disreputable. Call reading what it once was: a vice. When literature was dangerous, people couldn’t stop reading it. Now that books have been domesticated into virtue objects—edifying, wholesome, improving—no one can be persuaded to pick one up.

    You don’t eat baklava because it’s good for you. You eat it because it is an indecent miracle of sugar, butter, and culture that makes the rest of the day briefly irrelevant. Books work the same way. There are baklava books. Yours might be Danielle Steel. Mine isn’t. Mine lives closer to Cormac McCarthy. When I was in sixth grade, my literary baklava was Herman Raucher’s Summer of ’42. That book short-circuited my brain. I was so consumed by the protagonist’s doomed crush on an older woman that I refused to leave my tent for two full days during a perfect Yosemite summer. While everyone else hiked through actual paradise, I lay immobilized by narrative obsession. I regret nothing. My body was in Yosemite; my mind was somewhere far more dangerous.

    This is why you don’t tell students to read the way you tell people to take cod liver oil or hit their protein macros. That pitch fails because it is joyless and dishonest. You tell students to read because finding the right book feels like dessert—baklava, banana splits, whatever ruins your self-control. And yes, you can also tell them what Kafka knew: that great writing is an ax that breaks the frozen sea inside us. Stay frozen long enough—numb, optimized, frictionless—and you don’t just stagnate. You risk not coming back at all.