Blog

  • Joan Is Awful—and So Is the Lie of Digital Empowerment

    Inverse Agency

    noun

    Inverse Agency describes the moment you mistake engagement with a machine for control. You click, customize, optimize, and curate, convinced you are authoring your own life, while the real authorship happens elsewhere—in systems engineered to harvest your attention, your data, your habits, and eventually your obedience. Everything feels frictionless and flattering. The menus are intuitive. The options feel abundant. But the architecture is doing the choosing. You feel active while being managed, sovereign while being sorted. The freedom is cosmetic. The agency is staged.

    This inversion works because it seduces rather than commands. Inverse agency flatters the ego so effectively that resistance feels unnecessary, even silly. The more certain you are that you’re “in charge,” the less likely you are to notice how your desires are being nudged, your behaviors rewarded, and your choices quietly narrowed. Power doesn’t bark orders; it smiles and asks permission. The result is a system that governs best by convincing its subjects they are free.

    No recent pop artifact captures this better than the Black Mirror episode “Joan Is Awful.” Joan believes she is finally taking control of her life through self-reinvention, only to discover she has handed herself over to an algorithm that strips her identity for parts and streams the wreckage as entertainment. The episode skewers the central lie of platform culture: that self-optimization equals self-authorship. Joan isn’t liberated; she’s monetized. Her personality becomes content, her choices become inputs, and her autonomy dissolves into a subscription model. What looks like empowerment is actually self-erasure with better lighting.

  • How We Declawed a Generation in the Name of Comfort

    Declawing Effect

    noun
    The gradual and often well-intentioned removal of students’ essential capacities for self-defense, adaptation, and independence through frictionless educational practices. Like a declawed cat rendered safe for furniture but vulnerable to the world, students subjected to shortened readings, over-accommodation, constant mediation, and AI-assisted outsourcing lose the intellectual and emotional “claws” they need to navigate uncertainty, conflict, and failure. Within protected academic environments they appear functional, even successful, yet outside those systems they are ill-equipped for unscripted work, sustained effort, and genuine human connection. The Declawing Effect names not laziness or deficiency, but a structural form of educational harm in which comfort and convenience quietly replace resilience and agency.

    ***

    In the early ’90s, a young neighbor told me that the widow behind his condo was distraught. Her cat had gone missing, and she didn’t expect it to come back alive. When I asked why, he explained—almost casually—that the cat had been declawed. He then described the procedure: not trimming, not blunting, but amputating the claws down to the bone. I remember feeling a cold, visceral recoil. People do this, he said, to protect their furniture—because cats will shred upholstery to sharpen the very tools evolution gave them to survive. The logic is neat, domestic, and monstrously shortsighted. A declawed cat may be well behaved indoors, but the moment it slips outside, it becomes defenseless. The price of comfort is vulnerability.

    Teaching in the age of AI, I can’t stop thinking about that cat. My students, too, are being declawed—educationally. They grow up inside a frictionless system: everything mediated by screens, readings trimmed to a few pages, tests softened or bypassed through an ever-expanding regime of accommodations, and thinking itself outsourced to AI machines that research, write, revise, and even flirt on their behalf. Inside education’s Frictionless Dome, they function smoothly. They submit polished work. They comply. But like declawed cats, they are maladapted for life outside the enclosure: a volatile job market, a world without prompts, a long hike navigated by a compass instead of GPS, a messy, unscripted conversation with another human being where nothing is optimized and no script is provided. They have been made safe for institutional furniture, not for reality. And here’s the part that unsettles me most: I work inside the Dome. I enforce its rules. I worry that in trying to keep students comfortable, I’ve become an enabler—a quiet accomplice in reinforcing the Declawing Effect.

  • AI Machines Spell the End of the Bulletproof Syllabus

    Instructional Humiliation
    noun

    The quiet, destabilizing experience instructors undergo when long-held authority, confidence, and pedagogical certainty collapse in the face of rapid technological change. Instructional humiliation is not the exposure of incompetence but the painful honesty of recognizing that one’s maps no longer match the terrain—often in full view of students who expect guidance. It arises when teachers must admit uncertainty after years of standing as figures of clarity and command, producing a sense of personal diminishment even as it reflects intellectual integrity. Unlike shame rooted in failure, instructional humiliation emerges from ethical transparency: the collision between professional identity and the unsettling reality that the future of education is no longer fully knowable.

    To be an effective instructor, you are expected to project confidence the way a seasoned closer projects certainty across a conference table. You stand before the class with assurance, boldness, and the practiced enthusiasm of a noble salesperson pitching ideas you genuinely believe will improve students’ lives—on the small scale of Student Learning Outcomes and on the larger scale of becoming people who can think, question, and locate their purpose in a chaotic world. You draw from a deep well of literature, history, politics, language, philosophy, geography, and religion, and that accumulated knowledge gives your voice weight. Rectitude is your currency. Students arrive disoriented and unformed; you provide structure, coherence, and a map. In a culture gone mad, you are supposed to be the adult in the room.

    Then AI machines arrive and quietly detonate the set. In recent years, they’ve scrambled the rules of higher education so thoroughly that honesty itself becomes destabilizing. Where you once spoke with earned certainty, you now have to admit—if you’re not lying—that you don’t fully know what comes next. You’re flummoxed. You’re standing at the gates of the unknown with no laminated syllabus for what’s on the other side. Your role feels provisional. Your plan no longer feels bulletproof. Your confidence collapses into something closer to agnosticism, and that shift carries a sting that feels suspiciously like humiliation. And yet you keep going. You walk your students into the fog not because you are heroic or noble, but because stopping is not an option. Clarity may no longer be guaranteed, but the struggle to wrestle meaning from the darkness remains the job.

  • Reproductive Incentive Conflict: Why College Rewards Appearances Over Depth

    Reproductive Incentive Conflict
    noun

    The tension that arises when the pursuit of long-term intellectual depth, integrity, and mastery competes with the immediate pressures of achieving economic and social status tied to reproductive success. Reproductive incentive conflict is most acute in environments like college, where young men intuit—often correctly—that mating markets reward visible outcomes such as income, confidence, and efficiency more reliably than invisible virtues like depth or craftsmanship. In such contexts, Deep Work offers no guaranteed conversion into status, while shortcuts, system-gaming, and AI-assisted performance promise faster, more legible returns. The conflict is not moral confusion but strategic strain: a choice between becoming excellent slowly or appearing successful quickly, with real social and reproductive consequences attached to each path.

    Chris Rock once sliced through the romance of meritocracy with a single joke about reproductive economics. If Beyoncé were working the fry station at McDonald’s, her attractiveness alone would not disqualify her from marrying Jay-Z. But reverse the roles—put Jay-Z in a paper hat handing out Happy Meals—and the fantasy collapses. The point is crude but accurate: in the mating market, men are judged less on raw appeal than on status, income, and visible competence. A man has to become something before he is considered desirable. It’s no mystery, then, why a young man entering college quietly factors reproductive success into his motivation. Grades aren’t just grades; they’re potential leverage in a future economy of attraction.

    Here’s where Cal Newport’s vision collides with reality. Newport urges Deep Work—slow, demanding, integrity-driven labor that resists shortcuts and defies easy metrics. Deep Work builds character and mastery, but it offers no guaranteed payout. It may lead to financial success, or it may not. Meanwhile, the student who bypasses depth with AI tools can often game the system, generating polished outputs and efficient performances that read as competence without the grind. The Deep Worker toils in obscurity while the system-gamer cashes visible wins. This creates a genuine tension: between becoming excellent in ways that compound slowly and appearing successful in ways that signal immediately. It’s not a failure of virtue; it’s a collision between two economies—one that rewards depth, and one that rewards display—and young men feel the pressure of that collision every time they open a laptop.

  • The Automated Pedagogy Loop Could Threaten the Very Existence of College

    Automated Pedagogy Loop
    noun

    A closed educational system in which artificial intelligence generates student work and artificial intelligence evaluates it, leaving human authorship and judgment functionally absent. Within this loop, instructors act as system administrators rather than teachers, and students become prompt operators rather than thinkers. The process sustains the appearance of instruction—assignments are submitted, feedback is returned, grades are issued—without producing learning, insight, or intellectual growth. Because the loop rewards speed, compliance, and efficiency over struggle and understanding, it deepens academic nihilism rather than resolving it, normalizing a machine-to-machine exchange that quietly empties education of meaning.

    The darker implication is that the automated pedagogy loop aligns disturbingly well with the economic logic of higher education as a business. Colleges are under constant pressure to scale, reduce labor costs, standardize outcomes, and minimize friction for “customers.” A system in which machines generate coursework and machines evaluate it is not a bug in that model but a feature: it promises efficiency, throughput, and administrative neatness. Human judgment is expensive, slow, and legally risky; AI is fast, consistent, and endlessly patient. Once education is framed as a service to be delivered rather than a formation to be endured, the automated pedagogy loop becomes difficult to dislodge, not because it works educationally, but because it works financially. Breaking the loop would require institutions to reassert values—depth, difficulty, human presence—that resist optimization and cannot be neatly monetized. And that is a hard sell in a system that increasingly rewards anything that looks like learning as long as it can be scaled, automated, and invoiced.

    If colleges allow themselves to slide from places that cultivate intellect into credential factories issuing increasingly fraudulent degrees, their embrace of the automated pedagogy loop may ultimately hasten their collapse rather than secure their future. Degrees derive their value not from the efficiency of their production but from the difficulty and transformation they once signified. When employers, graduate programs, and the public begin to recognize that coursework is written by machines and evaluated by machines, the credential loses its signaling power. What remains is a costly piece of paper detached from demonstrated ability. In capitulating to automation, institutions risk hollowing out the very scarcity that justifies their existence. A university that no longer insists on human thought, struggle, and judgment offers nothing that cannot be replicated more cheaply elsewhere. In that scenario, AI does not merely disrupt higher education—it exposes its emptiness, and markets are ruthless with empty products.

  • How to Resist Academic Nihilism

    Academic Nihilism and Academic Rejuvenation

    Academic Nihilism names the moment when college instructors recognize—often with a sinking feeling—that the conditions students need to thrive are perfectly misaligned with the conditions they actually inhabit. Students need solitude, friction, deep reading and writing, and the slow burn of intellectual curiosity. What they get instead is a reward system that celebrates the surrender of agency to AI machines; peer pressure to eliminate effort; and a hypercompetitive, zero-sum academic culture where survival matters more than understanding. Time scarcity all but forces students to offload thinking to tools that generate pages while quietly draining cognitive stamina. Add years of screen-saturated distraction and a near-total deprivation of deep reading during formative stages, and you end up with students who lack the literacy baseline to engage meaningfully with writing prompts—or even to use AI well. When instructors capitulate to this reality, they cease being teachers in any meaningful sense. They become functionaries who comply with institutional “AI literacy policies,” which increasingly translate to a white-flag admission: we give up. Students submit AI-generated work; instructors “assess” it with AI tools; and the loop closes in a fog of futility. The emptiness of the exchange doesn’t resolve Academic Nihilism—it seals it shut.

    The only alternative is resistance—something closer to Academic Rejuvenation. That resistance begins with a deliberate reintroduction of friction. Instructors must design moments that demand full human presence: oral presentations, performances, and live writing tasks that deny students the luxury of hiding behind a machine. Solitude must be treated as a scarce but essential resource, to be rationed intentionally—sometimes as little as a protected half-hour of in-class writing can feel revolutionary. Curiosity must be reawakened by tethering coursework to the human condition itself. And here the line is bright: if you believe life is a low-stakes, nihilistic affair summed up by a faded 1980s slogan—“Life’s a bitch; then you die”—you are probably in the wrong profession. But if you believe human lives can either wither into Gollumification or rise toward higher purpose, and you are willing to let that belief inform your teaching, then Academic Rejuvenation is still possible. Even in the age of AI machines.

  • Pretending to Be Busy at Work: Hustle Theater

    Hustle Theater

    noun

    In Deep Work, Cal Newport issues a quiet but devastating warning: busyness is often nothing more than productivity in drag. Motion stands in for meaning. The inbox fills, the dashboards glow, the machine hums—and we feel virtuous, even noble, as if all this activity signals progress and moral seriousness. In reality, much of this labor consists of mindlessly feeding tasks to machines and mistaking their output for our own achievement. Busyness, in this sense, is a kind of workplace cosplay—a performance of importance rather than its substance.

    Call it Hustle Theater: a nonstop public display of motion designed to broadcast diligence and relevance. Hustle Theater prizes visibility over value, responsiveness over results. It keeps people feverishly active while sparing them the discomfort of doing work that actually matters. The show is convincing. The performers are exhausted. And the audience—often including the performers themselves—applauds wildly, unaware that nothing of consequence has taken place.

  • Carl Jung’s Bollingen Tower Represents Our Sanctuary for Deep Work

    Bollingen Principle

    noun
    The principle that original, meaningful work requires a deliberately constructed refuge from distraction. Named after Carl Jung’s Bollingen Tower, the Bollingen Principle holds that depth does not emerge from convenience or connectivity, but from environments intentionally designed to protect sustained thought, solitude, and intellectual risk. Such spaces—whether physical, temporal, or psychological—function as sanctuaries where the mind can operate at full depth, free from the pressures of immediacy and performance. The principle rejects the idea that creativity can flourish amid constant interruption, insisting instead that those who seek to do work that matters must first build the conditions that allow thinking itself to breathe.

    ***

    In an age saturated with technological distraction and constant talk of “disruption” and AI-driven upheaval, it is easy to lose sight of one’s personal mission. That mission is a North Star—a purpose that orients work, effort, and flourishing. It cannot be assigned by an employer, an algorithm, or a cultural trend. It must be discovered. As Viktor Frankl argues in Man’s Search for Meaning, you do not choose meaning at will; life chooses it for you, or rather, life discloses meaning to you. The task, then, is attentiveness: to look and listen carefully to one’s particular circumstances, abilities, and obligations in order to discern what life is asking of you.

    Discerning that mission requires depth, not shallowness. Cal Newport’s central claim in Deep Work is that depth is impossible in a state of constant distraction. A meaningful life therefore demands the active rejection of shallow habits and the deliberate cultivation of sustained focus. This often requires solitude—or at minimum, long stretches of the day protected from interruption. Newport points to Carl Jung as a model. When Jung sought to transform psychiatry, he built Bollingen Tower, a retreat designed to preserve his capacity for deep thought. That environment enabled work of such originality and power that it reshaped an entire field.

    Jung’s example reveals two essential conditions for depth: a guiding ideal larger than comfort or instant gratification, and an environment structured to defend attention. To avoid a shallow life and pursue a meaningful one, we must practice the same discipline. We must listen for our own North Star as it emerges from our lives, and then build our own version of Bollingen Tower—physical, temporal, or psychological—so that we can do the work that gives our lives coherence and meaning.

  • Planning Focus Like a Bodybuilder Plans Calories

    Shallow Work Containment
    noun

    A strategy for managing unavoidable low-value tasks by strictly rationing their time and scope, much like the points system used in Weight Watchers. In this model, shallow work—email, scheduling, administrative triage—is not banned, but it is counted, budgeted, and contained within clearly defined limits. Just as Weight Watchers assigns point values to foods to prevent mindless grazing, shallow work containment treats distractions as cognitively “expensive,” forcing the worker to spend them deliberately rather than impulsively. The goal is not moral purity but control: by acknowledging that these tasks add up quickly, containment preserves the majority of cognitive “calories” for deep work, where real progress is made.
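    The points metaphor above can be made concrete with a minimal sketch in Python. This is purely my own illustration of the containment idea, not anything Newport prescribes; the task names, point costs, and daily budget are invented for the example.

    ```python
    # A toy "shallow work budget": each low-value task carries a point cost,
    # deducted from a fixed daily allowance, mirroring the Weight Watchers
    # analogy. All values here are hypothetical.

    DAILY_BUDGET = 10  # invented allowance of shallow-work points per day

    # invented point costs per category of shallow work
    COSTS = {"email": 2, "scheduling": 1, "admin_triage": 3}

    def spend(ledger, task):
        """Deduct a task's cost from the budget; refuse it once the budget is spent."""
        cost = COSTS[task]
        if ledger["remaining"] < cost:
            return False  # budget exhausted: the task waits for tomorrow
        ledger["remaining"] -= cost
        ledger["log"].append(task)
        return True

    ledger = {"remaining": DAILY_BUDGET, "log": []}
    spend(ledger, "email")         # costs 2 points, leaving 8
    spend(ledger, "admin_triage")  # costs 3 points, leaving 5
    ```

    The design choice is the point: shallow work is not forbidden, only priced, so every email session or scheduling errand is a deliberate withdrawal from a finite account rather than mindless grazing.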

    ***

    As both a champion and a practitioner of Deep Work, Cal Newport is a model citizen of Shallow Work Containment. He doesn’t flirt with distraction; he bars it at the door. He has never had a Facebook or Twitter account, and outside of his own blog he avoids social media altogether. He doesn’t wander the web or graze on online articles. For news, he does something that now sounds faintly radical: he reads a physical copy of The Washington Post delivered to his house and listens to NPR. By surrounding himself with a protective moat against distraction invaders, Newport has, over the past decade, published four books, earned a PhD, and generally made a nuisance of himself to the myth that constant connectivity is a prerequisite for relevance.

    Newport treats technology the way serious physical culturists treat food: as something to be managed, not indulged. There is no such thing as “random” consumption. You don’t wake up and see how the day feels. You plan. You prohibit. You decide in advance what gets in and what stays out. Random scrolling is the cognitive equivalent of eating straight from the peanut butter jar. In Newport’s own formulation, his days are built around a protected core of deep work, with the shallow tasks he cannot avoid quarantined into small, contained bursts at the edges of his schedule. Three to four hours a day, five days a week, of uninterrupted, carefully directed concentration—nothing heroic, just disciplined—turns out to be enough to produce serious value. There’s no guesswork here. Newport does the math and follows it. Like any disciplined lifter or dieter, he hits his macros.

  • Modernity Signaling: How Looking Current Makes You Replaceable

    Modernity Signaling

    noun
    The practice of performing relevance through visible engagement with contemporary tools rather than through demonstrable skill or depth of thought. Modernity signaling occurs when individuals adopt platforms, workflows, and technologies not because they improve judgment or output, but because they signal that one is current, adaptable, and aligned with the present moment. The behavior prizes speed, connectivity, and responsiveness as markers of sophistication, while quietly sidelining sustained focus and original thinking as outdated or impractical. In this way, modernity signaling mistakes novelty for progress and technological proximity for competence, leaving its practitioners busy, replaceable, and convinced they are advancing.

    ***

    As Cal Newport makes his case for Deep Work—the kind of sustained, unbroken concentration that withers the moment email, notifications, and office tools start barking for attention—he knows exactly what’s coming. Eye rolls. Scoffing. The charge that this is all terribly quaint, a monkish fantasy for people who don’t understand the modern workplace. “This is the world now,” his critics insist. “Stop pretending we can work without digital tools.” Newport doesn’t flinch. He counters with a colder, more unsettling claim: in an information economy drowning in distraction, deep work will only grow more valuable as it becomes more rare. Scarcity, in this case, is the point.

    To win that argument, Newport has to puncture the spell cast by our tools. He has to persuade people to stop being so easily dazzled by dashboards, platforms, and AI assistants that promise productivity while quietly siphoning attention. These tools don’t make us modern or indispensable; they make us interchangeable. What looks like relevance is often just compliance dressed up in sleek interfaces. The performance has a name: Modernity Signaling—the habit of advertising one’s up-to-dateness through constant digital engagement, regardless of whether any real thinking is happening. Modernity signaling rewards appearance over ability, motion over mastery. When technology becomes a shiny object we can’t stop admiring, it doesn’t just distract us; it blinds us. And in that blindness, we help speed along our own replacement, congratulating ourselves the whole way down.