Author: Jeffrey McMahon

  • Dental Work Without Anesthesia (and With Delusions of Bravery)

    This afternoon my dentist of twenty-two years—practically a family archivist of my molars—announced, with the calm authority of a man who has seen too much enamel decay, “Let’s handle this now before your roots start making public appearances.” Moments later, he was layering composite over two bottom teeth that had clearly lived a full and reckless life.

    No anesthesia. Apparently this was to be a character-building exercise.

    The procedure itself was a test of endurance disguised as routine maintenance. A styrofoam block was wedged into my mouth, prying it open to a degree that felt less medical and more architectural, as if my jaw were being renovated from the inside. Swallowing became a distant memory. Breathing required strategy. Time slowed to a crawl. For a Grade-3 claustrophobe, it was less a dental visit and more a hostage situation with excellent lighting.

    Was I brave? Hardly. Internally, I was negotiating terms of surrender.

    But outwardly—this is important—I looked composed. Even heroic. On my wrist sat my G-Shock Frogman GWF-1000, a hulking instrument of indifference to human weakness, radiating the sort of rugged competence I was very much not feeling. If courage is partly theater, then I delivered a convincing performance.

  • Teaching Without a Map in a Shifting World

    In “Teaching in an American University Is Very Strange Right Now,” Frank Bruni captures a tension that defines the modern classroom: how do you offer students hope without lying to them? His students at Duke University are coming of age in a moment that feels less like a transition and more like a rupture—truth is contested, institutions feel unstable, artificial intelligence is reshaping entire professions, and the political climate leans toward confusion and consolidation of power. Add to that a job market where careers in public policy, government, and nonprofits are shrinking, and the traditional pathways begin to look like dead ends.

    Bruni’s difficulty is not just emotional; it’s epistemological. The ground keeps shifting. The job market no longer behaves like a map you can study and memorize. It behaves like weather—volatile, unpredictable, and indifferent to your plans. Bruni and his colleagues find themselves in an unfamiliar position: experts who no longer trust their own expertise. When the mentors are unsure of the terrain, the act of mentoring starts to feel like guesswork dressed up as guidance.

    And yet, retreating into cynicism would be a dereliction of duty. Bruni insists on offering hope, but not the anesthetized version that avoids discomfort. His version of hope is anchored in reality. He tells his students that survival in this environment will not belong to the most credentialed or the most specialized, but to the most adaptable. The winners will be those who can pivot quickly, read patterns early, and anticipate what’s coming before it arrives. In other words, they must think several moves ahead while the board itself is being rearranged.

    This requires a shift in how students approach their education. The old model—bury yourself in your major, master the material, trust that the system will reward you—was always a partial truth. Now it’s a liability. Depth without awareness is no longer enough. Students need a wide-angle lens, an ongoing scan of the broader landscape: economic shifts, technological disruptions, political currents. The classroom can no longer be a refuge from the world; it has to be a vantage point from which to read it.

    Bruni’s message is unsettling, but it has the virtue of being honest. Hope, in this context, is not the promise that things will work out as planned. It’s the conviction that those who stay alert, flexible, and strategically aware can still find a way forward—even when the path refuses to stay still.

  • The Kryptonite Effect of Screens in Education

    In her Atlantic essay “What Happened After a Teacher Ditched Screens,” the author examines a belief so widely accepted it rarely gets questioned: that more technology automatically improves learning. Dylan Kane, a seventh-grade math teacher, bought into that belief for over a decade. His students worked on Chromebooks, navigating a custom-built math site while monitoring software kept them from drifting into games or distractions. It was a tightly managed digital ecosystem—efficient on paper, persuasive in theory.

    Then Kane pulled the plug.

    This wasn’t a minor adjustment; it was a small act of rebellion. Nearly ninety percent of school districts now issue laptops or tablets, sold on the promise of “personalization”—the idea that technology can tailor instruction to each student’s needs, close learning gaps, and adapt to different cognitive styles. It’s an elegant theory, especially attractive to those whose reputations and revenue depend on merging education with technology.

    But in Kane’s classroom, the theory collapsed under the weight of actual human behavior. Screens didn’t personalize learning; they colonized attention. Students stared at them the way gamblers stare at slot machines—fixed, hypnotized, and detached from the room. Class discussion withered. The teacher’s voice, once the organizing force of the classroom, lost every round to the glowing rectangle. When attention becomes a zero-sum game, the screen doesn’t negotiate. It wins.

    Kane’s frustration deepened when he read Jared Cooney Horvath’s The Digital Delusion, which argues that increased technology use correlates with declining student performance. So Kane ran an experiment: he removed the Chromebooks for a month. What he discovered was not subtle. Students began paying attention again. Participation returned. Assignment completion jumped from 45 to 62 percent. Writing equations by hand—slow, deliberate, mildly inconvenient—forced students to see their own thinking unfold. The inconvenience turned out to be the point. Learning, it seems, benefits from friction.

    I’ve been teaching college writing for over thirty-five years, and I’ve seen my own version of this “kryptonite effect.” Smartphones siphon attention. Laptops become portals to games, sports, and anything but the task at hand. I’ve watched students drift out of the room without leaving their seats. The screen doesn’t just distract; it competes, and it usually wins.

    And yet, my experience isn’t a simple indictment of technology. Between 2018 and 2019, I ran a structure that worked. We met twice a week: one day for lecture and discussion, the other as a writing lab. During lab sessions, students wrote on desktops or their own laptops, working through scaffolded assignments. I read their drafts in real time, helping them revise thesis statements and sharpen arguments. The dynamic shifted. I wasn’t a distant lecturer; I was a coach moving from desk to desk. Students completed work on campus instead of procrastinating at home. Completion rates improved, not because of the machines themselves, but because of how they were used.

    The pandemic ended that model. My courses shifted to a hybrid format—one meeting a week—and the lab disappeared. I’ve been reluctant to surrender precious face-to-face time to silent writing sessions. But I’m beginning to wonder if I’ve been too cautious. If Kane is right about the power of attention, perhaps the most effective use of class time is not more talking, but more doing.

    What Kane’s experiment ultimately reveals is not that technology is useless, but that it is context-dependent. A math classroom, built on sequential problem-solving, may suffer when screens fracture attention. A writing classroom, structured around drafting and revision, may benefit from them under the right conditions. The mistake is not using technology. The mistake is treating it as a universal solution.

    If I went back to teaching two days a week, I wouldn’t hesitate. One day for discussion. One day for writing in a lab. Not because technology is inherently good, but because, in that setting, it serves the work instead of sabotaging it.

  • The Vegan Diet That Actually Behaves

    Most vegan diets chase variety. This one chases something else: predictability. I wanted a plan that supports gut health, delivers about 150 grams of protein, and stays around 2,300 calories—without turning every meal into a digestive gamble. The result is not a celebration of abundance. It’s a system that behaves.

    The guiding idea is simple. Every meal is built from three parts: a stable starch, a low-residue protein, and a measured dose of fiber. The aim is not to flood the gut with “healthy” inputs, but to give it clear, consistent instructions.

    Breakfast is structured but quiet. I start with well-cooked buckwheat groats—soft enough to digest without resistance. Into that goes a scoop of pea-and-rice protein powder, half a banana, a teaspoon of psyllium husk, and a small pour of unsweetened soy milk. It’s not exciting, but it is dependable. The psyllium adds just enough cohesion, the banana binds, and the protein arrives without the usual legume side effects.

    Lunch simplifies things even further. Oatmeal becomes the base—again, in a controlled portion. I add another scoop of protein powder, then rotate between half a banana and a small serving of applesauce. A modest amount of soy milk smooths it out. That’s it. No stacking of proteins, no fiber fireworks. Lunch is designed to send a single, clear signal to the body: digest, don’t negotiate.

    Dinner does the heavy lifting. This is the anchor meal, the one that determines how the next morning unfolds. A plate of white rice and red potatoes forms the foundation—arguably the most reliable pairing for digestive stability. On top of that, I add about six ounces of extra-firm tofu and a side of sautéed zucchini or carrots. Everything is cooked soft. Everything is deliberate. A tablespoon of olive oil finishes the plate, not for indulgence, but for smooth passage.

    If I need something at night, I keep it controlled: half a banana, a tablespoon of peanut butter, and a small glass of soy milk. Enough to take the edge off, not enough to start a second digestive act.

    Across the day, the numbers line up: roughly 2,300 calories, about 150 grams of protein, and a moderate fiber intake that stays in the zone where things hold together instead of falling apart. The real achievement, though, isn’t the macros—it’s the consistency. Meals repeat. Ingredients overlap. The system stabilizes.

    There are rules. Beans and lentils are out as daily staples—not because they’re unhealthy, but because they introduce too much variability. Raw vegetables are unnecessary friction. Fiber is measured, not celebrated. Variety is limited on purpose. This is a diet built on the belief that clarity beats complexity.

    Is it boring? Often. But boredom, in this context, is a kind of luxury. It means nothing is going wrong. It means your body is no longer improvising. It means the system is working.

    Can I sustain my health and muscle on a plant-based diet? What if I feel weak? To be honest, I have two contingency plans: adding a daily scoop of Greek yogurt, and replacing the vegan protein powder with whey. That will be the tentative part of the journey.

  • Why the Word “Stress” Has Outlived Its Usefulness

    The word stress has been talked into exhaustion. It shows up everywhere—therapy sessions, productivity podcasts, corporate memos—until it becomes a kind of verbal white noise. Everything is stressful. Traffic is stressful. Email is stressful. Existence itself is apparently one long panic attack. The result is not clarity but numbness. A word that once pointed to something real now floats, bloated and imprecise, over every inconvenience and calamity alike. It needs to be stripped down, cleaned up, and returned to service.

    Start by dividing what we lazily call “stress” into three distinct experiences.

    First, there is what we might call existential friction—the strain that comes from living a life that actually matters. Sartre described it as getting your hands dirty. It is the tension of responsibility, of choosing action over comfort. Think of Viktor Frankl, who could have escaped a concentration camp but stayed to tend to the suffering around him. To say he was “stressed” is to trivialize the moment. He was engaged in a moral confrontation with evil. The discomfort was not a malfunction; it was the price of meaning. A bodybuilder tears muscle to grow stronger. A moral person strains against life’s conflicts to become more fully human. This is not pathology. It is construction.

    Second, there is narcissistic agitation—the counterfeit version of stress, self-generated and corrosive. This is the anxiety of the addict chasing relief, the restless paranoia of the status-obsessed, the brittle ego that reads every room as a threat. Here the individual is both the engine and the victim of the distress. It is not the friction of purpose but the turbulence of misalignment. To confuse this with existential friction is not just sloppy; it is morally obtuse. One builds character. The other erodes it.

    Finally, there is existential overload—the strain that arrives uninvited and exceeds your capacity to absorb it. This is not heroic and not self-inflicted. It is what happens when life stacks too many weights on the bar at once. Divorce, illness, financial collapse—events that don’t ask for your permission before they rearrange your nervous system. In this state, the body begins to narrate what the mind cannot contain. Appetite disappears. Sleep fractures. Symptoms bloom. There is no lesson neatly packaged inside it, no redemptive arc guaranteed. It is endured, not chosen.

    I think of my brother in 2020. His marriage collapsed. He was suddenly alone during the pandemic, financially strained, disoriented. Then came the diagnosis: Burkitt lymphoma. Two months to live. That is not “stress.” That is existential overload in its purest form. And yet, against those odds, he found a narrow corridor of hope—a CAR T-cell therapy trial at UCSF. He took it. He survived. He is in remission. The word stress does not belong anywhere near that story.

    This is why the word needs to be retired from serious use. It flattens distinctions that matter. It places the inconvenience of a crowded inbox on the same plane as a confrontation with mortality. Better to replace it with terms that carry weight: existential friction, narcissistic agitation, existential overload. Precision is not pedantry; it is navigation. If you’re trying to find your way out of the dark, you don’t need a vague feeling. You need a compass that actually points somewhere.

  • Social Capital and the Art of Not Being Chosen

    Not all rejection deserves to be filed under the same heading. Romantic rejection—the operatic kind—arrives with violins, moonlight, and a certain built-in alibi. You fall hard, you overestimate your odds, and when the other person declines to co-star in your fantasy, you can console yourself with the obvious: the whole thing was inflated from the start. You were auditioning for a role that rarely gets cast.

    But the quieter rejections—the ones that occur under fluorescent lighting and polite conversation—cut deeper. They lack drama but not consequence. In fact, they feel more diagnostic, as if they’ve been administered by a committee.

    Consider friendship rejection. You meet someone, exchange a few promising signals, and then—nothing. Or worse, a friendship that once had momentum slows, then stalls, then disappears entirely. This is not a stranger declining your advances; this is someone who had enough data to make a decision and chose, calmly, not to proceed. The verdict feels less like bad luck and more like a character assessment.

    Then there is colleague rejection, which operates with corporate efficiency. Alliances form. Cliques crystallize. You are not invited into the warm circle of inside jokes and informal influence. You do your work—flawlessly, even—but without the buoyancy that comes from being wanted. You become competent but peripheral, visible but not included. This is where you begin to suspect you suffer from what might be called Social Capital Deficit Syndrome: a condition marked by a shortage of the invisible currency that makes social and professional life glide instead of grind.

    And here is the uncomfortable truth: social capital is not a luxury; it is infrastructure. Without it, you are left to interpret every silence, every omission, every polite deflection. The temptation is to diagnose yourself—too blunt, too quiet, too something—and then to launch a campaign of correction. This is where things get worse. Self-blame mutates into paranoia. Self-improvement becomes performance. You start sanding down your edges in public, hoping to emerge as a more acceptable version of yourself, and end up as a less convincing one.

    At some point, a harsher but cleaner realization presents itself: your personality comes with a certain gravitational pull, and not everyone will orbit it. No amount of forcing will change that. Trying to wedge yourself into every available opening only advertises the mismatch.

    The more durable response is less theatrical and more disciplined. Accept that people respond rather than decide. They are not conducting formal evaluations of your worth; they are reacting to chemistry, timing, and preference—most of which lie outside your control. This does not excuse cruelty, but it does eliminate the fantasy that everyone owes you affinity.

    So you take the higher road—not as a moral performance, but as a practical strategy. You remain courteous when ignored, steady when excluded, and restrained when slighted. You refuse to become the bitter man who proves his critics right simply by reacting exactly as expected.

    This runs counter to a culture that treats every problem as fixable with the right toolkit. You can, of course, pursue therapy, charisma workshops, confidence training—the whole catalog of self-upgrades. Some of it may help. Some of it may turn you into a louder version of the same problem. There is a fine line between improvement and overcorrection, and many people sprint past it.

    What remains, then, is a quieter ambition: to live without rancor. To accept your limits without turning them into grievances. To maintain a sense of integrity that does not depend on applause. The chip on your shoulder may feel like armor, but it is really a signal—confirmation to others that their instincts about you were correct. Let it go.

    You may lose the small comforts of self-pity. In return, you gain something sturdier: a life not governed by who did or did not choose you.

  • The Lazy Tax in the Kitchen

    A few years ago, my wife and I attempted a moral upgrade. We bought top-tier stainless steel pans—clean, durable, virtuous. The kind of cookware that suggests you’ve finally grown up. What we actually got was a daily demonstration of failure. Meat clung to the surface like it had signed a lease. Eggs fused themselves into modern art. Dinner became a split decision: half of it made it to our plates, the other half calcified into a crust we had to chisel off like archaeologists of our own incompetence.

    So we retreated to ceramic nonstick—the promise of safety without the suffering. And to be fair, it worked. Eggs slid. Meat behaved. For a few months, we lived in a frictionless utopia. Then the decline began. The surface lost its glide, the cleanup grew less effortless, and by the one-year mark, the pan looked like it had survived a minor war. We replaced it. Then replaced it again. Three pans in three years. Smooth sailing followed by predictable decay.

    Now I’m floating a compromise: carbon steel for meat, ceramic reserved strictly for eggs. On paper, it’s elegant. Carbon steel rewards discipline—season it, preheat it, clean it promptly—and in return, it gives you something close to permanence. But the fine print matters. Acidic sauces erode seasoning. One careless move and the pan reverts to its old habits, clinging and punishing. You can follow the rules and still lose.

    If I’m honest, I suspect this experiment will end the same way the others did: with a sigh and another $100 ceramic pan purchased in quiet resignation. I’ve started calling this cycle the Lazy Tax—not because we’re lazy in the crude sense, but because we refuse to turn dinner into a technical exercise. We don’t want to manage a pan like it’s a piece of lab equipment. And yet the alternative is paying annually for convenience that quietly expires.

    That’s the real tension. If something feels like a job and only delivers a marginal improvement, you won’t sustain it. You default to ease. But ease has a cost. It trims your skill set, narrows your tolerance for friction, and charges your credit card for the privilege. In the end, you’re not just buying pans—you’re renting competence.

  • When Time Stops Asking and Starts Telling

    At sixty-four years and four months, you thought you were still wading—water warm, footing reliable, the shoreline within easy reach. Then, without warning, the bottom vanished. One step of confidence, followed by that cold, immediate truth: you are no longer in control of the depth. The drop-off doesn’t negotiate. It doesn’t slope politely. It takes you.

    This particular plunge announced itself through something as mundane—and as revealing—as a watch. For decades, you wore mechanical divers with analog dials, small, intricate machines that whispered of heritage, craft, and a certain gentlemanly patience. That language no longer translated. You didn’t want poetry. You wanted coordinates.

    So you defected. You strapped on Tough Solar, Multiband-6 atomic G-Shocks—watches that don’t ask what time it feels like but what time it is, down to the second, corrected nightly by a signal from a tower you will never see. This was not a style change. It was an Atomic Conversion Event: the moment when nostalgia is exposed as a luxury item and precision becomes a survival tool. Time ceased to be something you admired. It became something you obeyed.

    You found yourself thinking of that Robinson family from Lost in Space—before stepping onto an alien surface, they consulted their robot, which scanned the air and issued a verdict: breathable or lethal. You needed your own robot now, but smaller, quieter, strapped to your wrist. Not to tell you whether the atmosphere would kill you, but whether you were wasting it.

    Because the deeper realization was not horological. It was existential. You no longer had the bandwidth for drift. “Fiddlefaddling,” once an acceptable pastime, now read like malpractice. Clarity was no longer optional; it was oxygen. You had to extract meaning from the noise and live with an alignment that would have bored your younger self. The watch change was merely the visible symptom of an internal regime shift.

    And this was not your first encounter with the abyss. A decade earlier, you performed a similar surgery on your life: you quit sports. Not gradually, not ceremoniously—just stopped. You recognized the structure for what it was: three-hour games followed by hours of commentary, followed by meta-commentary, followed by the analysis of the analysis. An infinite regress disguised as entertainment. You didn’t taper off. You cauterized the habit. Torch, not scalpel.

    That’s when the pattern revealed itself. Life is not a smooth shoreline; it is a series of drop-offs. Each one demands a new posture, a new set of tools, a new tolerance for truth. You don’t get to choose whether they arrive—only whether you adapt before you drown.

    There will be more. Of course there will. The only sensible response is not optimism but readiness.

    Buckle up.

  • The Night the Mechanical Diver Stayed Home

    Last night I escorted my family and in-laws to a breezy bistro in Redondo Beach, the kind of place where the ocean air does half the marketing. We sat on the patio while a two-man cover band—guitar and bass, faces cured by sun and time—worked their way through the canon: Gordon Lightfoot, Jim Croce, James Taylor. Their voices had the texture of driftwood. The songs arrived like postcards from a quieter century.

    For twenty years, a restaurant meant ceremony. I would strap on an expensive mechanical diver—the horological equivalent of cufflinks—and let it glint under low lighting as if I were auditioning for a role called “Man of Taste.” Five weeks ago, that instinct died without a funeral. In its place: a $110 Casio G-Shock GW-7900. No romance, no pretense, just a blunt instrument that tells time with the indifference of a wall clock. I wore it and felt, not diminished, but strangely settled. Our server had on a Casio Pro Trek. We exchanged a nod—the quiet recognition of two men who had defected from the same aesthetic regime.

    Two weeks ago, I sold off a pair of mechanical divers. The absence registered as silence, not loss. Five weeks ago, I bought my first Tough Solar, Multiband-6 G-Shock—a Casio G-Shock Frogman GWF-1000. It feels like I’ve owned it for a decade. Time has warped, stretched, lost its usual proportions. My working theory is this: when an obsession mutates—when it takes a hard, unexpected turn—the brain lingers over the wreckage and the new terrain at once. Every moment gets over-processed, as if your mind is trying to reconcile two incompatible identities. The result is temporal inflation. Five weeks feel like ten years. Meanwhile, the watches I once coveted sit in their box like artifacts from a civilization I can’t quite remember belonging to.

    I’ve made a few videos documenting this conversion. To my mild alarm, a handful of people have followed suit—buying the GW-7900, aiming their watches toward the Fort Collins signal tower like amateur astronomers chasing a frequency instead of a star. It’s absurd, and yet there it is: evidence that I may have drifted, however briefly, into the low orbit of influence. Not authority. Not expertise. Influence—the most accidental and least deserved of modern currencies.

    The question now hovers: is this a phase or a verdict? Will some future mood—call it nostalgia, call it vanity—dust me with longing and send me back to my mechanical divers? Or have I crossed a line I can’t uncross, sealed inside a G-Shock logic that values precision over poetry? I don’t know. The future, like the tide, refuses to take requests.

    What I do know is this: today I’ll need to punish myself with extra work in the garage gym. Last night I demolished a crispy chicken sandwich on brioche while listening to “If You Could Read My Mind,” and the song’s quiet sorrow did nothing to slow me down. If anything, it provided a soundtrack for excess—the softest possible music for a thoroughly unrestrained appetite.

  • Santa by Accident, Employee by Design

    Last night I dreamed that some friends and I staged an ad hoc production so slick it deserved syndication. I was invited to a holiday party—one of those corporate affairs where irony does the heavy lifting—and, on a whim, I put on the Santa suit for a toy company that merchandised its own cartoon characters. It was meant to be a joke—something to be forgotten along with the eggnog.

    Instead, someone filmed it.

    The footage went viral. What began as a throwaway bit turned into an annual television event with ratings that could humble prime time. Every December, there I was—jovial, booming, absurd—beamed into living rooms as if I had been engineered for it. Checks arrived with the regularity of a season: generous, unearned, almost accusatory. 

    To capitalize on the accident, the company staged yearly reunions—cast gatherings dressed up as nostalgia, broadcast to a nation that now insisted we mattered. More ratings. More money. More of me, whether I intended it or not.

    At first, I drifted into these reunions like a tourist in my own life—late, amused, faintly embarrassed. Then the terms clarified. I wasn’t a guest. I was talent. I wasn’t attending; I was reporting for duty. Somewhere in the fine print of success, I had become an employee of an entity I never remembered joining. The arrangement produced a tidy moral shrug: the checks fed my family, so what right did I have to object? Freedom had quietly converted itself into obligation, and the conversion rate was excellent.

    There was, however, a fracture line running through the whole enterprise. By sheer accident, I had chosen Santa—the apex role, the gravitational center. My friends had chosen elves: diligent, decorative, forgettable. Hierarchy, once introduced, does its work without permission. One friend stopped speaking to me altogether. When he finally did, it was not to reconcile but to issue a verdict. I was too thick to see what had happened to him, he said. Years of playing the lesser figure had hollowed him out. The easy talker was gone; in his place stood a sullen, rationed version of a man. We were no longer friends. I was no longer welcome to pretend otherwise.

    Others were kinder, even grateful. They insisted my Santa had ignited the whole spectacle—that without it, there would have been no show, no checks, no ritual of reunion. They thanked me as if I had designed the machine rather than stumbled into its engine.

    But gratitude doesn’t cancel damage; it merely coexists with it. The money was real. The applause was real. So was the loss. Watching a friend calcify into bitterness has a way of stripping glamour down to its wiring. Fame, even the accidental kind, doesn’t just elevate. It arranges people. It assigns altitude. And someone, inevitably, is left breathing thinner air.