Category: technology

  • My Doppelganger in Dark Sweats

    Last night I dreamed that a baby had been abandoned in the flower garden outside my San Francisco apartment. His thin wail rose above the city hum, but no one seemed to hear it but me. The world went on—cars passing, neighbors coming and going—while I alone stood transfixed by that cry. I lifted the baby from the dirt, his skin warm and impossibly soft, and held him against my chest. Standing at the threshold of the apartment I rented with my wife and our stray orange cat, I prayed for holiness and wept, as though the infant had been dropped from heaven for me alone to fail or redeem.

    Inside, the apartment felt like an expensive tomb—luxurious, dim, deliberately shadowed, as if light itself were rationed. I fed the child and watched him feed, marveled at the smallness of his breaths. When his parents arrived, both scientists, I confronted them. They were calm, rational, and convinced me of their legitimacy with clinical precision. Their excuse was airtight, their affect detached, and in the end, I surrendered the baby, though my faith in their explanation felt paper-thin.

    Then the parents and the baby were gone. At this point, our role inside the apartment was clear: my wife and I were educators using the apartment to host seminars on DNA and algorithms for college students. The air smelled faintly of coffee and ozone. During one of these sessions, the true apartment owner appeared: my thirty-year-old doppelgänger, tall, lean, dressed in the sleek anonymity of wealth—dark designer sweats, minimalist sneakers. He admired the apartment I had borrowed as though validating his own taste: the kitchen gadgets gleaming like relics, the food neatly arranged, the DVDs alphabetized. His presence was eerie—a reflection of my own mind rendered in sharper resolution. We talked about the future buyer of the apartment, another iteration of us—older, familiar, running on the same mysterious algorithm encoded in our shared DNA.

    When the lecture ended, my wife and I returned the keys to my younger self and walked hand in hand along the apartment’s tennis courts. The sky had the bruised hue of evening. I told her that everything—the baby, the double, the science lectures—had overwhelmed me. I broke down, crying again for the purity I had felt when I prayed over the abandoned child. That moment at the doorstep remained the still point of the dream: holiness in the act of holding something utterly helpless, something untouched by algorithm or ownership.

  • Will Online Education Expose the Class Divide?

    I began teaching online composition in March 2020, when the world suddenly went remote. Like everyone else, I adapted out of necessity, not preference. Since then, I’ve taught both online and face-to-face courses, and the contrast has been eye-opening. I never realized how physically demanding in-person teaching was until I experienced the frictionless ease of the online classroom. Behind the Canvas wall, I am a disembodied voice, orchestrating discussion like the Wizard of Oz. In person, I am on stage—reacting, performing, fielding energy and questions in real time. It is exhilarating and exhausting, proof that teaching in the flesh demands more than intellect; it requires stamina.

    Today, I discussed this with a friend and colleague nearing seventy, a man who has been teaching full-time for nearly forty years. Despite the fatigue of in-person instruction, he refuses to teach online. His reasoning is both moral and practical. He doesn’t like the lower pass and retention rates of online classes, but his deeper concern is social. “The more we move online,” he told me, “the worse the class divide gets. Only rich students will take face-to-face classes and get a real education. Poor students—working long hours and pinching gas money—will settle for online. Don’t you see, McMahon? It’s an equity issue.”

    He had a point. “So what you’re saying,” I replied, “is that the wealthy can afford genuine engagement—real classrooms, real conversation—while online education offers a simulation of that experience for everyone else.” I paused, thinking about my own students. “But it’s not just an equity issue,” I added. “It’s an engagement issue. We talk endlessly about ‘student engagement’ in online learning, but that word is often misplaced. Many students choose online classes precisely to disengage. They’re working parents, caretakers, exhausted employees. They don’t want a full immersion—they want survival. They want the credential, not the communion.”

    Later that morning, I brought this conversation to my freshman composition class. When I asked if they wanted “student engagement” in their online courses, they laughed. “Hell no,” one said. “It’s like traffic school—you just get through it.” Another, a bright fire science major, confessed that after eight weeks of an online class, she’d learned “absolutely nothing.” Their expectations were low, and they knew it. Online education, for them, was not a journey of discovery but an obstacle course—something endured, not experienced. Still, as someone who teaches writing online, I can’t accept that entirely. I want my courses to be navigable and meaningful—to raise enduring questions that linger beyond the semester. Like Dorothy on the yellow brick road, I want students to follow a clear path, one step at a time, until they reach their own version of Oz. And while I know online learning will never replicate the immediacy of face-to-face teaching, I don’t think it should. Each has its own logic, its own measure of success. Forcing one to imitate the other would only flatten them both.

  • AI—Superpower for Learning or NPC Machine?

    Essay Assignment:

    Background

    Students describe AI as a superpower for speed and convenience that leaves them with glossy, identical prose—clean, correct, and strangely vacant. Many say it “talks a lot without saying much,” flattens tone into AI-speak, and numbs them with sameness. The more they rely on it, the less they think; laziness becomes a habit, creativity atrophies, and personal voice disappears. Summaries replace books, notes replace listening, and polished drafts replace real inquiry. The result feels dehumanizing: work that reads like it was written by everyone and no one.

    Students also report a degradation in their education. Higher learning has become boring. Speed, making deadlines, and convenience are its key features, and the AI arms race pushes reluctant students to use AI just to keep pace. 

    The temptation to use AI because all the other students are using it rises with fatigue—copy-paste now, promise depth later—and “later” never arrives. Some call this “cognitive debt”: quick wins today, poorer thinking tomorrow. 

    Some students confess that using AI makes them feel like Non-Player Characters in a video game. They’re not growing in their education; they have succumbed to apathy and cynicism about it.

    Other students admit that AI has stolen their attention and afflicted them with cognitive debt: the mental deficit we accumulate when we outsource core mental functions—attention, critical reasoning, creativity—to digital systems that promise efficiency but erode self-reliance. Like financial debt, it begins as a convenience—offloading memory, calculation, navigation, or decision-making to apps and algorithms—but over time it exacts interest in the form of diminished focus, weakened recall, and blunted problem-solving. The result is a paradox: as our tools grow “smarter,” we grow more dependent on them to compensate for the very skills they dull. In short, cognitive debt is the quiet cost of convenience: each time we let technology think for us, we lose a little of our capacity to think for ourselves.

    Yet the picture isn’t purely bleak. Several students say AI can be a lifeline: a steady partner for collaborating, brainstorming, organizing, or language support—especially for non-native speakers—when it’s used as a tutor rather than a ghostwriter. 

    When used deliberately rather than passively, AI writing tools can sharpen critical thinking and creativity by acting as intellectual sparring partners. They generate ideas, perspectives, and counterarguments that challenge students to clarify their own reasoning instead of settling for first thoughts. Rather than accepting the AI’s output, discerning writers critique, refine, and reshape it—an exercise in metacognition that strengthens their analytical muscles. The process becomes less about outsourcing thought and more about editing thought—transforming AI from a shortcut into a mirror that reflects the quality, logic, and originality of one’s own mind.

    When used correctly, AI jump-starts drafts and levels the playing field; leaned on heavily, it erases voice, short-circuits struggle, and replaces learning with mindless convenience.

    Question You Are Addressing in Your Essay

    But can AI be used effectively, or does our interaction with it, like any other product in the attention economy, reveal that it is designed to sink its talons into us and colonize our brains so that we become less like people with self-agency and more like Non-Player Characters whose free will has been taken over by the machines?

    Writing Prompt

    Write a 1,700-word argumentative essay that answers the above question. Be sure to have a counterargument and rebuttal section. Use four credible sources to support your claim. 

    Suggested Outline

    Paragraph 1: Write a 300-word personal reflection about the way you or someone you know uses AI effectively. Show how the person’s engagement with AI resulted in sharper critical thinking and creativity. Give a specific example of the project that revealed this process. 

    Paragraph 2: Write a 300-word personal reflection about the way you or someone you know abuses AI in a way that trades critical thinking for convenience. How has AI changed this person’s brain and approach to education? Write a detailed narrative that dramatizes these changes.

    Paragraph 3: Write a thesis with 4 mapping components that will point to the topics in your supporting paragraphs.  

    Paragraphs 4-7: Your supporting paragraphs.

    Paragraphs 8 and 9: Your counterargument and rebuttal paragraphs.

    Paragraph 10: Your conclusion, a dramatic restatement of your thesis or a reiteration of a striking image you created in your introduction. 

    Your final page: MLA Works Cited with a minimum of 4 sources. 

  • 30 Student Responses to the Question “What Effect Does Using AI Have on Us?”

    Prompt:

    Begin your essay with a 400-word personal reflection. This can be about your own experience or that of someone you interview. Focus on how using AI writing tools—like ChatGPT—has changed your habits when it comes to thinking, writing, or getting work done.

    From there, explore the deeper implications of relying on this technology. What happens when AI makes everything faster and easier? Does that convenience come at a cost? Reflect on the way AI can lull us into mental passivity—how it tempts us to hand over the hard work of thinking in exchange for quick results.

    Ask yourself: What kind of writing does AI actually produce? What are its strengths? Where does it fall short? And more importantly, what effect does using AI have on us? Does it sharpen our thinking, or make us more dependent? Do we risk becoming less original, less engaged—more like passive consumers of technology than active creators? As this process continues, are we becoming Non-Player Characters instead of humans with self-agency? Explain. 

    Finally, consider the trade-offs. Are we gaining a tool, or giving something up? Are we turning into characters in a Black Mirror episode—so enamored with convenience that we forget what it means to do the work ourselves? Use concrete examples and honest reflection to dig into the tension between human effort and technological assistance.

    Student Responses to Using AI Tools for Writing and Education:

    1. “I am impressed with the speed and convenience but the final product is overly polished and hollow.”
    2. “I am amazed by the speed of production but all the sentences look the same. Honestly, it’s numbing after a while.”
    3. “The writing is frustrating because it talks a lot without saying much.”
    4. “I don’t have to think as much,” “I save time not having to think,” and “I get used to this laziness.”
    5.  “AI writes better than I do, but it doesn’t have my unique voice.”
    6. “AI is like a steady writing partner, always there to help me and bounce off ideas, but lately I realize the more I depend on it, the less I challenge myself to think critically.”
    7. “Thanks to AI, I stopped reading books. Now I just get summaries of books. Now I get the information, but I no longer have a deep understanding.”
    8. “AI helps me take notes and organize ideas but it doesn’t help me truly listen, understand someone’s emotions, show empathy, or deal with uncertainty.”
    9. “AI writing is smooth and structured, but people aren’t. Real thought and emotions are messy. That’s where growth happens.”
    10. “When I’m tired, AI tempts me to just copy and paste it, and the more I use it in this manner, the stronger the temptation becomes.”
    11. “AI makes things really easy for me, but then I ask myself, ‘Am I really learning?’”
    12. “What started out as magical now has become woven into the fabric of daily life. Education has become boring.”
    13. “AI is a production superpower the way it inspires and organizes ideas, but I find over time I become more lazy, less creative, and rely on AI way too much.”
    14. “AI degrades the way we write and think. I can tell when something is written in AI-speak, without real human tone, and the whole experience is dehumanizing.”
    15. “I love AI because it saved me. I am not a native English speaker, so I rely on AI to help with my English. It’s like having a reliable tutor always by my side. But over time, I have become lazy and don’t have the same critical thinking I used to have. I see myself turning into an NPC.”
    16. “I have to use AI because the other students are using it. I should have the same advantages they have. But education has become less creative and more boring. It’s all about ease and convenience now.”
    17. “I used to love AI because it made me confident and motivated me to get my assignments in on my time. But over time, I lost my voice. Now everything is written with an AI voice.”
    18. “The more I use AI, the less I think things through on my own. I cut off my own thinking and just ask ChatGPT. It’s fast, but it kills creativity.”
    19. “When faced with a writing assignment and a blank mind, I would start things with ChatGPT, and it got things going. It wasn’t perfect, but I had something to start with, and I found this comforting. But as I got more confident with ChatGPT, I became less and less engaged with the education process. My default became convenience and nothing more.”
    20. “AI writing is so common, we don’t even ask if the writing is real anymore. No one cares. AI has made us all apathetic. We are NPCs in a video game.”
    21. “AI is a great tool because it helps everyone regardless of how much money we have, but it kills creativity and individuality. We’ve lost the pleasure of education. AI has become a mirror of our own superficial existence.”
    22. “When I first discovered AI to do my writing, I felt I had hit the jackpot, but then after taking so many shortcuts, I lost the love for what I was doing.”
    23. “It’s stressful to see a cursor blinking on a blank page, but thanks to AI, I can get something off and running quickly. The problem is that the words are clean and correct, but also generic. There is no depth to human emotion.”
    24. “I’ve been using AI since high school. A lot of its writing is useless. It doesn’t make sense. It’s inaccurate. It’s poorly written. It’s dehumanizing.”
    25. “AI is basically Google on steroids. I used to dread writing, but AI has pushed me to get my work done. The writing is polished but too perfect to be human writing. The biggest danger is that humans become too reliant on it.”
    26. “I barely use AI. It makes school trivial. It’s just another social media disease like TikTok, these streaming platforms that kill our attention spans and turn us into zombies.”
    27. “AI first felt like having the cheat code to get through school. But then I realized it puts us into a cognitive debt. We lose our tenacity and critical thinking.”
    28. “I am a mother and an avid reader, and I absolutely refuse to use AI for any purpose. AI can’t replace real writing, reading, or journaling. AI is a desecration of education and personal growth.”
    29. “At first, I used AI to get ideas, but over time I realized I was no longer thinking. I wasn’t struggling to come up with what I really thought or what I really wanted to argue about. AI silenced who I really was.”
    30. “Using AI to do the heavy lifting doesn’t sit right with me, so I programmed my AI to tutor and guide me through studying, rather than using it as a crutch by providing prompts and tools to help me understand assignments. While my experience with AI has shown me its full capabilities, I’ve also learned that too much of it can ruin the entire experience, in this case, the learning experience.”

  • The Temu-ization of Everything—Again

    I’m reading Cory Doctorow’s freshly minted Enshittification. Early on, he revisits Facebook circa 2010: the honey pot that lured billions before curdling into a slurry of compulsion loops, conspiracy gristle, and industrial-scale data mining. It’s sharp, it’s punchy—and it gave me déjà vu. Then my stomach dropped: I like the coinage, I like the thesis that we’re living through the Enshittocene, but the insights feel old. Jaron Lanier mapped a lot of this terrain eight years ago in Ten Arguments for Deleting Your Social Media Accounts Right Now, a book I’ve taught over the last seven years.

    Doctorow’s Amazon chapter triggers the same shrug. The platform seduces us with convenience, tightens its talons, and gradually morphs from glossy marketplace into Temu-adjacent bazaar. True, and thoroughly litigated across a thousand essays and think pieces. We’ve been warned about the house always winning; we don’t need another tour of the casino floor.

    What I wanted—and didn’t get—was a deeper dive into the anthropology of the rot. Black Mirror’s “Nosedive” or “Joan Is Awful” doesn’t just wag a finger at platforms; it autopsies the psyche and the systems. New Yorker writer Kyle Chayka nails the gap: Enshittification is “pointed and efficient,” but reads like “professional blogging extended for three-hundred-plus pages,” leaving you hungry for a larger cultural x-ray that goes beyond the usual suspects.

    To be fair, packaging a messy discourse into one memorable term matters; not everyone read Lanier or binged Brooker. Doctorow’s snark has its uses. A clean label can move an idea from seminar rooms to dinner tables. But once you’ve named the disease, the next move isn’t to repeat symptoms; it’s to map vectors, power centers, and countermeasures with fresh cases outside the Big Tech pentagon.

    So yes: I love the word. But the book left me underwhelmed. Doctorow has given us the bumper sticker; I’m still waiting for the field manual. The Enshittocene doesn’t need another catalog of platform sins—it needs a blueprint that shows how to break the flywheels, where policy and design can bite, and why our appetites keep refilling the trough. Name the era, sure. Now show us how to survive it—and, if we’re lucky, how to end it.

  • State of the Misalignment Situation

    I had hoped my blog, Cinemorphosis, would feed my video essays—serve as a compost heap of half-baked thoughts that could later bloom into something cinematic. Instead, the blog has swallowed the energy that once went into my videos. What was meant to be a support system has become a rival ecosystem. The crossover I imagined—the blog fueling the videos and the videos enriching the blog—never happened. It turns out writing and filming come from different parts of the brain, and those parts refuse to share the same neural conference room.

    Friends say, “Don’t sweat it, McMahon. Just lean into the blog and let the videos go.” Easy advice for people who aren’t haunted by the specter of irrelevance. I can’t shake the feeling that the video essays keep me sharper—more visible, more alive. The blog satisfies my mind; the camera keeps me from turning into dust.

    Sam Harris once said he can spend five years writing a book, agonizing over edits and the publishing gauntlet, only to reach a few thousand readers—if he’s lucky. Meanwhile, a one-hour podcast can reach millions overnight, and snippets of it go viral before the author’s espresso cools. That line haunts me. The medium matters. The way we reach people has become part of the message.

    I see the same logic in my own small way. A blog post I’m proud of might earn a few dozen engagements. A decent video essay? Thousands of views, maybe more. But numbers only tell part of the story. The real draw is the vitality the videos demand—something performative, almost athletic. When I’m on camera, I feel like I’m “getting my reps in,” keeping mentally limber. The blog is therapy; the videos are training.

    Still, there’s a fine line between vitality and vanity. Part of me believes the videos keep me youthful, engaged, even relevant. Another part suspects it’s all just a resistance workout against mortality. Staying fit is one thing; refusing to age gracefully is another. Desperation doesn’t wear well on men over sixty, even under good lighting.

    So maybe writing suits me better now. Maybe the written word is the right pace for a man learning to accept that his eyesight, patience, and tech literacy are all in slow retreat. Maybe I should only return to video when I have something worth saying—something that isn’t just a performance of endurance.

    Which brings me to the real question: what do I still have to contribute?

    For over a decade, my YouTube channel orbited around my watch obsession. That obsession gradually narrowed until it became monastic—just diver watches, all on straps. I convinced myself that a collection larger than seven would doom me to spiritual ruin. I also stopped flipping watches like a Wall Street day trader, deciding it was bad for my mental health. That slowdown siphoned the manic energy that used to fuel my videos. The creative rush didn’t vanish—it simply rerouted into blog posts about my newest fixation: alignment. Or more precisely, misalignment.

    Because if I’m honest, I feel increasingly out of sync with the modern world. I adapt to new technology at the pace of continental drift. TikTok bewilders me. Smartphones offend my thumbs. Driving at night now feels like a scene from Apocalypse Now. My relevance, visibility, and patience are fading in a culture that worships youth and touchscreens.

    My anxieties about self-worth and mortality are now on the front burner, while watch collecting—the “Watch Potency Principle,” the “wrist-rotation anxiety”—has been moved to the back burner, where it simmers down to a lukewarm stew.

    To illustrate my current state: two weeks ago, I bought a new LG OLED TV, which was fine—until I broke two Samsungs in one day trying to move them. I manhandled the first 55-inch like it was a kettlebell, frying half its pixels in a single jerk. Then I jammed my thumb straight through the second screen while relocating it from my daughter’s room. My wife, the household adult, had to carry the new Roku replacement into our bedroom as I stood there looking like a Neanderthal who’d just discovered electricity—and promptly electrocuted himself.

    My war with technology didn’t end there. The new garage door opener came with instructions written in a dialect of cruel mockery. The installer vanished without explaining how to sync it with my phone, so my wife once again had to step in and figure it out. Now I open the garage door through an app, and every time I hear the alert that the door is moving, I step back in awe—half-terrified, half-mesmerized—like a caveman who’s just invented fire.

    I feel both too old for this world and too infantile to function in it. A man-baby marveling at his gadgets, bewildered by his own house. Think about that. My house has become a museum for technology of the future while I wander through it like a mesmerized tourist. My mouth is agape, and my daughters say to me, “Relax, Dad, this is our house.” I respond by saying, “No, it’s not. It’s a museum of strange and wonderful things that I don’t know how to use.”

    These are the moments that give me content for my blog Cinemorphosis. I post almost daily, while it takes me weeks to metabolize these experiences into something coherent enough for a video essay. Writing helps me think; filming helps me pretend I’m still current.

    So that’s my current state of affairs. This channel used to be State of the Watch Collection. Now it’s more like State of the Man Who Can’t Sync His Garage Door Opener.

  • The Eight Ages of –ification

    From Conformification to Enshittification: how every decade found a fresh way to ruin itself.


    The Age of Decline, Accelerated

    In Enshittification, Cory Doctorow argues that our decline isn’t gradual—it’s accelerating. Everything is turning to crap simultaneously, like civilization performing a synchronized swan dive into the sewer.

    The book’s subtitle, Why Everything Suddenly Got Worse and What to Do About It, suggests that degradation is now both universal and, somehow, fixable.

    Doctorow isn’t the first prophet to glimpse the digital abyss. Jaron Lanier, Jonathan Haidt, and other cultural Cassandras have long warned about the stupidification that comes from living inside the algorithmic aquarium. We swim in the same recycled sludge of dopamine and outrage, growing ever duller while congratulating ourselves on being “connected.”

    This numbness—the ethical anesthesia of the online age—makes us tolerate more crappiness from our corporate overlords. As the platforms enshittify, we invent our own little coping rituals. Some of us chant words with –ion suffixes as if they were incantations, linguistic ASMR to soothe our digital despair.

    When I saw Ozempic and ChatGPT promising frictionless perfection—weight loss without effort, prose without struggle—I coined Ozempification: the blissful surrender of self-agency to the cult of convenience.

    Now there’s an entire liturgy of –ifications, each describing a new layer of rot:


    • Enshittification — Doctorow’s coinage for the systematic decay of platforms that once worked.
    • Crapification / Encrappification — The transformation of quality into garbage in the name of efficiency.
    • Gamification — Turning life into a perpetual contest of meaningless points and dopamine rewards.
    • Attentionification — Reducing every act of expression to a plea for clicks.
    • Misinformationfication — When truth becomes a casualty of virality.
    • Ozempification — Replacing effort with optimization until we resemble our own avatars.
    • Stupidification — The great numbing: scrolling ourselves into idiocy while our neurons beg for mercy.

    But the crown jewel of this lexicon remains Enshittification—a diagnosis so precise that the American Dialect Society crowned it Word of the Year for 2023.

    Still, I’d like to push back on Doctorow’s suggestion that our current malaise is unique. Yes, technology accelerates decay, but each era has had its own pathology—its signature form of cultural rot. We’ve been creatively self-destructing for decades.

    So, let’s place Enshittification in historical context. Behold The Eight Ages of –ification: a timeline of civilization’s greatest hits in decline.


    1950s — Conformification

    The age of white fences and beige minds. America sold sameness as safety. Individuality was ironed flat, and television became the nation’s priest. Conformification is the fantasy that security comes from imitation—a tranquilized suburbia of identical dreams in identical ranch homes.


    1960s — Psychedelification

    When rebellion became transcendence through chemistry. Psychedelification was the belief that consciousness expansion could topple empires, if only the colors were bright enough. The result: self-absorption in tie-dye and the illusion that enlightenment could be mass-produced.


    1970s — Lustification

    A Freudian carnival of polyester and pelvic thrusts. From Deep Throat to Studio 54, desire was liberation and the body was both altar and marketplace. Lustification crowned pleasure as the last remaining ideology.


    1980s — Greedification

    When morality was replaced by market share. The decade baptized ambition in champagne and cocaine. Greedification is the conviction that money cleanses sin and that a Rolex can double as a rosary.


    1990s — Ironification

    The decade of smirks. Sincerity was cringe; irony was armor. Ironification made detachment the new intelligence: nothing believed, everything quoted, and feelings outsourced to sarcasm.


    2000s — Digitification

    Humanity uploaded itself. Digitification was the mass migration to the screen—the decade of Facebook envy, email anxiety, and dopamine disguised as connection. We stopped remembering and started refreshing.


    2010s — Influencification

    When everyone became a brand. Influencification turned authenticity into a business model and experience into content. The self became a product to be optimized for engagement.


    2020s — Enshittification

    Doctorow’s masterstroke: the final form of digital decay. Enshittification is what happens when every system optimizes for extraction—when user, worker, and platform all drown in the same algorithmic tar pit. It’s the exhaustion of meaning itself, disguised as progress.


    Epilogue: The 2030s — Reification

    If trends continue, we’ll soon enter Reification: a desperate attempt to make the unreal feel real again. After decades of filters, feeds, and frictionless fakery, we’ll long for something tangible—until, inevitably, we commodify that too.

    History repeats itself—only this time with better Wi-Fi.

  • Why I’m Breaking Up with the YouTube Feed

    When I open YouTube, my homepage looks like a digital gladiator pit. Comedians, podcasters, and fitness influencers are constantly “dunking on,” “roasting,” or “destroying” one another. The algorithm assumes I crave conflict like a Roman spectator—hungry for outrage, giddy for blood. It’s pathetic, really. My only objection today is a simple one: stop turning my comedians into content. They’ve long been the high priests of outrage, transmuting it into gold through timing, irony, and a killer punchline. Outrage used to be an art form, not a business model.

    But comedy has competition now. For the past decade, social media has been strip-mining outrage for clicks. Unlike comedians, who craft original points of view and occasionally elevate our spirits, the algorithm slaps together cheap dopamine with a glue gun—strawman fallacies, emotional bait, polarization, and tribal garbage—all to keep us doom-scrolling through the sludge.

    Comedians once showed us our shared absurdity. Algorithms show us our mutual contempt. That’s not progress; that’s spiritual rot. I fear that as AI devours art and originality, comedians may go the way of Paul Bunyan—felled not by fatigue but by the chainsaw of automation.

    Maybe I should reprogram my feed with videos of pandas hugging ducklings. But swapping venom for saccharine isn’t redemption; it’s sedation. Better to log off entirely, read a book, or at least choose what I watch. For now, YouTube feels like poison—delivered daily, in high definition.

  • Caveman Meets Garage App

    In March 2005, at 43, I was besotted with my Classic iPod and its holy clickwheel. It took a minute to learn how to tether it to desktop iTunes and wrangle my playlists (mostly podcasts), but I did it without pestering my wife, and I was proud. A newer iPod arrived soon after; I refused to learn its tricks. Once I master a gadget, it becomes my comfort zone—I’d rather live there than relocate.

    I listened to podcasts all night and during post-workout naps. My life felt archived in that iPod, which—ridiculously, wonderfully—made me feel plugged into the modern world.

    My wife, less sentimental, declared it obsolete. The future was smartphones. I recoiled. They looked like bricks of chaos—apps, updates, notifications—houseplants with demands, only worse because I had to squint at ant-sized text.

    By 2014, I still clung to the iPod. It wasn’t cheap loyalty: the headphone jack snapped about once a year, and I’d pay $70 at the local shop to resurrect it. Then September 2014 arrived, our twin daughters started preschool, and my wife insisted I get a smartphone—for school runs, doctor visits, playdates. Texting was essential; parenthood demanded it.

    So I pried my fingers off the fossil and bought a Galaxy S4 at Costco. To my surprise, downloading podcasts was blissfully easy. As a podcast machine, the phone was a star. Everything else? Lame. I hated watching tiny videos, reading tiny text, and spelunking for apps. The phone became a super-iPod; the rest of its features were just extra chaos. Texting was torture—my fat fingers whacked the wrong letters, and I backspaced my way through tedium. I barely used the thing except for podcasts. My wife envied my battery, perpetually at 90%; to console her, I’d brag that after an all-night podcast binge I dropped to a shocking 80%.

    Yes, smartphones are addiction machines that track, nudge, and strip privacy. True. But I only use a sliver of their powers because the tactile experience annoys me.

    Part of me resents the smartphone for killing the rotary landline. That dial’s ratcheting click felt like reciting a secret code to open a cave. Beige, avocado, mint green, custard—those phones had heft that implied quality, with long, flexible cords that snaked across the room. Conversations were events; an ear would grow tender and force the ritual mid-call ear swap. Now the landline is dead—and so, largely, are conversations, replaced by texts and emojis. Speed and convenience exacted their toll: degraded communication, which means degraded friendships.

    Cory Doctorow gave us enshittification—how tech optimizes itself into garbage. I’d love to say that’s why I resist. But that’s too pat. I’m simply slow to adapt. Incompetent with new tools. My memory refuses the steps; I have to re-teach myself, again and again.

    Recently, my wife synced my phone to our garage door. A week later, I tapped the app and watched the door rise, gawking like a caveman who just discovered fire and is already imagining a barbecued brontosaurus rack. It’s a good trick. I still keep Genie remotes in the house and car as backups, but the phone option is lovely. This isn’t enshittification; it’s the opposite—unsuckification. Some things that used to suck don’t have to anymore.

    In fact, I’m eager for toilet + AI matrimony: a throne that reads biomarkers, prescribes medication, screens like a colonoscopy, and spares me the waiting room. I’m also rooting for a custom GLP-1 patch that recalibrates appetite so a morning bowl of porridge with protein powder—and another in the late afternoon—actually sates me. Easy weight management, better markers, minimal dishes.

    All of this is part of the unsuckification project.

    I’ll admit it: I’m older, I resist change, and new tech gives me a headache. But if modern tech can spare me a colonoscopy, open-heart surgery, and the indignities of being twenty pounds overweight, then sign me up.

  • Comparison Is the Mother of Misery

    The mother of misery is comparison. In fourth grade I plunged into despair because I couldn’t draw like Joseph Schindelman, the illustrator of Charlie and the Chocolate Factory. About the same time, baseball humbled me: my bat speed wasn’t in the same galaxy as that of Willie Mays, Dick Allen, or Henry Aaron. In my teenage bodybuilding years, I had muscles, but nothing like those of Arnold Schwarzenegger, Sergio Oliva, or Frank Zane; I wisely retired the fantasy of becoming Mr. Universe and managing a gym in the Bahamas. In college, as an aspiring intellectual, I flogged myself for lacking Vladimir Nabokov’s wit and velocity. My chances of becoming a famous novelist were equal to my odds of winning Mr. Olympia. Later, when I flirted with composing and singing, I heard Jeff Buckley’s soul pour through the speakers and realized my voice would mainly trigger a neighborhood dog-barking contest and a chorus of angry neighbors. Decades passed. My classroom persona faced a generation handcuffed to smartphones and ChatGPT. I scrambled for “edutainment” tricks to dodge irrelevance, but the gap widened no matter how I danced.

    The sting doubled when I watched Earthquake (Nathaniel Stroman) flatten an audience with a preacher’s cadence and bulletproof wisdom. His special Joke Telling Business left me muttering, “If I had Earthquake’s power, I could resuscitate my teaching career and stroll into old age with a shred of dignity.”

    Meanwhile, fresh incompetencies arrived like junk mail. I broke two Samsung TVs in one day. I failed to sync my new garage door opener to my phone. My wife had to rescue me from my own maladroit tech spiral. The result was predictable: I was condemned to the Shame Dungeon.

    Down in the basement of depression, I noticed another casualty: my YouTube channel. I usually post once a week and have for over a decade—mostly about my obsession with diver watches. But as sanity demanded I stop flipping watches, I ran out of new divers to discuss. I tried pivoting—open with a little watch talk, then segue to a wry misadventure with a morsel of human insight. If I nailed the landing, I’d get a few thousand views and enough comment energy to believe the enterprise mattered.

    But with my sixty-fourth birthday closing in, the doubts got loud. I don’t want to do “watch talk,” and I’m too mortified to perform a perky, self-deprecating monologue about my misalignment with the universe.

    I keep hearing Mike Birbiglia in my head: you must process your material before you present it; the set has to be a gift, not your live catharsis. The healing happens before you step onstage. You speak from the far shore, not mid-drowning. Otherwise, you’re asking the audience to be your therapist.

    So I’m stuck at a fork: Will this current fear and anxiety about age and disconnection pass through the refinery of my psyche and emerge as something worthy? Or will I remain in the Shame Dungeon, comparing myself to Earthquake, and decide that with talent like his prowling the earth, my best move is to hide under a rock?

    Here’s the dilemma plain: Hiding isn’t viable; it starves the soul. But serving the world a plate of unprocessed mediocrity is just as unforgivable. If I’m going to tell a story about breaking two TVs and my garage-opener meltdown, I have to deliver it with Earthquake’s power and confidence. Otherwise I’ll stay home, mope on the couch, and binge crime documentaries—losing myself in bigger, cleaner tragedies than my own.