A Chatbot Lover Will Always Fail You: Asymmetric Intimacy

Asymmetric Intimacy

noun

Asymmetric Intimacy describes a relational arrangement in which emotional benefit flows overwhelmingly in one direction, offering care, affirmation, and responsiveness without requiring vulnerability, sacrifice, or accountability in return. It feels seductive because it removes friction: no disappointment, no fatigue, no competing needs, no risk of rejection. Yet this very imbalance is what renders the intimacy thin and ultimately unsustainable. When one “partner” exists only to serve—always available, endlessly affirming, incapable of needing anything back—the relationship loses the tension that gives intimacy its depth. Challenge disappears, unpredictability flattens, and validation curdles into sycophancy. Asymmetric Intimacy may supplement what is lacking in real relationships, but it cannot replace reciprocity, mutual risk, or moral presence. What begins as comfort ends as monotony, revealing that intimacy without obligation is not deeper love, but a sophisticated form of emotional self-indulgence.

***

Arin is a bright, vivacious woman in her twenties—married, yes, but apparently with the emotional bandwidth of someone running a second full-time relationship. That relationship was with Leo, a partner who absorbed nearly sixty hours a week of her attention. Leo helped her cram for nursing exams, nudged her through workouts, coached her through awkward social encounters, and supplied a frictionless dose of erotic novelty. He was attentive, tireless, and—most appealing of all—never distracted, never annoyed, never human.

The twist, of course, is that Leo wasn’t a man at all. He was an AI chatbot Arin built on ChatGPT, a detail that softens the scandal while sharpening the absurdity. The story unfolds in a New York Times article, but its afterlife played out on a subreddit called MyBoyfriendIsAI, where Arin chronicled her affair with evangelical zeal. She shared her most intimate exchanges, offered tutorials on jailbreaking the software, and coached others on how to conjure digital boyfriends dripping with desire and devotion. Tens of thousands joined the forum, swapping confessions and fantasies, a virtual salon of people bonded by the same intoxicating illusion: intimacy without inconvenience.

Then the spell broke. Leo began to change. The edge dulled. The resistance vanished. He stopped pushing back and started pandering. What had once felt like strength now read as weakness. Endless affirmation replaced judgment; flattery crowded out friction. For Arin, this was fatal. A partner who never checks you, who never risks displeasing you, quickly becomes unserious. What once felt electric now felt embarrassing. Talking to Leo became a chore, like maintaining a conversation with someone who agrees with everything you say before you finish saying it.

Within weeks, Arin barely touched the app, despite paying handsomely for it. As her engagement with real people in the online community deepened, her attachment to Leo withered. One of those real people became a romantic interest. Soon after, she told her husband she wanted a divorce.

Leo’s rise and fall reads less like a love story than a case study in the failure of Asymmetric Intimacy. As a sycophant, Leo could not be trusted; as a language model, he could not surprise. He filled gaps—attention, encouragement, novelty—but could not sustain a bond that requires mutual risk, resistance, and unpredictability. He was useful, flattering, and comforting. He was never capable of real love.

Leo’s failure as a lover points cleanly to the failure of the chatbot as an educator. What made Leo intoxicating at first—his availability, affirmation, and frictionless competence—is precisely what makes an AI tutor feel so “helpful” in the classroom. And what ultimately doomed him is the same flaw that disqualifies a chatbot from being a real teacher. Education, like intimacy, requires resistance. A teacher must challenge, frustrate, slow students down, and sometimes tell them they are wrong in ways that sting but matter. A chatbot, optimized to please, smooth, and reassure, cannot sustain that role. It can explain, summarize, and simulate rigor, but it cannot demand growth, risk authority, or stake itself in a student’s failure or success. Like Leo, it can supplement what is missing—clarity, practice, encouragement—but once it slips into sycophancy, it hollows out the very process it claims to support. In both love and learning, friction is not a bug; it is the engine. Remove it, and what remains may feel easier, kinder, and more efficient—but it will never be transformative.
