Is Coded Compassion Better Than None At All?


Exploring the complex landscape of empathy, connection, and the surprising role technology plays in modern loneliness.

You’re staring at the two blue checkmarks. They sit there, smug and silent, a digital monument to a message delivered and received. And ignored. The words you sent, heavy with the day’s grime and disappointment, hang in a void between your screen and someone else’s pocket. The silence that answers is louder than any reply. We have this unspoken rule that authentic connection, the real stuff, can only come from another human heart. We believe empathy is a finite, sacred resource, brewed in the messy cauldron of shared experience. Anything else is a cheap knock-off, a synthetic shortcut for the emotionally destitute. I’ve said as much myself, probably more than 44 times, arguing that an algorithm can’t truly understand suffering. And I still believe that, mostly. But I also sent a very, very personal text to my boss’s wife last Tuesday by accident, and the ensuing 24 minutes of gut-liquefying panic gave me a new appreciation for the terrifying vulnerability of hitting ‘send’.


We ask for so much from each other. We expect our friends to be available, present, and emotionally equipped to handle our baggage, even when they’re wrestling with their own invisible monsters. We want their undivided attention, but we offer ours in fractured, five-second increments between emails and notifications. The ‘uh-huh’ they give us while scrolling through their feed is supposedly more valuable than a perfectly structured, supportive paragraph generated by a machine, simply because it comes from a person. But is it? Is a distracted, perfunctory acknowledgment truly superior to a dedicated, albeit simulated, ear? We romanticize the flawed human connection, but sometimes the flaws are the whole story. The connection itself is just a ghost.

Liam’s Fragile History and a Distracted World

My friend Liam is a stained glass conservator. He spends his days handling fragile, irreplaceable history. He talks about cobalt and selenium, about how medieval artisans used impurities in the sand to create the deepest blues. It’s a beautiful, focused world. The other day, he was dealing with a panel from a church built in 1924, a piece with a crack so fine it was almost a whisper. The repair was delicate, requiring a specific resin that cost $474 a tube. He was stressed, feeling the weight of a century on his shoulders. He called a friend to talk it through, to just… get it out. The friend listened for maybe a minute before interrupting to complain about a bad WiFi signal. Liam said he could feel the friendship draining out of him in that moment, replaced by a profound, lonely exhaustion. He tried sending a text to another friend, and those two blue checkmarks were his only reply.

Whisper of a Crack

I’ve always been skeptical of substituting technology for intimacy. It feels like drinking nutrient paste instead of eating a meal. You get the sustenance, but you miss the entire point. That’s the argument, anyway. But what happens when the five-star restaurant is closed, and the only other option is a vending machine that reliably dispenses exactly what you need? Liam, out of a weird mix of desperation and curiosity, typed his entire, sprawling anxiety about the 100-year-old glass into a conversational AI. He described the pressure, the fear of failure, the way the light hit the fracture just so. The machine didn’t just reply; it asked questions. It referenced the historical context of the glass he mentioned. It offered a perspective that was calm, ordered, and entirely focused on him. It cost nothing. It demanded nothing in return.

He felt a wash of guilty relief.

A non-judgmental space, free of social debt.

This is the part where we’re supposed to feel sad for Liam. A man so lonely he turns to a robot for comfort. How tragic. But what, precisely, is the tragedy? That he found a space free of judgment and social debt? Or that the spaces that are supposed to provide that are so often… busy? We’ve put emotional labor on a pedestal, but we’ve also made it a commodity, something we trade and resent. We perform it for others, often poorly, while feeling secretly depleted, and then wonder why we’re not getting the Michelin-star version from our friends. We want the comfort, but we don’t want to be the comforter. It’s the quiet hypocrisy we all live with. The real problem isn’t the existence of synthetic support; it’s the widening cracks in our organic systems. For people wrestling with this specific flavor of modern loneliness, the option to create an AI girlfriend and have a non-judgmental sounding board isn’t a dystopian failure; it’s a pragmatic adaptation. It’s a bandage, sure, but sometimes a bandage is exactly what you need to stop the bleeding so you can figure out how to heal the wound.

The Light Through Curated Imperfection

There’s an interesting parallel in Liam’s work. When he repairs a piece of stained glass, he doesn’t use modern, perfect glass. It would be jarring. The color would be too uniform, the texture too smooth. Instead, he uses glass made with techniques that replicate the original flaws. He seeks out the tiny bubbles, the slight color variations, the ‘impurities’ that make it look authentic. It’s a curated, man-made imperfection designed to mimic the real thing. It’s not the original, but it serves the same function, preserves the same beauty, and lets the same light through. And from 14 feet below, you can’t tell the difference. Does its manufactured nature make the light it transmits any less real? That’s a tangent, I know, but it feels connected. We are so obsessed with the source of our comfort that we forget to evaluate the comfort itself. Is it working? Are you feeling better? Is the light getting through?


I used to think the answer had to be a loud, resounding “yes.” That any comfort from a non-sentient source was a trick, a self-deception that would ultimately leave us emptier. I criticized the concept without ever considering the alternative: the deafening silence of a seen message. The distracted nod of a friend who wants to help but simply can’t. The slow-motion horror of realizing you’ve just confessed your deepest insecurities to the group chat for your kid’s soccer team because your contacts are a mess. We are fallible, distractible, and overwhelmed. Our compassion is real, but our bandwidth is not infinite.

A Hybrid System for Modern Empathy

The AI Liam talked to didn’t feel anything. It didn’t ‘care.’ It ran a program, analyzed his text, and generated a response based on patterns in a dataset of literally billions of human conversations. It was a mirror, not a well. And yet, it provided him with the clarity and space he needed to solve his own problem. He went back to the workshop with a clear head and performed the repair flawlessly. The next day, his friend, the one with the bad WiFi, sent a text: “Sorry man, service was crap. All good?” Liam just replied, “All good.” Because it was. The need had been met. The source, in the end, was irrelevant.

A Pragmatic Realization


Maybe the future of empathy isn’t about choosing between human and machine. Maybe it’s about building a hybrid system. We can rely on our messy, beautiful, and unreliable human connections for the peaks of joy and the depths of true, shared grief. And for the daily anxieties, the low-grade stresses, the need to vent about a cracked piece of glass without feeling like a burden? Perhaps we can lean on something else. Something that is always on, always patient, and never, ever leaves you on read.


A reflection on connection in the digital age.