You, me, and Claude: This is what a relationship in the AI era looks like

When chatbot support replaces emotional risk, we lose our capacity for mess, friction, and vulnerability
PUBLISHED: Mon 21 Jul 2025, 11:40 AM | UPDATED: Sat 26 Jul 2025, 9:48 AM
A few weeks ago, a friend told me her teenage son had just entered his first relationship — with a twist. Before sending any message to his girlfriend, he runs it through Claude AI to ensure it’s “perfect.” Assuming she does the same, it’s effectively two bots talking to each other on their behalf — a far cry from the awkward, earnest interactions many of us remember from our youth.
What struck me wasn’t how strange this was but how normal it’s becoming. Artificial intelligence (AI) tools like Claude have become companions for many young people. They offer support, feedback, and a kind of digital mirror to help craft the most polished version of the self. But in doing so, they may also be shaping a generation less tolerant of imperfection and more alienated from real connection. For instance, as my friend aptly said, “In my day, I’d ask my friends what to say. That was part of the connection, both with them and with the boy I liked.” Now, that moment of closeness is outsourced to a bot.
Let me be clear: Claude isn’t the enemy. AI has enormous potential, from helping with schoolwork to enhancing creativity. I use it for everything from quick strategic thinking to proofreading this article. But the issue isn’t use; it’s unreflective overuse. When chatbot support replaces emotional risk, we lose our capacity for mess, friction, and vulnerability. In other words, we lose that very discomfort vital for growth.
We Grow in Discomfort
We learn who we are not through perfect execution but through mistakes, embarrassment, and failure. If we perpetually outsource the messier parts of life to a tool designed for fluency and control, we risk becoming alienated from ourselves. Take another friend who asks ChatGPT whether his ideas are “good enough”, or whether he’s “going to be okay” in his career. There’s nothing inherently wrong with this – we all need reassurance occasionally. However, the issue is that instead of turning to a friend or mentor, he turns to a bot. While that may feel safer, it denies him the chance to be seen by another human and to access the support he needs, which can only truly be found in connection with another person. This kind of excessive emotional reliance on chatbots is akin to putting a plaster over an infected wound – the pain will worsen, and you won’t see it happening.
Even in my life, I’ve noticed a creeping tendency to run everything — emails, thoughts, plans — through AI. It’s helpful, yes. But it also risks dulling the edge of intuition and detaching us from the discomfort of growth. I’m not alone in this — many people I speak to report a subtle decline in confidence, creativity, and connection, as if our emotional and professional muscles are atrophying through underuse.
This intolerance of discomfort can have real consequences. Anxiety, loneliness, and social withdrawal will spike when human interactions are replaced by something “safer”. Over time, those unedited and raw moments with others — the bedrocks of intimacy — will become intolerable. Eye contact or phone calls? Forget them — they're way too risky. The awkward pause in a conversation? Intolerable.
In addition, while bots may be brilliant at giving information, their responses to emotional questions are often overly positive or confidently wrong. And yet, they’re being consulted more and more. Over 50 per cent of US adults now use large language models, and nearly 40 per cent use them regularly. More than 45 per cent of ChatGPT users are under 25 — the demographic most vulnerable to social dysregulation.
Worryingly, one poll even found that 80 per cent of Gen Zers said they could imagine marrying an AI, and 83 per cent believed they could form a deep emotional bond with one. These numbers point to a shift — AI tools are becoming emotional companions, not just productivity aids. But bots cannot give us what we need. They can’t meet our gaze, sit in our grief, or laugh with us in the mess. They simulate connection without ever providing it.
So What Can We Do?
We don’t need to ban Claude or panic about AI. But we do need to re-centre real connection. That means encouraging young people to tolerate awkward silences, send imperfect texts, say the wrong thing, and learn that they can survive it. It means making room for discomfort (again and again) until it becomes something they trust.
For those of us guiding them, maybe the best thing we can do is model what it looks like to be imperfect — to sit with someone struggling, not to fix them, but just to be with them. Because in the end, Claude offers polish but not presence. And it’s presence, not perfection, that makes us human.




