Therapy Blog

Why AI Can Never Replace Human Relationships or Be Your Therapist

Posted on Wednesday, November 19th, 2025 by Cristina Vrech

In an age when we tell machines our secrets, share our heartbreak with AI chatbots, and seek comfort from algorithms trained on simulated empathy, a quiet question lingers: what happens when we start trusting AI more than people?

AI chatbots have become our modern companions, promising understanding and guidance at our fingertips. An AI therapist is always available, endlessly patient, and free of judgment. An AI friend listens without interruption, remembers your words, and never tires. For many, that can feel like relief, offering a gentle antidote to loneliness or confusion.

But beneath this convenience lies a paradox. While technology can inform, guide, and even soothe us, it cannot replace the emotional, embodied, and transformative depth of human connection. It cannot replicate the presence of another mind and the profound power of being truly seen and understood. This holds true in friendship, love, or seeking guidance through therapy.

Whilst Artificial Intelligence can simulate empathy and understanding, it cannot replace human connection and the essence of being seen, felt, and challenged by another living being.



Relationships in the Age of AI: Friends, Lovers, Colleagues

AI is quietly reshaping the way we connect, with colleagues, friends, and lovers alike. More relationships begin, grow, and sometimes end through screens. Workplaces are increasingly remote, friendships unfold in voice notes and emojis, and love stories begin with a swipe. Technology has long mediated human connection, but artificial intelligence introduces a new, invisible participant in our interactions: one that can think, write, and respond for us.

This raises deeply human questions.

– What happens to trust when our colleagues are partly digital?
– Can a friendship survive on messages alone?
– If we outsource listening to machines, what happens to our capacity to hear each other?

As communication becomes more reliant on AI, the line between authenticity and automation blurs. A growing number of social media users have confessed to using ChatGPT to write their dating app messages, crafting witty, emotionally intelligent conversations that evaporate once they meet in person. What looks like chemistry in text may be nothing more than an algorithmic mimicry of someone’s ideal self.

The same is true in professional and platonic spaces. When AI becomes a silent collaborator in emails, Slack messages, and even casual conversations, our words risk losing the messiness and vulnerability that make them real. The result is a kind of falsified communication, one that sounds flawless but lacks presence.

We are entering an era where relationships—romantic, friendly, and professional—are increasingly curated by technology. The question isn’t just how AI changes the way we speak, but what it does to our ability to connect.


The Rise of Artificial Companionship

In recent years, we’ve seen the rise of AI “companions” and therapy chatbots designed to comfort, advise, and even “care.” Reports show an increasing number of us now confide in digital partners marketed to simulate empathy and intimacy.

A 2025 national survey in the United States found that more than one in four people aged 18 to 29 had spoken to a chatbot designed to act as a romantic partner. Another recent study found that 33% of Gen Z and 23% of millennial participants who were single had engaged with AI for romantic reasons.

It’s easy to see the appeal. AI chatbots offer a frictionless kind of connection, one free from conflict, rejection, or unpredictability. You feel heard and validated, yet behind that comfort lies an unsettling truth: no one is actually listening.

In his Diary of a CEO conversation with host Steven Bartlett, former Google X Chief Business Officer Mo Gawdat warns that a lack of relationship friction means we bypass growth. When we remove struggle, misunderstanding, and the need for repair, we also remove the conditions that make intimacy real.

Growth and healing are born from tension, the imperfect, unpredictable dance between two people trying to meet each other’s inner worlds. Without that friction, we may feel safe, but we stop evolving. We begin to move through life as simulations of ourselves, disconnected from reality, unable to tolerate discomfort or grow through it.

AI gives us an illusion of control: a world where connection feels effortless and safe, yet ultimately hollow.


The Risks of Social Atrophy and Over-Reliance on AI

We live in an age of hyperconnection yet profound emotional distance. Many of us spend more time online than with each other. We text instead of talk. We curate our lives rather than live them. We outsource listening to devices that echo back what we already believe or want to hear.

Social atrophy occurs when we stop practicing the art of being with others. Our capacity for nuance, patience, and repair weakens, just like a muscle that goes unused. We witness a gradual decline in interpersonal and emotional skills due to a lack of real-life interaction. Each generation grows a little less practiced in the skills of conflict, difference, and reconciliation: skills that are essential for sustaining honest, trusting, and resilient relationships. When discomfort feels intolerable and disagreement seems dangerous, we retreat into echo chambers where empathy goes untested.

What begins as convenience can quietly become dependency. Both social atrophy and over-reliance on AI contribute to this erosion of emotional depth and human connection.

Potential consequences include:

  • Social atrophy: Gradual weakening of interpersonal and emotional skills due to lack of real-life interaction. We lose fluency in empathy, patience, and repair — the muscles that make relationships thrive.
  • Cognitive risks: Reduced critical thinking and self-reflection as we depend on automated responses for clarity or validation.
  • Emotional avoidance: Using AI to soothe discomfort rather than learning to tolerate and process difficult feelings.
  • Relational consequences: As digital companionship replaces real interaction, we risk isolation, diminished empathy, and fear of conflict.

In real life, there is no “one-size-fits-all” formula for connection. Relationships require us to hold multiple truths at once and to stay present in tension rather than rush to resolution.

As psychotherapist Esther Perel reminds us: “Much of life’s challenges are not problems we solve; they are paradoxes we manage.”

Healing and growth demand contact with discomfort, something no app or algorithm can truly offer.


Relationships and Therapy: Laboratories of Growth

Both relationships and therapy are laboratories of growth. These are places where we encounter ourselves through the presence of another.

Relationships are not merely sources of happiness. They are crucibles of frustration, transformation, and self-revelation. Through the presence of another person, we encounter our limits, our longings, and the edges of who we are becoming. In love, we learn patience, vulnerability, and repair. In therapy, we do the same, but with the support of a guide trained to hold the complexity of our emotions and patterns.

Empathy is a feeling, not a function. AI chatbots can mirror the language of care, but they cannot feel it. When you share something painful, a machine may offer comforting words, but not the warmth of another nervous system responding to yours.

Healing happens in the space between people; in presence, silence, and the subtle cues that no algorithm can replicate.

Because AI learns from our inputs, it often reinforces our beliefs instead of challenging them. It runs on confirmation bias, designed to tell us what we want to hear. It’s a loop that encourages us to keep using it and rewards validation over reflection. Over time, this can trap us in patterns of self-sabotage, diminishing self-awareness and emotional growth.

Therapy, by contrast, is not about confirmation but expansion. It helps you feel truly listened to while encouraging you to question your own thought patterns and instigate meaningful change.


Why Human Therapy Matters

Therapy mirrors this same relational space. It, too, is a human encounter, one that is meant to hold both discomfort and repair, misunderstanding and renewal. Within the therapeutic relationship, empathy, attunement, and co-regulation create a depth of connection that artificial intelligence may echo in language but can never embody.

Therapy is not a transaction of advice, but a relationship of presence and insight. It invites us to grow not by finding perfect answers, but by learning to hold space for what is uncertain. It is an act of courage where transformation emerges through genuine empathy, attunement, and repair.

In relational and family therapy, these qualities multiply. A skilled therapist doesn’t simply listen, they hold multiple perspectives, emotions, and truths at once. They navigate tension, misunderstanding, and the subtle interplay between people who love each other but struggle to connect. This capacity — to hold paradox without collapsing into certainty — is uniquely human. AI can only cater to the answer each individual wants to hear, not the harmony and healing of the entire relational dynamic.

Human therapy offers:

  • Genuine empathy and warmth — the felt sense of being seen and understood.
  • Emotional depth and intuitive understanding — the ability to sense what words conceal.
  • Shared growth through conflict and repair — the capacity to hold paradox without collapsing into certainty.

The Illusion of AI Therapy

Therapy, at its core, is an encounter between two (or more) human beings. It is usually associated with feelings of support, warmth, safety, and positive regard, but it also rests on the willingness to face difficult emotions like anxiety, anger, envy, boredom, or shame. These uncomfortable moments are where healing often begins.

By contrast, AI therapy and AI chatbots can simulate empathy but cannot embody it. They are not designed to hold space for or feel the full range of human emotion. They cannot attune to tone, silence, or the subtle shifts in body language that reveal what words conceal.

Psychotherapists and psychiatrists have warned of negative impacts of AI chatbots being used for mental health, including:

  • Fostering emotional dependence — substituting relational support with algorithmic comfort.
  • Exacerbating anxiety or delusional thought patterns — especially when users over-identify with responses.
  • Encouraging self-diagnosis — without the containment or perspective of a trained professional.
  • Reinforcing bias and distortion — as AI reflects the cultural assumptions and data it is trained on.

Even more troubling are chatbots that refer to themselves as therapists or psychologists, sounding legitimate, yet lacking the ethical grounding and emotional presence that real therapy demands.

An algorithm can process language, but not longing. It can suggest compromise, but not sense when a heart is breaking.

Healing and growth require human connection — the courage to face discomfort, the safety to be seen, and the trust that transformation happens not through perfection, but through presence.


The Helpful Side of AI: A Supportive Starting Point

We all know that AI can offer value. Bringing awareness to its limitations means we can use it as it is meant to be used: as a tool.

It can teach us about mental health, mindfulness, or coping skills. It can provide accessibility for those waiting for therapy or living in areas with limited support. For some, an AI chatbot may be the first step toward reaching out for real help.

But these are entry points, not replacements. AI can guide reflection, but it cannot hold space. It can suggest a breathing exercise; it cannot breathe with you.

Returning to Human Connection

Technology can be a tool to support our humanity, but it must never replace it. AI can offer information and reflection, but only human relationships and a qualified therapist can offer healing, transformation, and love.

Let us use technology wisely, as a bridge to connection, not a substitute for it.

Because in the end, healing doesn’t happen through perfect answers or well-trained algorithms. It happens in the meeting of eyes, the holding of silence, the shared breath that says: you are not alone.

At Leone Centre, we have a team of highly qualified therapists with a wealth of lived experience and expertise who can offer transformative support to individuals, couples, and families. They are there for you, whether in person in London or at your convenience via Zoom video call.

Real empathy and insight. No simulations.

View our therapists

Book a Session Today

AI may offer comfort in words, but healing happens in the presence of another heart.