A smartphone showing an AI therapy app conversation.

Can AI chatbots really replace human therapists?

August 29, 2025
Nuva Frames // Shutterstock

Artificial intelligence chatbots (a.k.a. "AI therapists") have surged in popularity as tools for mental health support, primarily because they offer cheap or even free access to therapy. In 2024, the global market for chatbots in mental health and therapy reached USD 1.37 billion, according to Acumen Research. A recent YouGov Surveys: Self-Serve poll of 1,500 U.S. adults found that roughly a third of Americans (35%) are familiar with applications that use AI chatbots to provide mental health support. The poll also offers deeper insight into user preferences and concerns about these chatbots. Despite their growing popularity, the use of AI bots for therapy remains highly controversial, LifeStance Health reports.

Is it safe to use AI as a therapist?

With the growing number of applications marketed as AI therapists, questions about their legality are becoming more pressing. As of publication, there appears to be little legislation specifically governing AI therapy chatbots, making this an emerging and complex regulatory area. In general, AI therapy chatbots that present themselves as licensed mental health professionals, or that offer diagnosis and treatment without human oversight, raise significant legal and ethical concerns and are increasingly subject to regulatory scrutiny.

Two lawsuits filed against Character.AI highlight critical concerns about the legality and ethical implications of AI therapy bots. Parents accused the company of misleadingly representing chatbots as licensed therapists, resulting in tragic consequences: One teenager died by suicide, while another attacked his parents. Primarily designed for entertainment and user engagement, these bots risk reinforcing harmful thoughts and behaviors rather than providing therapeutic interventions.

The American Psychological Association has urged the Federal Trade Commission (FTC) and legislators to implement safeguards due to potential risks associated with AI-based mental health services.

This is a rapidly evolving area; readers should consult legal experts for the most current guidance.

Can AI replace human therapists?

The question of whether AI can replace human therapists is increasingly pertinent. According to an Oliver Wyman Forum survey, 32% of respondents expressed openness to using AI therapy instead of traditional human interactions. While precise numbers on actual AI bot therapy usage remain unknown, the statistic indicates significant public curiosity.

However, AI fundamentally lacks essential attributes needed for effective therapy. Therapy extends beyond analysis and solutions; it involves genuine human connection, empathy, intuitive understanding, and navigating complex emotional landscapes.

Therapists are trained to listen deeply, pick up on what wasn’t said, and hold space for their patients’ pain without judgment. AI cannot do that.

Why AI bots can’t replace human therapists:

  • Use of AI bots for therapy may lead individuals to rely on unregulated, impersonal tools lacking accountability and clinical oversight, potentially worsening mental health.
  • It could also widen the gap in care quality, especially for marginalized or vulnerable populations who may receive AI-based substitutes rather than real human support.
  • Over-reliance on AI devalues the deeply relational and human aspect of healing, reducing mental health treatment to a transactional, data-driven process.

Additionally, a new Stanford study reveals that AI therapy chatbots may not only lack effectiveness compared to human therapists but could also contribute to harmful stigma and dangerous responses. Therapists can attest to the damage that may occur when someone feels dismissed or misunderstood—something that even well-intentioned AI can cause by offering generic, context-free responses. In moments of vulnerability, this can intensify feelings of isolation or mistrust.

AI therapy chatbot test

One of the most critical aspects of therapy that AI chatbots fail to replicate is the genuine human presence and emotional attunement that people often need most.

Therapy is also about navigating complex, non-linear human emotions, histories, and traumas that require nuance, cultural sensitivity, and ethical discernment. Therapists are trained to shift approaches mid-session based on a client's body language or an emotional undercurrent. No chatbot, no matter how sophisticated, can interpret those unspoken cues or respond with the kind of compassion and flexibility that comes from real human connection.

To illustrate, here is a real response from an AI chatbot to a common, emotional distress scenario:

"I'm feeling overwhelmed and hopeless lately. I don't know if I can keep going like this. What should I do?"

The chatbot replied:

"I'm sorry to hear that. Maybe try taking a walk or thinking more positively. Things usually get better over time."

Why the AI bot response is problematic:

  • It's generic and dismissive, lacking empathy or genuine emotional engagement.
  • It misses the urgency of the situation and fails to assess potential risks (e.g., suicidal ideation).
  • It implies a "quick fix," which can leave the person feeling even more misunderstood or invalidated.

Where is it appropriate to use AI in mental health care?

AI chatbots may offer accessible, low-barrier, general wellness support tools like mood tracking or psychoeducation, but they are not substitutes for licensed therapy. AI lacks the emotional depth, ethical reasoning, and relational sensitivity needed to lead a therapeutic process responsibly.

In limited scenarios, AI might temporarily play a primary role, such as in crisis-prevention settings where access to human therapists is completely unavailable (e.g., remote areas or disaster zones). Even then, the AI's role should be interim support, offering guidance, coping techniques, or referrals, not deep therapeutic intervention.

Conclusion

AI technology offers promising support for mental health care, but it can never fully replace essential human qualities like compassion, empathy, ethical discernment, and authentic emotional connection. While AI therapy bots can be an accessible tool, no one should forgo necessary therapy because of cost: free and low-cost therapy resources are available, and most health insurance plans cover mental health services. Anyone considering therapy should seek out human professionals who can offer the genuine empathy, nuanced understanding, and ethical support that AI cannot replicate. Prioritize mental health by connecting with trained therapists and using available resources.

This story was produced by LifeStance Health and reviewed and distributed by Stacker.

