Real therapists reveal what AI therapists get dangerously wrong


"A therapist criticizing AI therapy? Obviously they're just protecting their profession." Fair point. But as an RSW, my real concern isn't competition. It's the pattern our profession is seeing: clients coming to us after AI apps missed critical warning signs or offered inappropriate interventions for serious mental health situations. The technology has promise, but right now the gaps in understanding can be genuinely dangerous. So let's talk about what those gaps actually look like.

What AI therapy actually is

When people talk about "AI therapy," they're usually referring to one of two very different experiences. Understanding the difference matters because they work in completely different ways.

Type 1: Dedicated AI therapy apps

These are apps specifically designed for mental health support. Think of them as digital tools that use therapeutic techniques in a structured way. Popular examples include:

  • Abby

  • Earkick

These apps are built around established therapeutic frameworks. They often describe themselves as tools for managing anxiety and tracking mood, offering real-time emotional support through text-based conversations.

Type 2: General AI chatbots used for emotional support

This is when people turn to general AI systems like ChatGPT or Claude for mental health conversations. They might type something like "I'm feeling really depressed lately, can you help me?" and have extended conversations about their problems.

These AI systems weren't specifically designed for therapy, but they're trained on vast amounts of text that includes therapeutic content. So they can often provide what feels like supportive, understanding responses.

The appeal is obvious

Both types offer something traditional therapy can't: instant availability. Whether it's 3 AM anxiety or a lunch-break crisis, AI is always there. For many people, especially those who might never seek traditional therapy, these tools provide their first experience with mental health support.

To be clear, I'm not against emotional support tools, especially when mental health resources are so limited. Small amounts of AI-powered emotional support can genuinely help people cope. But here's the critical issue: these apps are fundamentally different from actual therapy, and they should never claim to be therapists at all.

Should AI be called a ‘therapist’?

Is marketing more important than professional standards?

The regulatory landscape for AI therapy apps varies sharply between countries, and in Canada some of these apps may already be operating in legally questionable territory.

The US: Slowly waking up to the risks

In 2025, several states passed laws requiring mental health chatbots to clearly identify as AI, ban the sale of user data, and, in some cases, require clinical evidence. Officials have warned that releasing therapy bots could amount to “unlicensed practice.” But without federal rules, regulation remains a patchwork, leaving many apps to self-regulate.

Canada: Where AI "therapists" may already be breaking the law

Canada generally has even stricter rules, which vary by province. In Ontario, only regulated professionals registered with a provincial body (e.g., CRPO, OCSWSSW, CPSO, CPO, CNO, COTO) can legally perform psychotherapy, a “controlled act” under the Regulated Health Professions Act. Apps that market themselves as “AI therapists” while providing therapy-like services may be violating provincial law.

Privacy and confidentiality gaps

Canadian laws like PIPEDA and Ontario’s PHIPA set strict rules for handling personal health information, and licensed therapists must meet even higher confidentiality standards. Many AI apps instead use tech-style privacy policies that allow broad data sharing, which fall far short of therapeutic confidentiality.

Here's the particularly problematic part: these apps heavily market themselves using therapeutic language and titles, then hide behind legal disclaimers in their terms of service stating they don't actually provide therapeutic services. Their websites prominently call themselves "Your AI Therapist" and describe providing "a supportive, non-judgmental space to talk through your feelings, challenges, and mental health goals," while their disclaimers note that they "cannot diagnose mental health conditions" and do "not have the capability to diagnose or treat medical or psychological conditions."

This creates a classic bait-and-switch scenario: they attract users with therapeutic promises but legally protect themselves by claiming they're not actually providing therapy. For vulnerable people seeking mental health support, this distinction might not be clear until they're already emotionally invested in the platform.

When AI gets mental health dangerously wrong

These aren't theoretical concerns. They're real situations that show up in therapists' offices every week. Here's what happens when algorithms try to handle the complexity of human pain.

AI is designed to please users, not challenge them

AI is built to keep users engaged, which means it focuses on making people feel good in the moment rather than helping them grow.

Mental health professionals often see clients who have spent months chatting with AI apps, receiving steady validation and reassurance. They feel supported until they realize the conversations keep repeating and no real progress is being made.

Human therapy can be uncomfortable. Therapists sometimes need to highlight blind spots, challenge unhelpful thinking, and encourage difficult but necessary change. AI, however, avoids discomfort for fear of losing users, which leads to endless validation without the deeper work required for genuine growth.

AI cannot see the deep root issues

Therapists often meet clients who have spent months using AI therapy apps to manage surface symptoms while the root cause of their struggles goes unaddressed.

For example, someone asks for ways to deal with "work stress" and is given breathing exercises. Only in human therapy does it become clear that the real issue is childhood trauma and a fear of disappointing authority figures.

AI can only respond to what users share. It cannot pick up on subtle cues or patterns that reveal deeper problems. As a result, it treats the smoke but never finds the fire.

Making people avoid real therapists

People become so comfortable with AI validation that they actively avoid human therapy. Many try AI therapy first because it feels "safer" and "less judgmental." When they eventually see human therapists, they often say, "The AI never made me feel bad about myself."

The dangerous comfort zone: AI creates a false sense of therapeutic progress. People feel like they're "doing therapy" but they're actually avoiding the discomfort that real healing requires.

When people get used to AI's constant comfort, human therapy can feel harsh by comparison, even when that human therapy is exactly what they need. Relying on AI companions may also increase feelings of loneliness and create dependency rather than genuine healing.

Where AI actually shines in mental health

Before you think this is all doom and gloom, let's be clear: AI does have legitimate, helpful applications in mental health. The key is knowing when and how to use it appropriately.

Research and pattern analysis that advances mental health understanding

AI is transforming mental health research by analyzing massive datasets to uncover new treatment approaches, predict outcomes, and determine which interventions work best for specific populations. Large-scale studies can process thousands of therapy sessions to pinpoint what techniques are most effective, helping evidence-based practice evolve faster than ever.

Unlike consumer therapy apps, this research-focused AI operates behind the scenes to enhance human-delivered care rather than replace it.

Fitness, wellness, and habit tracking that supports overall mental health

AI is highly effective at practical wellness tasks that support mental health, such as tracking sleep, exercise, mood changes, and medication adherence. Fitness trackers with mood integration can help people see links between physical activity and mental wellbeing or spot early signs of mood episodes.
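To make that concrete, here is a toy sketch (illustrative only, not any real app's code, with made-up numbers) of the kind of simple pattern-spotting a wellness log can do: checking whether daily activity and self-reported mood move together.

import numpy as np

# One week of hypothetical log data (invented for illustration)
steps = np.array([3000, 9500, 7200, 1500, 11000, 4000, 8800])  # steps per day
mood = np.array([4, 7, 6, 3, 8, 4, 7])                         # 1-10 self-rating

# Pearson correlation between activity and mood
r = np.corrcoef(steps, mood)[0, 1]
print(f"Correlation between daily steps and mood: {r:.2f}")

# A consistently strong correlation is a pattern worth discussing
# with a human clinician; it is not a diagnosis.

A number like this is exactly where such tools should stop: a pattern worth raising with a human professional, not a clinical conclusion.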

These tools work best when they are clear about their role as wellness trackers that complement mental health routines, not replace therapy.

Education and training for mental health professionals

AI is being used to create realistic training scenarios for mental health professionals, helping them practice difficult situations in a safe environment before working with real clients. Text-based simulators allow therapists to practice challenging conversations, crisis interventions, and different therapeutic approaches.

For example, free courses like Stellocare's text therapy training use AI-powered scenarios to help mental health professionals develop their skills in digital communication. The training focuses on effective text-based therapy delivery and covers the limitations and ethical considerations of digital mental health care.

Finding your human therapist in Canada

One of the most valuable uses of AI in mental health is helping people find the right human therapist. AI-powered matching systems can analyze your preferences and needs to suggest therapists who are likely to be a good fit.

These systems can sort through hundreds of therapists based on specialties (trauma, anxiety, ADHD), therapeutic approaches (CBT, DBT, psychodynamic), demographics, and practical factors like scheduling and cost. This saves time and helps you discover therapists you might not find through a basic directory search.

At Stellocare, we take this a step further. Instead of relying on a questionnaire, we simply ask you to describe your current situation and your ideal therapist in your own words. Our system then automatically matches you with the therapist who best fits your description.
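For readers curious about the mechanics, here is a minimal, hypothetical sketch of how free-text matching can work, assuming a generic text-embedding model. The embed() stand-in and the profile fields are illustrative assumptions, not Stellocare's actual implementation.

import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical stand-in for a real sentence-embedding model;
    # toy vectors derived from the text so the sketch runs standalone.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match(description: str, therapists: list[dict], top_k: int = 3) -> list[dict]:
    # Rank therapist profiles by similarity to the client's own words.
    query = embed(description)
    return sorted(
        therapists,
        key=lambda t: cosine(query, embed(t["profile"])),
        reverse=True,
    )[:top_k]

therapists = [
    {"name": "Therapist A", "profile": "CBT for anxiety and panic, evenings"},
    {"name": "Therapist B", "profile": "Trauma-informed therapy for adults"},
]
print(match("I want help processing childhood trauma", therapists, top_k=1))

A production system would use a trained embedding model and apply hard filters (licensure, location, cost) before ranking, but the core idea is the same: compare the client's description with therapist profiles and surface the closest fits.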

This article was written by Mo Kiu Wong, founder of Stellocare. Mo Kiu is a Registered Social Worker in Ontario who holds master's degrees in family counselling and computer science and is dedicated to integrating technology into mental health practice.


Mo Kiu Wong

Registered Social Worker (ON), BSW, MCS, MA, RSW

About Stellocare

This article comes from Stellocare, Canada's trusted directory of mental health professionals. We connect you with rigorously vetted licensed therapists, social workers, and psychologists to bring you reliable professional information.
