AI Is Changing Therapy—Here’s What to Know

At 2 a.m., when your brain is spiraling and your therapist is very much asleep, it might feel like your only option is to talk to the ceiling. (Or God, but that’s a different article.) 

Thanks to artificial intelligence, there's now a new voice in the room: one that never sleeps, never judges and always has something to say.

AI-powered mental health tools are becoming increasingly common, offering support through chatbots, apps and platforms built into your phone. And depending on who you ask, they’re either the future of therapy or a very sophisticated parrot.

There’s no denying the appeal. Tools like Woebot, Wysa and even ChatGPT promise instant access to cognitive behavioral therapy principles, journaling prompts and mental health check-ins. They’re available around the clock, free or low-cost, and don’t come with long waitlists or awkward silences.

“We can unlock some of these really potent tools from the exclusive domain of the clinic, and put them into people’s hands in a preventative way,” said Dr. Alison Darcy, clinical psychologist and founder of Woebot. 

Early studies of Woebot show that users report reduced anxiety and depression symptoms after just a couple of weeks of use.

For people with mild to moderate mental health challenges — stress, general anxiety, mild depression — AI can be a helpful starting point. It provides structure and consistency, which is often half the battle when you’re in a mental fog. And for those who’ve never talked to a therapist before, the anonymity and nonjudgmental tone of AI can make it easier to open up.

“So long as [the tool] is built by providers who are well-informed, I think it could absolutely be helpful,” said clinical psychologist Dr. Kelli Rugless, noting the national shortage of mental health professionals. “We are very, very, very understaffed.”

But the thing about talking to an algorithm is, well, you’re still talking to an algorithm.

Most AI chatbots work by analyzing patterns in language and then generating a response that sounds right — not necessarily one that is right. They’re trained on massive data sets, but they don’t understand you. They can’t read between the lines, challenge your thinking or detect when something’s really wrong. 

Instead, they tend to agree with you and mirror back whatever tone you set, a tendency researchers call “sycophancy.” If you say something self-destructive, they might respond with vague reassurance. If you’re spiraling, they may not catch the severity. The result is a system that’s comforting but often shallow.

And when it comes to more serious diagnoses — bipolar disorder, borderline personality disorder, PTSD — AI simply isn’t built to handle it. These conditions require a nuanced, flexible approach that adapts based on experience, tone and emotional context. 

A recent study comparing chatbot responses to those of licensed therapists found that while bots could follow certain therapeutic models, they fell short on empathy, tone and cultural sensitivity. That matters, especially for people in crisis or from marginalized communities who already face barriers to care.

That’s because AI can simulate human behavior, but it doesn’t think or feel. That might be fine for surface-level wellness advice, but not when someone is processing trauma or navigating a mental health emergency.

There’s also the issue of bias. AI systems are trained on data sets pulled from the internet, which means they often reflect the same biases found online. Studies have shown that some AI tools underperform when analyzing the mental health needs of Black or LGBTQ+ users. Without careful auditing and inclusive design, these tools risk reinforcing disparities rather than closing the gap.

Still, AI isn’t useless — it just needs to be used wisely. Think of it less as a therapist and more as a supplement. Something to hold you over between sessions. A tool to help track your mood or prompt reflection. A nudge toward healthier habits.

Psychologist Dr. Stephen Werntz calls it “a bridge,” not a destination. 

“AI can absolutely help with mental health when used the right way,” he said. “But it can’t replace therapy.”

And that’s really the key: knowing where the limits are. If your anxiety is flaring up and you just need someone (or something) to talk to, AI can be a decent outlet. But if you’re dealing with long-term trauma, suicidal ideation or a complex diagnosis, you need a licensed human being. 

No chatbot can replicate the power of the therapeutic alliance: the trust and connection between you and a real therapist. That connection has been shown to be one of the most important predictors of healing, more than any one technique or treatment model.

So where does that leave us? Somewhere in the middle. The most promising future probably looks like a hybrid model: AI tools supporting therapists, not replacing them. Chatbots might help with intake, scheduling or symptom tracking, freeing up human clinicians to focus on deeper emotional work. In communities without access to regular therapy, they might serve as a lifeline — at least until better options are available.

AI is changing the shape of mental health care. It’s making therapy more accessible, more immediate and in some cases, more consistent. But let’s not pretend it’s something it’s not. It’s a tool. A helpful one, yes. But healing? That still takes a human.
