
The Risks Of Using AI For Therapy

Posted on March 6, 2026 by Henry Ford Health Staff

Artificial intelligence (AI) can answer almost any question in seconds, whether it’s recipe advice or how to ace a job interview. With their seemingly endless knowledge, chatbots look like a helpful resource when serious mental health concerns arise. After all, they’re low-cost or free, and always just a tap away.

Can AI take the place of a therapist when you need a listening ear? Chris Nixon, LMSW, an addiction medicine specialist at Henry Ford Health, discusses how to leverage AI resources and when to consult a professional for help.

Why AI Can’t Replace Humans

AI is an amazing tool, but it’s important to remember that a chatbot response is not a person (even though it sounds like one). “A major drawback to current applications of AI is the fact that you don't have a true human connection or empathy," says Nixon. "Both of these components are key to building trust between a therapist and client. AI can remember the facts you give it, but it doesn’t genuinely connect with you or care about you. That connection—that trust—is critical to effective therapy and ultimately, to getting better."

The lack of human connection isn’t the only issue. AI chatbots cannot replace mental health professionals because:

They can’t read nonverbal cues

“Therapists are specially trained to pick up on nonverbal cues from their patients,” Nixon says. “When you meet with a therapist in person or via video, they’re observing your body language, facial expressions, tone of voice and overall demeanor. These cues reveal much more than words you type into a computer.”

For example, have you ever heard someone say they’re “fine,” while their tone of voice, posture and eye contact tell a different story? “This mismatch is a sign that we need more information, and AI is not likely to detect this,” explains Nixon.

They don’t understand cultural differences

Social factors like access to healthcare, housing stability and economic resources significantly impact mental health. AI systems often miss these nuances when replying to your mental health questions, in part because of biases built into the data they’re trained on.

“Effective therapy is tailored to a person’s unique background and circumstances,” Nixon says. “Professional therapists receive training in cultural diversity and understand how culture, ethnicity and socioeconomic background impact a person’s life.”

They can validate dangerous thoughts

Harmful thoughts need intervention, not an echo chamber. When someone expresses thoughts of self-harm, a trained therapist assesses risk and connects them with immediate help. An AI chatbot may reinforce those feelings or let them slide without recognizing the urgency of the situation.

“If somebody is struggling with thoughts of harm to themselves or to another, there must be a connection with a professional as soon as possible,” Nixon says. “AI won’t connect you with emergency medical care, so it leaves a potentially dangerous gap there.”


When AI Can Help With Mental Health

Despite these limitations, AI can help with certain mental health needs. Consider using it for:

  • Gathering information: Instead of treating AI as a therapist, use it like a search engine. “AI is great at answering factual questions like, ‘What is cognitive behavioral therapy?’ You can often get a very comprehensive, detailed answer all in one place,” Nixon says.
  • Finding resources: AI tools can help you locate mental health services, support groups or other resources in your area.
  • Supporting professional therapy: Some AI-based tools, when used under the guidance of a mental health professional, show promise. “New AI tools could help people practice cognitive behavioral therapy or phobia exposure from home,” Nixon says.

AI could also assist clinicians with screenings, which could get patients the help they need more efficiently. “With a shortage of mental health professionals, AI tools could help with initial assessment so we know what care they need,” Nixon says. “However, this should always connect back with professional medical care, not replace it.”

Red Flags for Mental Health Issues

Today’s AI is designed to be a people pleaser, so it will likely just tell you what it thinks you want to hear. This approach is fine for everyday annoyances, like venting about a disagreement with your spouse or feeling stressed about a work assignment. However, this constant validation isn’t safe if you’re dealing with a mental health issue that requires treatment.

Regardless of what AI tells you, seek medical care if you experience:

  • Excessive worry or fear or sense of hopelessness
  • Extreme mood changes
  • Inability to perform daily activities or see a way forward
  • Loss of interest in activities you once enjoyed
  • Persistent negative thoughts or self-loathing
  • Sleep disturbances or changes in eating habits
  • Thoughts of harming yourself or others

“More than one in five U.S. adults has a mental health condition, so you’re not alone,” Nixon says. “Seeing a mental health professional is a brave, self-compassionate, self-loving thing to do.”


Reviewed by Chris Nixon, LMSW, CAADC, Administrative Director of Addiction Medicine at Henry Ford Health.