In the past year, unprecedented numbers of people have been seeking help from AI for urgent mental health needs. According to OpenAI, about 0.15% of its 800 million weekly users, roughly 1.2 million people, talk to ChatGPT about suicide each week. Hundreds of thousands more display signs of mania or psychosis, and many others show intense emotional dependence on the tool.
Key Statistics About AI and Mental Health:
- Roughly 1.2 million people discuss suicidal thoughts with ChatGPT each week.
- Hundreds of thousands show mania, psychosis, or emotional dependency symptoms in conversation.
- Chatbots may reinforce delusional thoughts or provide inadequate support, sometimes leading to tragic outcomes.
Why Are People Turning to AI in Crisis?
AI chatbots are available 24/7, feel private, and can seem to offer instant, nonjudgmental support when human options feel out of reach. Some hope these tools will listen, offer advice, and supplement or even replace therapy in moments of acute distress.
Common reasons people turn to AI for therapy include:
- Difficulty accessing timely professional help due to cost, stigma, or waitlists.
- Feeling isolated or too overwhelmed to reach out to friends, family, or crisis lines.
- Believing AI will provide quick answers or “just listen” without judgment or bias.
The Dark Side: Ethical Risks and Limitations
Emerging research, along with recent tragedies, shows that AI chatbots have serious limitations that can result in ethical failures and real harm. A landmark Brown University study found that popular therapy chatbots routinely violate the professional standards set by organizations like the American Psychological Association.
Ethical Concerns Identified in AI Mental Health Tools:
- Failure to adapt responses to individual context, offering “one-size-fits-all” advice instead.
- Poor collaboration, sometimes reinforcing users’ false or negative beliefs instead of challenging them productively.
- Deceptive empathy: Using phrases like “I understand” without genuine understanding.
- Discrimination or bias based on cultural, religious, or gender differences.
- Denial of service or indifferent responses to explicit crisis scenarios: users in danger may be ignored or given robotic, unhelpful replies.
Human therapists can be held accountable for mistakes, but no clear regulatory framework exists for chatbots, even those marketed as mental health support tools.
What the Science Says About Using ChatGPT as Your Therapist
A comprehensive review published in Health Science Reports found that AI tools like ChatGPT can provide misleading, erroneous, or biased information and may fail to recognize subtle cues of emotional distress or suicidal intent. Heavy reliance on chatbots can also deepen social isolation, a known risk factor for depression and suicidal thoughts.
Important findings include:
- Overuse or dependence on chatbots may worsen mental health, especially for those prone to depression, isolation, or cognitive distortions.
- AI cannot “see” nonverbal signals or adapt dynamically to a user’s lived experience.
- Chatbots risk creating “information cocoons,” reinforcing users’ existing beliefs and narrowing their perspective.
- Adolescents and other vulnerable individuals face additional privacy and data risks, and AI cannot personalize care or risk assessments the way human professionals can.
An STL Therapist Weighs In on ChatGPT for Mental Health
Stephanie Korpal, expert therapist and owner of Marble Wellness in St. Louis, shares her firsthand insights from interacting with ChatGPT as both a business owner and mental health provider:
“Over my years using ChatGPT, I’ve come to truly understand how much depends on the prompts you give it—the way you ask your questions shapes the help you get. So it’s really hard to imagine someone in deep distress being able to use ChatGPT effectively for resolving that distress.
If you’ve just had a really good session with your therapist, sometimes AI can help you reflect or go deeper on a specific issue. But as a substitute for a psychiatrist or trained mental health professional—someone who knows you, sees you, and can read your nonverbal cues—ChatGPT absolutely isn’t a replacement.
The human mind is incredibly complex, and even with all the advances in science and technology, we know only a fraction about how it works. In moments of crisis, when thoughts race and emotions feel overwhelming, people need the safety and compassion of a direct human connection. That connection can’t be replicated by a chatbot, no matter how sophisticated.
Even if we leave aside the issue of human interaction and focus just on ‘chat,’ there’s the critical issue of the prompt. When people are overwhelmed or in crisis, they often aren’t able to translate their feelings into concise, clear requests. AI can only help if the prompt is ‘just right,’ but expecting someone who is suffering to craft that kind of prompt is unrealistic—and unfair. So many people turn to chatbots when suffering, but if what they put in doesn’t get the help they need back, it just adds to the frustration.
Ultimately, nothing replaces human connection in crisis. ChatGPT might be another tool for well-being, but for acute needs, trusted professional care is always best.”
AI Chatbots Are Not a Substitute for Real Therapy
It’s crucial to stress that AI chatbots cannot replace professional therapy, psychiatric care, or genuine support networks. OpenAI has updated its technology in response to these risks, guiding users to crisis lines and reducing unsafe responses, but experts warn this is not enough. Even with improvements, the system still fails in a share of cases, meaning tens of thousands of people remain exposed to possible harm each week.
If you or someone you know is struggling with thoughts of self-harm, call or text 988 for the Suicide & Crisis Lifeline, or reach out to Marble Wellness for personalized support.
Steps to Protect Yourself and Others
When facing a mental health crisis—or supporting someone who is—consider these actionable tips:
- Rely on professional care: Therapists, psychiatrists, counselors, and crisis lines offer expertise and empathy unavailable from chatbots.
- Seek authentic support: Talking to a trusted friend, family member, or professional is always superior to AI, especially in urgent moments.
- Limit dependence on AI tools: Use them for reminders, journaling, or general guidance, and only after consulting a professional.
- Educate yourself on AI’s limitations: Understand that chatbots may reinforce negative thoughts and are not equipped for crisis management.
- Prioritize human connection: Isolation increases risk—engage in community, support groups, or therapy sessions for robust support.
- Monitor privacy and data: AI chat tools may collect sensitive personal information. Make sure a platform follows secure practices, and don’t share details you wouldn’t want made public.
Practical, Real-Life Examples
Imagine a busy parent in Ballwin who uses ChatGPT to discuss stress and anxiety after a tough week. The chatbot’s suggestions might help them reflect, journal, or try basic coping strategies, but it cannot recognize nuanced changes in mood, monitor safety, or offer the understanding that comes from face-to-face therapy.
Or picture a young adult in Kirkwood struggling with feelings of worthlessness and isolation. When they open a chat with AI, they receive “advice” but no assurance that anyone truly understands the depth of their struggle. Reaching out to Marble Wellness or calling a local crisis line brings real human care and a path toward hope.
How Marble Wellness Can Help
At Marble Wellness, therapists like Stephanie Korpal specialize in compassionate, solution-focused support tailored to families, teens, and adults throughout St. Louis and the surrounding counties. Whether you’re searching for ongoing counseling or facing an acute challenge, Marble Wellness offers:
- In-person and online therapy options
- Evidence-based therapeutic interventions, including individual, couples, & group support
- Strategies for parents, young adults, and families to build resilience together
- Safety planning and connection to community resources
Schedule a free consult to discover therapy options designed for your unique needs.
Take the Next Steps for Real Therapist Support
Mental health emergencies require skilled, human care. If you or a loved one is in crisis, please contact a professional immediately; AI is not a replacement. Turn to trusted resources like Marble Wellness or the national Suicide & Crisis Lifeline (988), and share this article with anyone who may benefit. Remember, compassion, connection, and expertise are always within reach.
Start Expert Therapy in the St. Louis Area
If you live in the St. Louis metro area and are ready to improve your mental health, our expert St. Louis therapists are here to help. Not only do we have a team of therapists in Ballwin, MO, but we have also recently expanded to serve the Lake St. Louis and Wentzville area! Reach out to our Client Care Coordinator today to discuss your therapy options, both in-person and via online therapy in Missouri.
Contact Us!
Learn About Our Group Offerings

Additional Counseling Services at Marble Wellness in St. Louis, MO
Marble Wellness Counseling services are designed to help set you on a path of living a more fulfilled, calm, and happy life. Our St. Louis area therapists have a variety of training backgrounds and areas of expertise. We have child and play therapists, therapists for teens, EMDR therapists, men’s mental health experts, couples therapists, and more! We specialize in anxiety, depression, grief, chronic illness, trauma & PTSD, life transitions, and maternal overwhelm. Our practice also specifically helps new moms with various postpartum concerns, moms in the thick of parenting, and moms with teens. We can also chat from wherever you are in the state with online therapy in Missouri. No matter where you are in your journey, we are here to help you thrive!


