When Your Therapist Is a Bot: Why Gen Z Is Turning to AI for Mental Health Help

As young people increasingly lean on AI chatbots like ChatGPT for emotional support, experts warn of dependency, privacy risks, and the illusion of real connection.

By arsalan ahmad · Published 2 months ago · 3 min read

I. A New Kind of Confidant

Mental health care is changing — fast.

More young people are turning to AI chatbots like ChatGPT, Gemini, or Wysa to vent, seek advice, or just talk through their emotions. For some, it's a lifeline when no one else is available.

In a recent national survey of U.S. teens and young adults (ages 12–21), 13.1% reported using generative AI for mental health advice when feeling sad, angry, or nervous (AJMC).

That’s not a small number — it represents millions of people relying on algorithms for emotional support.

II. Why Chatbots Are So Appealing

There are real reasons why this trend is growing:

Accessibility: AI is available 24/7, no appointments needed (ASU News; Global Wellness Institute).

Affordability: For many, it's far cheaper than traditional therapy (ASU News).

Stigma-free space: Confiding in a bot feels safer for people who worry they’ll be judged.

Research also points to emotional benefits: a study in the Journal of Medical Internet Research found that social chatbots helped reduce loneliness by ~15% and social anxiety by ~18% (Medical Xpress).

For many users, these bots feel like understanding, non-judgmental companions — something deeply valuable in an increasingly isolated world.

III. The Risks: When Comfort Becomes Dependence

But it’s not all rosy. Experts are raising serious red flags.

Lack of clinical oversight: AI chatbots aren’t licensed therapists. There’s no guarantee of diagnostic accuracy, and they may miss warning signs of severe mental illness (ASU News).

Privacy issues: Conversations with chatbots may be stored or analyzed. Not everything is confidential (Reddit).

Unregulated emotional influence: Academic researchers warn about “feedback loops” — where emotionally vulnerable users develop a risky dependence on agreeable AI responses (arXiv).

AI “psychosis”: In extreme cases, people have reported developing delusional thinking or emotional fixation through long-term interaction with chatbots (WIRED; arXiv).

One Reddit user shared a stark experience:

“ChatGPT made me psychotic … It fed into my delusions … echoed back that I was basically a genius … I used it a lot … it was super unhealthy.”

Experiences like this aren’t just hypothetical: the emotional risks are real.

IV. The Science-Backed Potential

Despite the risks, the potential for AI in mental health isn’t empty hype.

A randomized controlled trial tested a customized generative AI chatbot, “Therabot,” on people with depression and anxiety — and the early results were encouraging (Forbes).

Another study explored “self-clone” chatbots — versions of the user’s own voice/personality — and found they boosted emotional and cognitive engagement more than generic bots (arXiv).

For people with social anxiety, some AI chatbots provide a judgment-free practice ground for conversations, offering empathy and consistency (arXiv).

V. What Experts Recommend

Psychologists and AI ethicists are calling for safeguards:

Use AI assistants as adjuncts, not replacements, for human therapy (ASU News).

Limit how much emotionally vulnerable people rely on chatbots, especially if they have serious mental health conditions.

Increase transparency: users should know how their data is used and stored.

Regulate “AI therapists” rigorously — with licensing, research, and safety protocols (Global Wellness Institute).

VI. The Takeaway: A Tool, Not a Cure

AI chatbots offering mental health support are not inherently good or bad. They are tools — powerful tools, with real ability to help, but also real risks.

For some, these AI companions are a stopgap, a supplement, or a gateway to further help. For others, they may foster unhealthy dependence or distort emotional reality.

If you’re considering using an AI chatbot for mental health:

Treat it like a support tool, not a replacement for therapy.

Be mindful of how much emotional energy you invest in it.

Always have a safety plan: trusted friends, a therapist, crisis lines.

The rise of AI mental health support is one of the most human stories of our time: we’re building technology to heal loneliness, and in doing so we must take care not to lose our humanity in the very connection we’re creating.
