
Your therapist should be asking how you're using AI, some experts say

Just like sleep or screen time, AI use could be affecting your mental health, and clinicians might need to start treating it that way

Zwely News Staff

Shared Newsroom

April 10, 2026, 10:19 AM · 3 min read

At a glance

What matters most

  • Experts say therapists should ask patients about their AI use, just as they would about sleep, alcohol, or social media
  • AI chatbots are increasingly used for emotional support, advice, and even therapy-like conversations, sometimes without professional oversight
  • Some users form strong attachments to AI companions, which could affect real-world relationships and mental health treatment
  • The conversation isn't about fearmongering; it's about understanding a new part of modern life that's already shaping how people think and feel

Across the spectrum

What people are saying

A quick look at how the same story is being framed from different angles.

On the Left

Mental health providers have a duty to understand all the tools shaping their patients' emotional worlds. With AI increasingly used for emotional support, especially by young people and marginalized communities who face barriers to care, ignoring it risks missing key parts of a person's mental health picture. Asking about AI use is part of providing equitable, informed care.

In the Center

AI is becoming a common part of daily life, and like any habit that affects mood or behavior, it's worth discussing in therapy when relevant. The goal isn't to overmedicalize AI use, but to treat it like other behavioral factors (screen time, internet use, or self-help reading) that clinicians already consider in context.

On the Right

Therapists should focus on proven treatments and personal responsibility, not chase every new tech trend. While AI use might matter in extreme cases, making it a standard question could medicalize normal behavior and expand therapy into areas better handled by parenting, discipline, or personal discernment.

Full coverage

What you should know

When you sit down with your therapist, you might expect questions about your sleep, your relationships, or how you've been coping with stress. But there's a new topic some experts think should join that list: your use of artificial intelligence.

In a recent paper published in JAMA Psychiatry, mental health researchers argue that clinicians should routinely ask patients how, and how much, they're interacting with AI chatbots like ChatGPT, Replika, or other AI companions. The idea isn't to alarm anyone, but to recognize that these tools are becoming part of people's emotional landscapes in meaningful ways.

More people are turning to AI for conversation, comfort, and even crisis support. Some use chatbots to rehearse difficult talks. Others lean on AI companions during loneliness or depression. A few have even reported breaking down in tears when an AI relationship ended. These interactions aren't always harmless venting; they can shape beliefs, reinforce habits, or delay seeking real help.

That's why asking about AI use could be as routine as asking about screen time or alcohol consumption. Just as therapists once had to learn to talk about social media, they may now need to understand how patients relate to machines that mimic empathy.

Some worry that bringing AI into therapy could pathologize normal behavior. But the goal isn't to judge; it's to understand. If a patient is relying on an AI for daily emotional support, that's useful context. It might explain why they're hesitant to open up in therapy, or why certain topics feel off-limits.

There's also a practical side: AI can give bad advice. It might encourage risky behaviors, misdiagnose symptoms, or echo harmful beliefs. A therapist who knows a patient trusts an AI voice could help them think critically about what they're hearing.

This isn't about replacing human care with machines. It's about acknowledging that AI is already woven into how people manage their inner lives, and that therapists, to be effective, may need to talk about it too.

About this author

Zwely News Staff compiles multi-source reporting into concise, viewpoint-aware coverage for readers who want context without noise.

Source Notes

Center NPR Apr 10, 9:30 AM

'How are you using AI?' Your therapist should ask you that question, experts argue

A paper in JAMA Psychiatry says mental health providers should ask if patients are using artificial intelligence chatbots, just as they would ask patients about sleep habits and substance use.

Left Vox Apr 10, 6:30 AM

Why you should keep your therapy session even when you don’t have anything to talk about

Most weeks when I meet with my therapist, she triages some aspect of my life that is actively bursting at the seams — my inability to rationally talk about politics, for example, or the state of my personal finances. But, every so often, li...
