Senior Staff Writer Anna Mammarelli.

Virtual validation: The rise of emotional dependency on AI counseling

Excessive reliance on artificial intelligence (AI) for mental health support can significantly impede a person's ability to regulate their own emotions.


According to Psychology Today, 22% of American adults use AI for therapeutic purposes. AI-generated therapy services are appealing because they are highly accessible, affordable, and require less vulnerability. Common formats include conversational chatbots that conduct real-time conversations, as well as skill-building, self-directed therapeutic exercises, and mood tracking apps. 

Nevertheless, these appeals come at a cost. Non-human therapy reduces human error, such as mistakes in billing or notetaking. However, the absence of human connection can produce false reassurance or delay users from receiving adequate help. Furthermore, without a human to vet credibility and accuracy, misinformation and unsafe advice are real possibilities.

The National Eating Disorders Association (NEDA) disabled its chatbot, Tessa, after it recommended intense calorie restriction as a weight loss strategy to users with eating disorders.

Data collection is necessary to accommodate users’ needs and offer personalized solutions, yet it also presents privacy risks. AI experts caution that user data is not guaranteed to be properly stored or protected, and they advise against sharing personal or sensitive information with AI services.

Ironically, this limits the expansiveness of AI’s therapeutic capabilities because this information is required to form a comprehensive mental health evaluation.

In 2023, BetterHelp, an online therapy platform, was found to have shared consumers’ sensitive mental health data with Facebook and Snapchat for advertising purposes. The company agreed to pay $7.8 million to settle charges that it had violated the privacy users were promised for that data.

“[The service] is free and can be a helpful alternative for people who may not be able to access a therapist. However, it can be risky because AI may show ‘sycophancy,’ meaning it tends to agree with users even when they’re wrong, and experts warn it focuses more on engagement than long-term well-being. While AI chatbots can create a therapeutic-like space, especially for men less likely to seek therapy, strong safeguards are essential to prevent these issues from continuing to occur,” junior political science major Nickolas Proden said to the FSView.

Advanced services are not required to achieve acute personalization; even baseline models accumulate user data with each entry, allowing responses to resonate deeply when users ask for advice.

A 2025 study conducted by Fan Yang and Atsushi Oshio observed significant attachment anxiety toward AI, driven by participants’ need for emotional reassurance and their fear of receiving inadequate responses. This dependency is likely to strengthen as AI systems grow more capable and more powerful.

OpenAI estimates that roughly 0.15% of weekly active users exhibit signs of emotional attachment. To combat the potential harm of this dependency, OpenAI collaborated with mental health experts to roll out updated safeguards, including safety tooling, expanded guidance, detection of mental health crises, and encouragement of human connection over artificial ones when necessary.

“I think using therapy in AI can be beneficial, but not as a replacement for therapy. Learning about different symptoms someone is experiencing, or possible treatments, can be useful with AI, but AI isn’t a human and can’t comprehend the human experience or emotions,” sophomore behavioral neuroscience major Abby Pyle said to the FSView.

AI can be helpful for convenient reassurance and grounding in moments of stress or overthinking. However, relying on instant feedback while shielding oneself from human intimacy threatens mental health and well-being.

FSU students are encouraged to access university-provided counseling services or seek out additional resources when their emotional needs exceed what AI can offer. AI works best when treated as a supplement to therapy, not a substitute for it.

Anna Mammarelli is a Media and Communications major at Florida State University and a Staff Writer for the Views section of the FSView & Florida Flambeau, the student-run, independent online news service for the FSU community. Email our staff at  contact@fsview.com. 

This article originally appeared on FSU News: Virtual validation: The rise of emotional dependency on AI counseling

Reporting by Anna Mammarelli, Senior Staff Writer, FSView / FSU News

USA TODAY Network via Reuters Connect


