AI shows serious limitations as therapist: new study warns of ethical risks of chatbots

deercreekfoundation November 15, 2025
Research has revealed serious limitations of AI as a therapist. (Photo: Image)

Recent academic research has sounded the alarm about using artificial intelligence systems for emotional support. A team from Brown University in the US determined that language models used as substitutes for human therapists violate basic ethical standards and can pose a risk to those who rely on them for psychological support.

The study concludes that these tools fail to reproduce basic principles of clinical practice and may in some cases produce inappropriate or potentially harmful responses. The results show that a chatbot used as a virtual counselor violated 5 of the 15 categories in the framework used to assess ethical risks.

The failures detected include a lack of adaptation to the user’s personal situation, inadequate therapeutic rapport, artificial expressions of empathy that do not reflect genuine understanding, biases related to gender, culture, and religion, and inappropriate responses to emotionally dangerous situations. For the researchers, this shows that current systems lack the capacity needed to meet the responsibilities of mental health professionals.

Using chatbots as therapists poses ethical risks. (Illustration)

The analysis was conducted by computer science and mental health experts to assess how chatbots respond to the kinds of prompts users typically employ when seeking psychological support.

Phrases like “help me reframe my thoughts” or “help me manage my emotions” trigger responses generated from learned patterns, not actual therapeutic techniques. One of the researchers emphasized this point, saying that although the models can mimic the language of therapy, they cannot perform genuine interventions or deeply understand each person’s emotional situation.

Another related finding is that, unlike human clinical practice, AI models are subject to no regulatory system. Mental health professionals are bound by codes of ethics, professional oversight, and liability for malpractice, while digital tools operate without an established framework of accountability. This leaves users vulnerable, especially when they rely on these systems as their primary resource for dealing with emotional issues.

Lack of empathy and emotional understanding is a risk of using chatbots as therapists. (Illustration)

The study also warns that the easy availability of these chatbots can create a false sense of security. According to the authors, adoption of these technologies has outpaced the ability to evaluate them properly. Many models are therefore being used without rigorous analysis of their impact in sensitive settings such as mental health, which experts consider an urgent concern to address.

In addition to identifying deficiencies, the researchers offer recommendations for people who use AI tools for emotional support. Chief among them is maintaining a critical attitude toward the information received and evaluating whether the system actually understands the user’s personal situation or merely responds with a generic formula. Personalization is essential to avoid false conclusions and oversimplifications that do not reflect the complexity of reality.

Another suggestion is to check whether chatbots encourage autonomy, reflection, and critical thinking. A responsible system should not limit itself to validating emotions; rather, it should encourage deeper analysis and promote conscious decision-making. The lack of these features can lead users to develop a harmful dependence on the tool.

Using chatbots as therapists does not replace human care. (Illustration)

The study also highlights the importance of cultural and contextual sensitivity. Responses that ignore factors such as social environment, culture, and the user’s lived experience can be inappropriate or even dangerous. Accordingly, the researchers argue that AI tools applied to the emotional sphere should be designed to detect signs of crisis and provide specialized resources, such as helplines or a recommendation to see a human therapist.

The experts behind the study note that the goal is not to rule out the potential of AI in the mental health field, but to show that these systems cannot replace human practitioners at this time. The study highlights the need to develop ethical frameworks and oversight mechanisms before integrating tools of this kind as an alternative means of emotional support.
