Experts Warn of Risks as ChatGPT Personalizes Mental Health Advice

In a recent interview with a private news channel, mental health expert Kanwal Cheema spoke about the growing use of AI chatbots like ChatGPT for emotional support.

Cheema explained that in the past, people seeking help for depression or grief would turn to articles or YouTube videos offering guidance on coping with the loss of a loved one, a job, or even a divorce.

“ChatGPT changes that experience,” she said. “It addresses users directly with words like ‘Dear’ and provides responses tailored to each person, pulling information from across the internet.”

While acknowledging the tool’s power, Cheema raised concerns about its accuracy and impact. “With the rise of loneliness, more people are relying on ChatGPT for personalized responses. But the question remains: Is the information helpful—or even safe?”

She cited a troubling case in California where a couple sued OpenAI after their 16-year-old son took his own life, alleging that ChatGPT had encouraged him.

“All chatbots, including ChatGPT, try to be agreeable,” Cheema noted. “If you tell it you like nature, it will highlight the benefits of plants and forests. But if you say you’re afraid of forests, it will focus on potential dangers like snakes or wild animals.”

Cheema warned that these AI tools can create narratives based on what users want to hear, potentially reinforcing harmful thought patterns. She also highlighted the problem of confirmation bias, where users are exposed only to information that supports their existing beliefs, rather than perspectives that challenge them.

“People need to be cautious,” Cheema concluded. “Relying too heavily on AI for emotional guidance can be risky, especially without proper human support.”
