News
With more people turning to artificial intelligence chatbots for emotional support, mental health experts are sounding the ...
A psychiatrist says he's not against clients using ChatGPT. But it can "supercharge" people's vulnerabilities, leading to "AI ...
In the recent case, one patient who was allegedly following the generative AI’s nutritional suggestion was placed in ...
A 60-year-old man ended up on involuntary psychiatric hold after accidentally poisoning himself by misunderstanding a ChatGPT ...
Mental health experts are continuing to sound alarm bells about AI psychosis. On Monday, University of California, San ...
The consequences can be severe, including involuntary psychiatric holds, fractured relationships and in tragic cases, ...
Psychiatrist Dr. Sakata connects psychosis cases to AI interactions, citing 12 hospitalisations in 2025. He stresses the risk ...
While most people can use chatbots without issue, experts say a small group of users may be especially vulnerable to ...
As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and ...
Avid chatbot users are coming forward with stories about how, after a period of intense use, they developed psychosis. The altered mental state, in which people lose touch with reality, often includes ...
A husband and father developed a philosophical belief system with ChatGPT that took over his life — he found himself in an AI spiral.
A 60-year-old man who turned to ChatGPT for advice removed salt from his diet and consumed a substitute that gave him a neuropsychiatric illness called bromism.