News

A 60-year-old man ended up on involuntary psychiatric hold after accidentally poisoning himself by misunderstanding a ChatGPT ...
Mental health experts are continuing to sound alarm bells about AI psychosis. On Monday, University of California, San ...
"The consequences can be severe, including involuntary psychiatric holds, fractured relationships and in tragic cases, ...
As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and ...
After the escape attempt, the man was placed on an involuntary psychiatric hold and given an anti-psychosis drug. He was administered ...
A 60-year-old man gave himself an uncommon psychiatric disorder after asking ChatGPT for diet advice, in a case published ...
Psychiatrist Dr. Sakata connects psychosis cases to AI interactions, citing 12 hospitalisations in 2025. He stresses the risk ...
After having “auditory and visual hallucinations,” the man tried to escape the hospital, forcing the staff to place him on ...
A husband and father was developing a philosophical belief system with ChatGPT that took over his life — and he found himself ...
While most people can use chatbots without issue, experts say a small group of users may be especially vulnerable to ...
As AI chatbots become increasingly sophisticated and lifelike, a troubling phenomenon has emerged: reports of psychosis-like symptoms triggered by intense and prolonged interactions with ...
A 60-year-old man who turned to ChatGPT for advice replaced the salt in his diet and consumed a substance that gave him a neuropsychiatric illness called bromism.