News
A 60-year-old man ended up on psychiatric hold after accidentally poisoning himself by misunderstanding a ChatGPT response ...
"The consequences can be severe, including involuntary psychiatric holds, fractured relationships and in tragic cases, ...
After having “auditory and visual hallucinations,” the man tried to escape the hospital, forcing the staff to place him on ...
A husband and father was developing a philosophical belief system with ChatGPT that took over his life — and he found himself ...
Psychiatrist Dr. Sakata connects psychosis cases to AI interactions, citing 12 hospitalizations in 2025. He stresses the risk ...
A 60-year-old man was hospitalized with toxicity and severe psychiatric symptoms after asking ChatGPT for tips on how to ...
While most people can use chatbots without issue, experts say a small group of users may be especially vulnerable to ...
As AI chatbots become increasingly sophisticated and lifelike, a troubling phenomenon has emerged: reports of psychosis-like symptoms triggered by intense and prolonged interactions with ...
As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and ...
A 60-year-old man developed a rare medical condition after ChatGPT advised him on alternatives to table salt, according to a ...
A 60-year-old man who turned to ChatGPT for advice removed salt from his diet and consumed a substance that gave him a neuropsychiatric illness called bromism.
As AI chatbots increasingly become part of our everyday lives, recognising the potential risks of obsessive engagement is ...