Journal warns against using ChatGPT for health after a man develops rare condition

US medical experts warn against using ChatGPT for health advice after a 60-year-old man developed bromism following the chatbot’s dietary guidance.
The patient developed bromide toxicity after consulting the AI app about eliminating table salt from his diet and subsequently taking sodium bromide for three months.
Bromism, a “well-recognised” syndrome in the early 20th century, was thought to have contributed to almost one in 10 psychiatric admissions at the time.
Researchers from the University of Washington in Seattle documented the case, noting the patient had read about the negative effects of sodium chloride before seeking ChatGPT’s advice about eliminating chloride from his diet.
The man told doctors he began taking sodium bromide despite having read that chloride "can be swapped with bromide, though likely for other purposes, such as cleaning."
Sodium bromide was used as a sedative in the early 20th century.
The authors said the case highlighted “how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.”
As they could not access the patient’s ChatGPT conversation log, the researchers said they could not determine the exact advice given.
When the authors tested ChatGPT themselves with a question about chloride replacements, the response also included bromide, gave no specific health warning and did not ask why the information was being sought, "as we presume a medical professional would do," they wrote.
They warned that ChatGPT and other AI apps could “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”
OpenAI announced an upgrade to the chatbot last week, claiming one of its biggest strengths was in health.
It said ChatGPT – now powered by the GPT-5 model – would be better at answering health-related questions and more proactively flag “potential concerns” such as serious physical or mental illness.
However, OpenAI emphasised the chatbot is not a replacement for professional help.
OpenAI's guidelines state the chatbot is not "intended for use in the diagnosis or treatment of any health condition."
The journal article, published last week before the GPT-5 launch, said the patient appeared to have used an earlier version of ChatGPT.
While acknowledging that AI could be a bridge between scientists and the public, the authors said the technology also carried the risk of promoting "decontextualised information". It was highly unlikely, they added, that a medical professional would have suggested sodium bromide to a patient asking for a replacement for table salt.
As a result, they said, doctors would need to consider the use of AI when checking where patients obtained their information.
The man presented at hospital claiming his neighbour might be poisoning him. He also said he had multiple dietary restrictions.
Despite being thirsty, he was noted as being paranoid about the water he was offered.
He tried to leave the hospital within 24 hours of admission and, after being sectioned, was treated for psychosis.
Once the patient stabilised, he reported other symptoms that indicated bromism, such as facial acne, excessive thirst and insomnia.