Following ChatGPT’s dietary advice is not always the best idea, especially when it suggests consuming a chemical compound that is toxic to humans. Such a case was reported in the medical journal Annals of Internal Medicine, involving a 60-year-old man who turned to artificial intelligence to help him eliminate salt from his diet.
When consulted, the chatbot recommended that he replace sodium chloride, the chemical compound in table salt, with sodium bromide: a substance that resembles salt but is used mainly in cleaning, manufacturing, and agriculture, and is toxic for humans to ingest.
The man was later hospitalised with symptoms including fatigue, insomnia, lack of coordination, facial acne, red skin bumps, excessive thirst, hallucinations, and paranoia. At the hospital, he claimed his neighbour was trying to poison him.
In reality, he had been unknowingly poisoning himself by using sodium bromide in his meals for three months. His condition worsened to the point that he required admission to a psychiatric unit after attempting to escape.
He was treated with intravenous fluids, electrolytes, and antipsychotic medication. After three weeks of monitoring, he was discharged, having narrowly avoided what could have been a fatal poisoning.
The medical journal highlighted this case to warn about the potential health risks of relying on artificial intelligence for medical or dietary guidance, especially when such information is not verified by a professional.