Saturday, August 16, 2025

AI Advice Gone Wrong: 60-Year-Old Man Hospitalized Following ChatGPT Recommendations

A 60-year-old man was hospitalized for three weeks after replacing table salt with sodium bromide, a substitution he said was influenced by ChatGPT's advice. According to a recent report in Annals of Internal Medicine by University of Washington physicians, the man arrived at the hospital displaying paranoia, convinced that his neighbor was poisoning him. Lab tests confirmed elevated bromide levels, which led to hallucinations and a subsequent psychiatric hold.

He had begun the "personal experiment" for health reasons after consulting ChatGPT over a three-month period. Although OpenAI states that its chatbot is not intended to provide medical advice, it reportedly suggested that bromide could replace chloride without adequate health warnings.

Bromide toxicity was common in the early 20th century, when bromide was an ingredient in many medications; today it is used mainly in veterinary medicine. This rare syndrome has resurfaced with the increased online availability of bromide products, underscoring the importance of seeking professional medical guidance before making health-related changes.
