A US medical journal has warned against using ChatGPT for health information after a 60-year-old man developed bromism (bromide toxicity) on the chatbot's advice. In a case report in the Annals of Internal Medicine, the authors described a patient who, seeking to eliminate table salt from his diet, replaced it with sodium bromide after consulting ChatGPT.

Sodium bromide was a common sedative in the early 20th century, but the authors stressed the dangers of AI-generated health advice, warning that it can spread misinformation and scientific inaccuracies. Unlike a healthcare professional, they noted, ChatGPT cannot critically discuss a result or ask the relevant follow-up questions a clinician would.

Despite recent upgrades to ChatGPT's handling of health queries, the authors made clear that AI should not replace professional medical advice. The case, they argued, underscores the need for healthcare providers to ask patients where they get their information, so that preventable harms arising from AI use can be headed off.
