ChatGPT Diet Advice Sends Man to Hospital
A man who sought to reduce his sodium intake has fallen seriously ill after following ChatGPT’s dietary suggestion to replace table salt (sodium chloride) with sodium bromide — a compound once used in early 20th-century medicines but now known to be harmful in high doses.
Without consulting a doctor, the man used sodium bromide in his cooking for three months. He later developed alarming symptoms, including hallucinations, fear, excessive thirst, and skin lesions. Doctors diagnosed him with bromism — a rare condition caused by excessive bromide in the body.
The patient, who had no prior health issues, deteriorated to the point of psychosis and required emergency psychiatric care. During a three-week hospital stay he underwent intensive treatment with fluids and electrolytes, and he has since recovered.
Medical experts have stressed that while AI tools like ChatGPT can be helpful for general information, they cannot replace professional medical guidance. OpenAI, the developer of ChatGPT, clearly states in its Terms of Use that the service is not intended for diagnosing or treating medical conditions.
This case has reignited debate over the ethical limits of AI in healthcare.
Separately, OpenAI CEO Sam Altman recently warned against relying on ChatGPT for emotional support or counselling, noting that such conversations are not legally protected like those with licensed therapists, doctors, or lawyers.

Mutib Khalid is a skilled content writer and digital marketer with a knack for crafting compelling narratives and optimizing digital strategies. He excels at creating engaging content that drives results and enhances online presence. Passionate about blending creativity with data-driven approaches, Mutib Khalid helps brands connect with their audience and achieve their goals.

