A 60-year-old man from New York was hospitalized for three weeks after following incorrect medical advice from ChatGPT. Experts have long cautioned against relying on AI chatbots for medical guidance, emphasizing that the technology is not yet developed enough to replace doctors.

The man had asked ChatGPT how to eliminate salt (sodium chloride) from his diet, and the AI suggested sodium bromide as a substitute. Without consulting a doctor, he purchased sodium bromide online and used it in place of table salt for three months, with serious health consequences.

He developed severe symptoms, including paranoia-like fear, confusion, excessive thirst, and mental disorientation. Medical tests confirmed bromide toxicity. Doctors worked to restore his electrolyte balance, and he was discharged once his sodium and chloride levels returned to normal.