Truth Republican

Healthy Tips

ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning

By Buddy Doyle | August 13, 2025 | 5 Mins Read

A man who turned to ChatGPT for dietary advice poisoned himself and wound up in the hospital.

The 60-year-old man, who was looking to eliminate table salt from his diet for health reasons, used the large language model (LLM) to get suggestions for what to replace it with, according to a case study published this week in the Annals of Internal Medicine.

When ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man made the replacement for three months. The journal article noted that the recommendation likely referred to sodium bromide's use for other purposes, such as cleaning.

Sodium bromide is a chemical compound that resembles table salt but is toxic for humans to consume.

It was once used as an anticonvulsant and sedative, but today is primarily used for cleaning, manufacturing and agricultural purposes, according to the National Institutes of Health.

When the man arrived at the hospital, he reported experiencing fatigue, insomnia, poor coordination, facial acne, cherry angiomas (red bumps on the skin) and excessive thirst — all symptoms of bromism, a condition caused by long-term exposure to sodium bromide.

The man also showed signs of paranoia, the case study noted, as he claimed that his neighbor was trying to poison him.

He was also found to have auditory and visual hallucinations, and was ultimately placed on a psychiatric hold after attempting to escape. 

The man was treated with intravenous fluids and electrolytes, and was also put on anti-psychotic medication. He was released from the hospital after three weeks of monitoring.

“This case also highlights how the use of artificial intelligence (AI) can potentially contribute to the development of preventable adverse health outcomes,” the researchers wrote in the case study.

“Unfortunately, we do not have access to his ChatGPT conversation log and we will never be able to know with certainty what exactly the output he received was, since individual responses are unique and build from previous inputs.”

It is “highly unlikely” that a human doctor would have mentioned sodium bromide when speaking with a patient seeking a substitute for sodium chloride, they noted.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results and ultimately fuel the spread of misinformation,” the researchers concluded.

Dr. Jacob Glanville, CEO of Centivax, a San Francisco biotechnology company, emphasized that people should not use ChatGPT as a substitute for a doctor.

“These are language prediction tools — they lack common sense and will give rise to terrible results if the human user does not apply their own common sense when deciding what to ask these systems and whether to heed their recommendations,” Glanville, who was not involved in the case study, told Fox News Digital. 

“This is a classic example of the problem: The system essentially went, ‘You want a salt alternative? Sodium bromide is often listed as a replacement for sodium chloride in chemistry reactions, so therefore it’s the highest-scoring replacement here.’”

Dr. Harvey Castro, a board-certified emergency medicine physician and national speaker on artificial intelligence based in Dallas, agreed that AI is a tool, not a doctor.

“Large language models generate text by predicting the most statistically likely sequence of words, not by fact-checking,” he told Fox News Digital.

“ChatGPT’s bromide blunder shows why context is king in health advice,” Castro went on. “AI is not a replacement for professional medical judgment, aligning with OpenAI’s disclaimers.”

Castro also cautioned that there is a “regulation gap” when it comes to using LLMs to get medical information.

“FDA bans on bromide don’t extend to AI advice — global health AI oversight remains undefined,” he said.

There is also the risk that LLMs reflect biases in their training data and lack verification mechanisms, which can lead to hallucinated information.

“If training data includes outdated, rare or chemically focused references, the model may surface them in inappropriate contexts, such as bromide as a salt substitute,” Castro noted.

“Also, current LLMs don’t have built-in cross-checking against up-to-date medical databases unless explicitly integrated.”

To prevent cases like this one, Castro called for more safeguards for LLMs, such as integrated medical knowledge bases, automated risk flags, contextual prompting and a combination of human and AI oversight.

The expert added, “With targeted safeguards, LLMs can evolve from risky generalists into safer, specialized tools; however, without regulation and oversight, rare cases like this will likely recur.”
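To illustrate what one of the safeguards Castro describes might look like, here is a minimal sketch of an "automated risk flag": a check that screens an LLM's dietary suggestion against a blocklist of substances that are unsafe to ingest. This is a hypothetical example; the function name, the blocklist entries, and the warning wording are all illustrative and not part of any deployed system.

```python
# Hypothetical "automated risk flag" safeguard: before an LLM's dietary
# suggestion reaches the user, scan it for substances known to be unsafe
# to ingest. The blocklist below is illustrative, not exhaustive.

UNSAFE_FOR_CONSUMPTION = {
    "sodium bromide": "toxic when ingested; industrial/cleaning use only",
    "ethylene glycol": "poisonous antifreeze ingredient",
}

def flag_dietary_suggestion(text: str) -> list[str]:
    """Return a warning for each blocklisted substance mentioned in text."""
    lowered = text.lower()
    return [
        f"WARNING: '{name}' is not safe to eat ({reason})"
        for name, reason in UNSAFE_FOR_CONSUMPTION.items()
        if name in lowered
    ]

# A suggestion like the one in this case would be flagged before display.
flags = flag_dietary_suggestion(
    "You could replace sodium chloride with sodium bromide."
)
```

A real implementation would rely on a curated toxicology database rather than a hand-written list, but the principle is the same: a deterministic check layered on top of the model's free-form output.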

OpenAI, the San Francisco-based maker of ChatGPT, provided the following statement to Fox News Digital.

“Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance.”
