ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning

By Buddy Doyle | August 13, 2025

A man who turned to ChatGPT for dietary advice poisoned himself and wound up in the hospital.

The 60-year-old man, who was looking to eliminate table salt from his diet for health reasons, used the large language model (LLM) to get suggestions for what to replace it with, according to a case study published this week in the Annals of Internal Medicine.

When ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man made the substitution for three months, although, the journal article noted, the recommendation likely referred to bromide's use for other purposes, such as cleaning.

Sodium bromide is a chemical compound that resembles table salt but is toxic for human consumption.

It was once used as an anticonvulsant and sedative, but today is primarily used for cleaning, manufacturing and agricultural purposes, according to the National Institutes of Health.

When the man arrived at the hospital, he reported experiencing fatigue, insomnia, poor coordination, facial acne, cherry angiomas (red bumps on the skin) and excessive thirst — all symptoms of bromism, a condition caused by long-term exposure to sodium bromide.

The man also showed signs of paranoia, the case study noted, as he claimed that his neighbor was trying to poison him.

He was also found to have auditory and visual hallucinations, and was ultimately placed on a psychiatric hold after attempting to escape. 

The man was treated with intravenous fluids and electrolytes, and was also put on antipsychotic medication. He was released from the hospital after three weeks of monitoring.

“This case also highlights how the use of artificial intelligence (AI) can potentially contribute to the development of preventable adverse health outcomes,” the researchers wrote in the case study.

“Unfortunately, we do not have access to his ChatGPT conversation log and we will never be able to know with certainty what exactly the output he received was, since individual responses are unique and build from previous inputs.”

It is “highly unlikely” that a human doctor would have mentioned sodium bromide when speaking with a patient seeking a substitute for sodium chloride, they noted.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results and ultimately fuel the spread of misinformation,” the researchers concluded.

Dr. Jacob Glanville, CEO of Centivax, a San Francisco biotechnology company, emphasized that people should not use ChatGPT as a substitute for a doctor.

“These are language prediction tools — they lack common sense and will give rise to terrible results if the human user does not apply their own common sense when deciding what to ask these systems and whether to heed their recommendations,” Glanville, who was not involved in the case study, told Fox News Digital. 

“This is a classic example of the problem: The system essentially went, ‘You want a salt alternative? Sodium bromide is often listed as a replacement for sodium chloride in chemistry reactions, so therefore it’s the highest-scoring replacement here.’”
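
The failure mode Glanville describes can be seen in a deliberately crude sketch. The "context vectors" and candidate list below are invented for illustration and stand in for a real model's learned representations; the point is only that a ranker scoring candidates by contextual similarity has no column for toxicity.

```python
from math import sqrt

# Invented "context vectors" standing in for learned embeddings; the
# dimensions are illustrative (ionic, crystalline, culinary, industrial).
CANDIDATES = {
    "sodium bromide":       [0.9, 0.9, 0.1, 0.8],  # chemically similar, not food-safe
    "potassium chloride":   [0.8, 0.9, 0.7, 0.2],  # an actual dietary salt substitute
    "monosodium glutamate": [0.3, 0.5, 0.9, 0.1],
}

# A query shaped by chemistry-heavy training text: "replace sodium chloride"
# reads as an ionic/industrial request rather than a culinary one.
QUERY = [0.9, 0.9, 0.1, 0.9]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Rank purely by contextual similarity; safety never enters the score.
ranked = sorted(((name, cosine(QUERY, vec)) for name, vec in CANDIDATES.items()),
                key=lambda kv: -kv[1])
for name, score in ranked:
    print(f"{name}: {score:.3f}")
# sodium bromide scores highest, the "highest-scoring replacement" Glanville describes.
```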

Dr. Harvey Castro, a Dallas-based, board-certified emergency medicine physician and national speaker on artificial intelligence, agreed that AI is a tool, not a doctor.

“Large language models generate text by predicting the most statistically likely sequence of words, not by fact-checking,” he told Fox News Digital.
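
Castro's point can be made concrete with a toy next-word predictor. The three-line "training corpus" below is invented; a real model is vastly larger and more sophisticated, but the core loop is the same: pick the statistically likeliest continuation, with no fact-checking step anywhere.

```python
from collections import Counter, defaultdict

# A tiny invented "training corpus", skewed toward chemistry text.
corpus = [
    "replace sodium chloride with sodium bromide in the reaction",
    "replace sodium chloride with sodium bromide for bromination",
    "replace sodium chloride with potassium chloride in some diets",
]

# Count which word follows each two-word context.
follows = defaultdict(Counter)
for line in corpus:
    w = line.split()
    for a, b, c in zip(w, w[1:], w[2:]):
        follows[(a, b)][c] += 1

# Greedily extend a prompt with the statistically likeliest next word.
# Nothing in this loop checks whether the output is true or safe to follow.
words = "replace sodium chloride with sodium".split()
for _ in range(4):
    context = (words[-2], words[-1])
    if context not in follows:
        break
    words.append(follows[context].most_common(1)[0][0])
print(" ".join(words))
# -> "replace sodium chloride with sodium bromide in the reaction"
```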

“ChatGPT’s bromide blunder shows why context is king in health advice,” Castro went on. “AI is not a replacement for professional medical judgment, aligning with OpenAI’s disclaimers.”

Castro also cautioned that there is a “regulation gap” when it comes to using LLMs to get medical information.

“FDA bans on bromide don’t extend to AI advice — global health AI oversight remains undefined,” he said.

There is also the risk that LLMs carry biases from their training data and lack verification mechanisms, which can lead to hallucinated information.

“If training data includes outdated, rare or chemically focused references, the model may surface them in inappropriate contexts, such as bromide as a salt substitute,” Castro noted.

“Also, current LLMs don’t have built-in cross-checking against up-to-date medical databases unless explicitly integrated.”
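
A minimal sketch of what "explicitly integrated" cross-checking could look like. The in-memory table is a hypothetical stand-in for a maintained, up-to-date medical database, which is exactly the component Castro notes is absent by default.

```python
# A stand-in for an up-to-date medical database; in practice this would be
# a maintained service, not a hardcoded dict. Entries are invented for
# illustration, apart from the bromide facts reported in the case study.
FOOD_SAFETY = {
    "sodium bromide": "not for human consumption; chronic intake causes bromism",
    "potassium chloride": "used as a dietary salt substitute; consult a clinician",
}

def cross_check(model_output: str) -> list[str]:
    """Flag any known substance in the model's output against the table."""
    warnings = []
    for substance, note in FOOD_SAFETY.items():
        if substance in model_output.lower():
            warnings.append(f"{substance}: {note}")
    return warnings

reply = "You could swap table salt for sodium bromide."
for warning in cross_check(reply):
    print("FLAG:", warning)
```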

To prevent cases like this one, Castro called for more safeguards for LLMs, such as integrated medical knowledge bases, automated risk flags, contextual prompting and a combination of human and AI oversight.
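
As a hedged sketch of how some of those safeguards could be layered (all names and rules below are invented for illustration): contextual prompting frames the request as a health question, an automated risk flag scans the draft reply, and flagged output is held for human review instead of being shown to the user. The knowledge-base lookup sketched earlier would slot in alongside the pattern check.

```python
import re

# 1. Contextual prompting: make the health framing explicit up front.
SYSTEM_CONTEXT = (
    "The user is asking about their diet. Answer only with food-safe "
    "suggestions and recommend consulting a clinician."
)

# 2. Automated risk flag: a deliberately simple pattern list; a production
#    system would use a maintained hazardous-substance lexicon.
RISK_PATTERNS = [r"\bbromide\b", r"\bbromate\b", r"\bcyanide\b"]

def risk_flagged(reply: str) -> bool:
    return any(re.search(p, reply, re.IGNORECASE) for p in RISK_PATTERNS)

def answer(user_question: str, model) -> str:
    # `model` is any callable mapping a prompt to text (hypothetical here).
    reply = model(f"{SYSTEM_CONTEXT}\n\nUser: {user_question}")
    if risk_flagged(reply):
        # 3. Human-in-the-loop: escalate instead of returning the draft.
        return "This answer was held for review; please consult a clinician."
    return reply

# Toy stand-in model reproducing the failure mode from the case study.
print(answer("What can I swap for table salt?",
             lambda prompt: "Try sodium bromide instead of sodium chloride."))
```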

The expert added, “With targeted safeguards, LLMs can evolve from risky generalists into safer, specialized tools; however, without regulation and oversight, rare cases like this will likely recur.”

OpenAI, the San Francisco-based maker of ChatGPT, provided the following statement to Fox News Digital:

“Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance.”
