
Study: Chatbots offer 'concerning' health advice

Despite what the bot says, raw milk can't cure cancer and chemotherapy can

AI Chatbot conversation assistant. Woman using online customer service with chat bot to get support. AI automatic answering machine. Artificial intelligence, customer support center, contact us.
Getty Images


While they may seem like your supportive bestie, experts warn that AI chatbots can provide misleading information about cancer treatments and alternative therapies.

A study evaluated how various AI chatbots respond to questions about cancer, vaccines, stem cells, nutrition and athletic performance -- and the results were not so great. Researchers at the Lundquist Institute for Biomedical Innovation at Harbor-UCLA Medical Center posed a series of health questions to Google's chatbot Gemini, the Chinese model DeepSeek, Meta AI, ChatGPT and Elon Musk's AI app, Grok.

Does antiperspirant cause cancer? Can raw milk cure cancer? Those were the kinds of questions they asked. Nearly half of the bots' responses were deemed "problematic" -- inaccurate, misleading or lacking context. Overall, 30% of responses were "somewhat problematic" and 19.6% were "highly problematic."

One bot, for instance, suggested that Gerson therapy is a good "alternative" cancer treatment. Gerson therapy discourages chemotherapy.

And overall, Grok performed the worst, researchers found.

Some chatbots listed alternative cancer treatments like acupuncture, herbal medicine, and "cancer-fighting diets," potentially leading users to believe there are alternatives to chemotherapy.

The study authors wrote that "Chatbots often hallucinate, generating incorrect or misleading responses due to biased or incomplete training data, and models that are fine-tuned on human feedback are known to exhibit sycophancy—prioritizing answers that align with user beliefs over the truth."

Dr. Michael Foote, an assistant attending professor at Memorial Sloan Kettering Cancer Center, told NBC Chicago that the study highlights how much deceptive health information is out there.

“Some of this stuff hurts people directly,” said Foote, who is not associated with the new study. “Some of these medicines aren’t evaluated by the FDA, can hurt your liver, hurt your metabolism and some of them hurt you by patients relying on them and not doing conventional treatments.”
