YouTube updates its policy to block all anti-vaccine content

YouTube logo. Photo credit: Getty Images

YouTube has announced that it will no longer host anti-vaccine content on its site. The Google-owned platform is specifically targeting anti-vaccine activists Joseph Mercola and Robert F. Kennedy Jr., who reports say are partially responsible for growing skepticism surrounding vaccines.

The social media giant's new policy bans any video claiming that commonly used vaccines approved by health authorities are ineffective or dangerous, The Washington Post reported.

YouTube had previously blocked videos making false claims about COVID-19 vaccines, but not videos about other vaccines, such as those for measles or chickenpox.

Currently, 55.6% of the U.S. population has received both shots of the vaccine, and experts have said that misinformation is playing a significant role in slowing vaccination rates.

Dr. Lynne Ogawa, the medical director for St. Paul and Ramsey County Public Health, spoke with News Talk 830 WCCO's Sheletta Brundidge about misinformation surrounding the vaccine and how it threatens public health.

"What I worry about is just that spread of misinformation," Ogawa said. "Stories that… undercut all of the work that we in public health and keep trying to do, which is get good information out there. Not force anyone to do anything, but get good information out there for people to make some good choices.

"Not choices based on fear, not choices based on stories about other folks, but really good information."

Ogawa said it is essential to answer people's questions about the vaccine, but that people should look to experts rather than just anyone offering information.

In July, President Joe Biden said that social media companies were playing a part in spreading misinformation by allowing false claims to remain on their sites.

With the change, YouTube follows other social media giants such as Twitter and Facebook in policing its content, something it had been hesitant to do after receiving backlash in the past for demonetizing videos.

But recently, the company has been in the spotlight, criticized for not taking action against misinformation. YouTube acted later in part because it had been focused on misinformation specifically about coronavirus vaccines, Matt Halprin, the company's vice president of global trust and safety, told the Post.

After seeing that false claims about other vaccines were adding to the skepticism surrounding the coronavirus vaccines, the company decided to expand the ban.

"Developing robust policies takes time," Halprin said. "We wanted to launch a policy that is comprehensive, enforceable with consistency, and adequately addresses the challenge."

Halprin said that the new policy will still let people make claims based on their own personal experiences, like a mother sharing side effects her child experienced after getting a vaccine. Scientific discussions of vaccines and posting about their historical failures or successes will also be allowed.

"We'll remove claims that vaccines are dangerous or cause a lot of health effects, that vaccines cause autism, cancer, infertility, or contain microchips," Halprin said.

As for how the company will decide what to keep up and what to take down, Halprin said that "at least hundreds" of moderators at YouTube will work specifically on medical misinformation.
