
YouTube Bans Videos That Spread Misinformation About Vaccines

September 30, 2021

The change also extends that policy to a far wider range of vaccines

YouTube, the popular video platform, has announced that it will crack down on the spread of misinformation by banning misleading content about vaccines.

The company announced on Wednesday that its community guidelines, which already prohibit the sharing of medical misinformation, now extend to "currently administered" vaccines that have been confirmed safe by the World Health Organization and other health authorities.


The platform had previously banned content containing false claims about COVID-19 vaccines under its COVID-19 misinformation policy.

The change also extends that policy to a far wider range of vaccines, NPR reports.

The company said, "We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines."

YouTube stated in its announcement that it has begun banning videos that claim vaccines are unsafe or ineffective, or that they cause other health issues.

The ban also covers videos that inaccurately describe the ingredients used in vaccines, as well as claims that vaccines contain devices that can be used to "track" those who receive them.

There are still some exceptions: users are allowed to share content about their personal experiences with a vaccine, as long as those videos adhere to the site's community guidelines and the channel in question doesn't routinely encourage "vaccine hesitancy."

CNBC reported that the new policy goes into effect immediately, and YouTube has already removed pages known for sharing anti-vaccination sentiments, such as those belonging to prominent vaccine opponents Joseph Mercola, Erin Elizabeth, and Sherri Tenpenny, and to Robert F. Kennedy Jr.'s Children's Health Defense organization.

But the company, which is owned by Google, warned that more widespread removal of videos may take some time as it works to enforce the policy.

NPR reported that many conspiracy theorists have started migrating to other less regulated platforms as YouTube and Facebook tightened their restrictions concerning vaccine misinformation.

Slate had reported in March that Rumble, another video-sharing site, had become a popular choice for far-right groups and others who are vaccine-resistant.

But many conservative pages that spread vaccine misinformation are still active on YouTube, and their videos continue to attract millions of views.
