The Wall Street Journal: YouTube to remove videos containing vaccine misinformation

YouTube said Wednesday it would remove content that falsely alleges approved vaccines are dangerous and cause severe health effects, expanding the video platform’s efforts in curbing misinformation on Covid-19 to other vaccines.

Examples of content that would be taken down include false claims that approved vaccines cause autism, cancer or infertility, or that they don't reduce transmission or contraction of diseases, the Alphabet Inc. (GOOG, GOOGL) division said.

The policies cover general statements about vaccines, not only those about Covid-19, as well as specific routine immunizations such as those for measles and hepatitis B, YouTube said. The platform said it has removed more than 130,000 videos for violating its Covid-19 vaccine policies since last year.

YouTube said it would continue to allow videos on vaccine policies, new vaccine trials and historical vaccine successes or failures, as well as personal testimonials related to the vaccines. Those exceptions reflect what the company sees as the importance of public discussion and debate, it said.

Other social-media platforms also have policies in place to suppress Covid-19 falsehoods. Twitter (TWTR) said earlier this year that it had begun applying labels to tweets about vaccines that contain conspiracy theories or rhetoric unfounded in research or credible reporting.

Facebook (FB) has also aimed to use its resources to promote Covid-19 vaccines. But Facebook researchers warned that comments on vaccine-related posts, often the kind of factual posts Facebook sought to promote, were filled with antivaccine rhetoric aimed at undermining their message, internal documents reviewed by The Wall Street Journal show. A company spokesman said vaccine hesitancy among Facebook users in the U.S. has declined by about 50% since January.

An expanded version of this story appears on WSJ.com.
