A hot potato: YouTube has updated its medical misinformation policies to include new guidelines on vaccines. The Google-owned video-sharing platform said a steady stream of false claims about Covid-19 vaccines has spilled over into misinformation about vaccines in general, reaching a point where it must expand the work it started with Covid-19 to other vaccines.
Under YouTube's new policies, it will remove content alleging that approved vaccines are dangerous or cause chronic health effects. YouTube will also take down videos claiming vaccines do not reduce transmission or contraction of disease, as well as those featuring misinformation about the substances vaccines contain.
Specific examples cited include content that claims approved vaccines cause cancer, autism or infertility, or material that claims the substances in vaccines can somehow track those who receive them.
Notably, YouTube said it will continue to allow content about vaccine policies, historical vaccine successes or failures and new vaccine trials. What's more, personal testimonies relating to vaccines will also be allowed to stay up, "so long as the video doesn't violate other community guidelines, or the channel doesn't show a pattern of promoting vaccine hesitancy."
YouTube said it consulted with local and international health organizations and experts to help develop its new policies. Since last year, the site has removed over 130,000 videos for violating its Covid-19 vaccine policies, we're told.
According to CNBC, a YouTube spokesperson confirmed that it also removed channels associated with high-profile misinformation spreaders including Joseph Mercola, Sherri Tenpenny and Children's Health Defense, which is associated with Robert F. Kennedy Jr.
The changes go into effect today, although YouTube conceded that, as with any major update, it will take time for its systems to ramp up enforcement.