
Posted today to YouTube’s corporate blog:
In the years since we began our efforts to make YouTube a destination for high-quality health content, we’ve learned critical lessons about developing Community Guidelines in line with local and global health authority guidance on topics that pose serious real-world risks, such as misinformation on COVID-19, vaccines, reproductive health, harmful substances, and more.
We’re taking what we’ve learned so far about the most effective ways to tackle medical misinformation to simplify our approach for creators, viewers, and partners.
Moving forward, YouTube will streamline dozens of our existing medical misinformation guidelines to fall under three categories – Prevention, Treatment, and Denial. These policies will apply to specific health conditions, treatments, and substances where content contradicts local health authorities or the World Health Organization (WHO).
To determine whether a condition, treatment, or substance is in scope of our medical misinformation policies, we'll evaluate whether it's associated with a high public health risk, whether publicly available guidance from health authorities around the world exists, and whether the topic is generally prone to misinformation.
Prevention misinformation: We will remove content that contradicts health authority guidance on the prevention and transmission of specific health conditions, and on the safety and efficacy of approved vaccines. For example, this encompasses content that promotes a harmful substance for disease prevention.
Treatment misinformation: We will remove content that contradicts health authority guidance on treatments for specific health conditions, including promoting specific harmful substances or practices. Examples include content that encourages unproven remedies in place of seeking medical attention for specific conditions, like promoting caesium chloride as a treatment for cancer.
Denial misinformation: We will remove content that disputes the existence of specific health conditions. This covers content that denies that people have died from COVID-19.
When cancer patients and their loved ones are faced with a diagnosis, they often turn to online spaces to research symptoms, learn about treatment journeys, and find community. Our mission is to make sure that when they turn to YouTube, they can easily find high-quality content from credible health sources.
In applying our updated approach, cancer treatment misinformation fits the framework: the public health risk is high, as cancer is one of the leading causes of death worldwide; there is stable consensus about safe cancer treatments from local and global health authorities; and it's a topic that's prone to misinformation.
Starting today and ramping up in the coming weeks, we will begin removing content that promotes cancer treatments proven to be harmful or ineffective, or content that discourages viewers from seeking professional medical treatment.
This includes content that promotes unproven treatments in place of approved care or as a guaranteed cure, and treatments that have been specifically deemed harmful by health authorities. For instance, a video that claims “garlic cures cancer,” or “take vitamin C instead of radiation therapy” would be removed.
The platform will also take action against videos that discourage people from seeking professional medical treatment as it sets out its health policies going forward.
— The Verge (@verge) August 15, 2023