Editor’s note: Find the latest COVID-19 news and guidance in Medscape’s Coronavirus Resource Center.
YouTube is wiping vaccine misinformation and conspiracy theories from its popular video-sharing platform.
The ban on vaccine misinformation, announced in a blog post on Wednesday, comes as countries around the world continue to offer free COVID-19 immunizations to a somewhat hesitant public. Public health officials have struggled to push back against a steady current of online misinformation about COVID-19 shots since vaccine development first got underway last year.
YouTube's new rules will prohibit misinformation about any vaccine that has been approved by health authorities, such as the World Health Organization, and is currently being administered. The platform had already begun to crack down late last year on false claims about COVID-19 vaccines.
YouTube, which is owned by Google, will delete videos that falsely claim vaccines are dangerous or cause health problems such as cancer, infertility, or autism, a theory that scientists have discredited for decades but that has endured on the internet. As of Wednesday, popular anti-vaccine accounts, including those run by Robert F. Kennedy Jr., had been removed from YouTube.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” YouTube said in a prepared statement.
The new rule will apply to general claims about vaccines as well as statements about specific vaccines, such as those given for measles or flu.
Claims about vaccines that are still being tested will continue to be allowed. Personal stories about reactions to a vaccine will also be permitted, as long as they do not come from an account with a history of promoting vaccine misinformation.
Associated Press writer David Klepper in Providence, Rhode Island, contributed to this report.