YouTube prepares to take down anti-vaccine content

Video-sharing platform YouTube is set to block all anti-vaccine content, moving beyond its ban on false information about COVID-19 vaccines to cover misinformation about other approved vaccines, such as content questioning the measles and chickenpox vaccines.

The Google-owned company said this week that the expanded policy will apply to “currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the World Health Organization (WHO).”

That would include cases where creators who post content on the platform claim that approved vaccines do not work, or wrongly link them to chronic health effects.

“Specifically, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines will be removed.”

The company said “vaccines in particular have been a source of fierce debate over the years, despite consistent guidance from health authorities on their effectiveness.”

The online video company is also banning channels associated with several prominent anti-vaccine activists, including Robert F. Kennedy Jr. and Joseph Mercola, a YouTube spokesperson noted.

An emailed statement from Mercola’s website said: “We are united across the world, we will not live in fear, and we will stand together and restore our freedoms.”

YouTube said it had removed more than 130,000 videos since last year for violating its COVID-19 vaccine policies.

However, the company said it “will continue to allow content about vaccine policies, new vaccine trials and historical vaccine successes or failures.”

YouTube is not the only social media giant grappling with how to deal with the spread of COVID-19 conspiracy theories and medical misinformation in general.

In September, Facebook launched a renewed effort to tackle groups that promote violence and conspiracy theories, beginning by taking down a German network that had been spreading COVID-19 misinformation.

“Claims about COVID-19 or vaccines that do not violate these policies will still be eligible for review by our third-party fact-checkers, and if they are rated false, they will be labeled and demoted,” Facebook noted in a blog post.

For its part, YouTube said content that “falsely says that approved vaccines cause autism, cancer, infertility, or that substances in vaccines can track those who receive them” will be taken down.

“As with any significant update, it will take time for our systems to fully ramp up enforcement,” YouTube added.

In July, US President Joe Biden said that social media companies are “killing people” by failing to police misinformation on their platforms about COVID-19 vaccines.
