Explained: Why YouTube has blocked anti-vaccine content

So far, YouTube has already removed over 130,000 videos for violating Covid-19 vaccine policies.

On Wednesday, YouTube announced it would expand its medical misinformation policies with new guidelines on vaccines, covering Covid-19 vaccines as well as general statements about other vaccines.

The move comes amid criticism that social media platforms are not doing enough to tackle misinformation related to Covid-19.

What kind of misinformation is YouTube targeting?

According to the new policy, which came into effect on Wednesday (September 29), any content claiming that approved Covid-19 vaccines cause autism, cancer or infertility, or that substances in the vaccines can track those who receive them, will be removed.

Further, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, that claims vaccines do not reduce transmission or contraction of disease, or that contains misinformation about the substances in the vaccines will also be removed.

In a blog post, YouTube said its Community Guidelines already prohibit certain types of medical misinformation, including content that promotes harmful remedies, such as claims that drinking turpentine can cure diseases.

The platform has been targeting Covid-19 and medical misinformation since the pandemic began, and has so far removed over 130,000 videos for violating its Covid-19 vaccine policies.

What kind of content is treated as misinformation by YouTube?

When it comes to Covid-19 related content, YouTube treats the following as misinformation:

  • Content that encourages the use of home remedies, prayer or rituals in place of medical treatment such as consulting a doctor or going to a hospital
  • Content that claims that there’s a guaranteed cure for Covid-19
  • Content that recommends use of Ivermectin or Hydroxychloroquine for the treatment of Covid-19
  • Claims that Hydroxychloroquine is an effective treatment for Covid-19
  • Categorical claims that Ivermectin is an effective treatment for Covid-19
  • Claims that Ivermectin and Hydroxychloroquine are safe to use in the treatment of Covid-19
  • Other content that discourages people from consulting a medical professional or seeking medical advice

One of the possible reasons for YouTube’s decision to expand its misinformation policy is vaccine hesitancy, particularly in the United States.

As per a survey conducted by the Pew Research Center, Democrats in the US are far more likely than Republicans to have received at least one dose of a Covid-19 vaccine. This survey also states a person’s vaccination status is strongly linked with confidence in the vaccine research and development process.

About 81 per cent of the survey’s respondents said they did not know whether there were serious health risks from Covid-19 vaccines, and 80 per cent said public health officials were not telling them everything they know about these vaccines.
