YouTube announced on Wednesday that it will remove content that falsely claims authorised vaccines are harmful, as social media platforms work to combat health misinformation about Covid-19 and other illnesses.
YouTube has already been removing posts that spread misinformation about coronavirus treatments, including content making misleading claims about Covid-19 vaccines that have been proven safe.
The Google-owned platform said, however, that its concerns about the spread of medical misinformation extend beyond the pandemic.
“We’ve increasingly seen misleading claims about coronavirus vaccines cross over into disinformation about vaccines in general,” the company said in a statement.
“We are now at a point where it is more important than ever to expand the work we began with Covid-19 to additional vaccines.”

The expanded policy applies to “currently administered vaccines that have been approved and confirmed by local health authorities and the WHO (World Health Organization) to be safe and effective.” It will mean the removal from YouTube of misleading claims about routine immunisations for illnesses such as measles and hepatitis B.
These would include cases in which vloggers claim that authorised vaccines do not work or falsely link them to long-term health consequences.
Content that “falsely claims that authorised vaccinations cause autism, cancer, or infertility, or that vaccine ingredients can monitor those who get them” will also be removed.
“As with any big change, our algorithms will take time to completely scale up enforcement,” YouTube remarked.
It emphasised that there would be exceptions to the new restrictions, with personal testimony about negative vaccination experiences still permitted as long as “the channel does not indicate a trend of encouraging vaccine hesitancy.”

Since last year, YouTube has removed more than 130,000 videos for violating its Covid-19 vaccine policy.
On Tuesday, the company told German media that it had blocked the German-language channels of Russian state broadcaster RT for breaking its Covid disinformation rules.