On September 29, 2021, the Google-owned video streaming platform YouTube amended its community guidelines to fight misinformation related to health, particularly vaccines. Twitter, Facebook and YouTube have been at the forefront of combating misinformation about COVID-19, its effects and the vaccines used against it since the start of the pandemic.
YouTube released the updated guidelines along with a video explaining the change. In addition to removing misinformation related to the pandemic, the platform will now remove all misinformation related to vaccines, including claims about their efficacy, side effects and safety. Citing the World Health Organization's approvals for vaccines administered globally, YouTube will remove content and block channels that promote vaccine misinformation.
Social media platforms have been widely blamed for encouraging vaccine hesitancy around the world, most notably in the US, which has a very prominent anti-vax community. President Joe Biden has also addressed this concern, blaming social media for vaccine hesitancy.
Vaccination rates in the US have slowed considerably compared to other developed nations. Even historically Democratic "blue" states such as New York have been in the news recently, with nurses resigning over a mandatory vaccination policy implemented by state authorities.
The Center for Countering Digital Hate (CCDH) identified 12 main accounts, dubbed the "Disinformation Dozen", that have been amplifying vaccine misinformation, some for years. The group includes Robert F. Kennedy Jr., son of the late Senator Robert F. Kennedy, who along with Joseph Mercola has been among the most prominent faces of the anti-vax community. Under the new policy, these channels will now be removed from YouTube.
Other platforms such as Facebook have been hesitant to implement such a strict policy, previously citing concerns that it may interfere with freedom of speech. But this may start to change: lawmakers and ordinary citizens have begun to notice the near-monopoly social media platforms hold over deciding what is real and what is not, and the platforms have come under fire for it.
This is a good step towards regulating dangerous content on these platforms, but since the policy is not universal, people banned from YouTube can simply shift to other platforms, such as Telegram, to get their messages across.
It will be interesting to see how this policy is implemented in India, where many influencers and pseudo-religious leaders continue to post misinformation about COVID-19 and other diseases, peddling cures without any data to back up their claims. A recent report found that India is a top source of health misinformation on social media.
India too has a sizeable population that believes WhatsApp forwards and vehemently refuses vaccination. This hesitancy is still largely limited to COVID-19 vaccines but may soon extend to other vaccines as well. It is important for social media platforms to do more to counteract health misinformation. The CCDH recommends the following policies to protect users from harmful misinformation:
- Establish a clear threshold for enforcement action
- Display corrective posts to users exposed to disinformation
- Add warning screens when users click links to misinformation sites
- Institute an Accountability API
- Ban private and secret anti-vaccine Facebook Groups