Facebook Threat to Public Health Due to COVID Misinformation Problem: Report
Sun, April 11, 2021


An activist group has called out Facebook over the spread of health misinformation on its platform. The group's report argues that the platform failed to protect people from fake news during the pandemic.

Avaaz, a US-based nonprofit organization, published a report on health misinformation circulating on Facebook. The group identified key pieces of health misinformation and showed how misinformation spreaders accumulated viewers. It also proposed two solutions to improve the platform's countermeasures against misinformation. Facebook, however, said the group failed to account for the steps it has taken to fight the issue.

Facebook Algorithm vs. COVID Misinformation

In this pandemic, news is crucial for disseminating information about healthcare, vaccine research, and safety protocols. But the scale of digital platforms makes it harder for organizations to spot fake news spreading online. Even when a false post is caught, someone may already have interacted with it and shared it elsewhere. As a result, fake news can spread like wildfire, and when it concerns COVID-19, it can trick people into doing things they should not.

The activist group Avaaz recently published a report on Facebook's effectiveness against misinformation. The group analyzed spreaders of fake news on the platform and measured how much of their content Facebook's algorithm detected. Based on its findings, only 16% of the analyzed health misinformation carried a warning label from Facebook, while the other 84% of posts remained online without one.

"Facebook is failing to keep people safe and informed during the pandemic," wrote Avaaz.

In the report, the authors identified 82 health misinformation websites, 42 superspreader pages, and an estimate of total views for global health misinformation. The 82 websites were sampled from a database of 5,080 website credibility reviews curated by NewsGuard; each sampled site was either rated "red" - failing to meet basic credibility and transparency standards - or met other selection criteria.

CrowdTangle, a social media listening platform, helped Avaaz identify Facebook groups and pages with at least 100,000 interactions. These were compared against the 82 misinformation-spreading websites over the period from May 28, 2019, to May 27, 2020. The comparison yielded 42 Facebook pages, which the authors called superspreaders: pages that shared content at massive rates or amplified it effectively enough to spread health misinformation widely.
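As a rough illustration of that selection step, here is a minimal Python sketch. The domain names, page records, field names, and matching rule are hypothetical stand-ins, not Avaaz's actual pipeline or CrowdTangle's API.

```python
# Hypothetical records standing in for CrowdTangle-style exports.
MISINFO_DOMAINS = {"example-health-news.com", "another-site.net"}  # stand-ins for the 82 sites

pages = [
    {"name": "Page A", "interactions": 250_000, "shared_domains": {"example-health-news.com"}},
    {"name": "Page B", "interactions": 40_000,  "shared_domains": {"example-health-news.com"}},
    {"name": "Page C", "interactions": 500_000, "shared_domains": {"unrelated.org"}},
]

# Keep pages that clear the interaction threshold AND shared content
# from at least one of the misinformation websites.
superspreaders = [
    p["name"]
    for p in pages
    if p["interactions"] >= 100_000 and p["shared_domains"] & MISINFO_DOMAINS
]
print(superspreaders)  # ['Page A']
```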

Next, the authors estimated total views of health misinformation across the pages and websites. Content from the 82 websites recorded 91,019,790 interactions between May 28, 2019, and May 27, 2020, while the 42 pages were estimated at 65,822,067 interactions over the same period. Interactions the pages generated through links to the 82 websites were excluded to avoid double-counting, leaving 39,014,171 interactions from the pages.

Combining the two figures gives 130,033,961 interactions for the period from May 28, 2019, to May 27, 2020. Applying an interactions-to-views ratio of roughly 29.70, the authors reached the final estimate of 3,861,632,821 views.
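The arithmetic can be reconstructed from the report's published figures; the short Python sketch below redoes it. Note that the quoted ratio of 29.70 appears to be rounded, since the exact multiplier implied by the final estimate is slightly lower.

```python
# Reconstructing the report's view estimate from its published figures.
website_interactions = 91_019_790  # 82 websites, May 28, 2019 - May 27, 2020
page_interactions = 39_014_171     # 42 pages, after removing double-counted link interactions

total_interactions = website_interactions + page_interactions  # 130,033,961

# The final estimate quoted in the report.
reported_views = 3_861_632_821

# Ratio implied by the report's own numbers: ~29.6971, quoted as 29.70.
implied_ratio = reported_views / total_interactions

print(f"Total interactions: {total_interactions:,}")
print(f"Implied interactions-to-views ratio: {implied_ratio:.4f}")
print(f"Estimate using the rounded ratio 29.70: {total_interactions * 29.70:,.0f}")
```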

The key findings of the report include the networks spreading misinformation, claims that gathered millions of views, and false details about health protocols. Networks spreading misinformation in at least five countries generated around 3.8 billion views on Facebook over the year studied. Only 16% of the content sampled in the report was labeled by Facebook; the remaining 84% was missed and stayed online without a warning label.

Several claims generated massive view counts on Facebook. A Bill Gates vaccine conspiracy drew more than 8.4 million views, claiming the Microsoft founder had supported a polio vaccination campaign in India that left 490,000 children paralyzed. Fact-checkers debunked that claim and eight others linked to Bill Gates or vaccines.


Cure misinformation could cause immediate harm to people. One cure claim recommended colloidal silver for bacterial and viral infections such as tuberculosis and Ebola virus disease, describing it as a cure without side effects. The claim drew at least 4.5 million views on Facebook, even after Health Feedback labeled it inaccurate. Facebook eventually labeled it false information, but many users had already interacted with it.

In more recent developments, a conspiracy theory linking COVID-19 to 5G technology gathered 13.4 million views. One article suggested that 5G is being used for wide-scale human experiments. That story was modified and redistributed online, and many users interacted with versions tying 5G to the coronavirus. No study has found compelling evidence that 5G harms human health.

Meanwhile, two stories appeared to undermine a healthcare sector already struggling since the pandemic began. One claimed that the American Medical Association encouraged doctors to overcount COVID-19 deaths; it received 160.5 million views, the result of a mischaracterization of Dr. Scott Jensen's comments.

The other story, which yielded over 2.4 million views, claimed that quarantine made no sense. In it, two doctors in California argued that quarantine could be bad for public health and would prolong COVID-19. Several credible organizations fact-checked and debunked the claim. People who believed quarantine could harm them might have ignored the health protocol, left their homes, and exposed themselves to SARS-CoV-2.


Two Solutions to Contain Health Misinformation

Despite the staggering details in the report, Avaaz proposed two solutions that may help Facebook better handle health misinformation. First, correct the record by letting users submit independently fact-checked corrections; this could decrease belief in misinformation by nearly 50%. Second, detox the algorithm by downranking misinformation in the News Feeds of all users; this could limit the reach of misinformation by up to 80%.

The first method would allow a user to fact-check a post and submit corrections. If the post receives multiple fact-checked corrections, it can be flagged as potential misinformation. Corrections must also be retroactive, reaching users who interacted with the post earlier, to reverse the misinformation's effects. The second method, detoxing the algorithm, means ending the free promotion misinformation enjoys by removing it from the recommendation algorithm. Doing so would cost the platforms, but the effort would combat misinformation substantially.
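To make the correction-flagging flow concrete, here is a minimal Python sketch. The threshold value, data structures, and function names are hypothetical illustrations, not part of Avaaz's proposal or any Facebook system.

```python
from dataclasses import dataclass, field

# Hypothetical threshold: how many independently fact-checked corrections
# a post must receive before it is flagged as potential misinformation.
FLAG_THRESHOLD = 3

@dataclass
class Post:
    post_id: str
    corrections: list = field(default_factory=list)  # fact-checked correction texts
    viewers: set = field(default_factory=set)        # users who interacted with the post
    flagged: bool = False

def submit_correction(post, correction):
    """Record a fact-checked correction; flag the post once enough accumulate.

    Returns the users who should retroactively receive the correction, so
    people who interacted with the post earlier are reached as well.
    """
    post.corrections.append(correction)
    if not post.flagged and len(post.corrections) >= FLAG_THRESHOLD:
        post.flagged = True
        return sorted(post.viewers)  # notify everyone who already saw the post
    return []

# Usage: the third correction trips the flag and triggers retroactive notification.
post = Post("p1", viewers={"alice", "bob"})
for text in ["correction A", "correction B", "correction C"]:
    notified = submit_correction(post, text)
print(post.flagged, notified)  # True ['alice', 'bob']
```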

According to the British public service broadcaster BBC, Facebook said that Avaaz's report did not reflect the steps the company has taken to combat misinformation. Facebook said it placed warning labels on 98 million pieces of COVID misinformation and deleted 7 million pieces of content that could lead to immediate harm. The company added that it shares the activist group's goal of limiting misinformation.