Facebook Clashes with the US Government Over Vaccine Misinformation

It seems that Facebook may be on a collision course with the US Government once again, this time over the role it may be playing in amplifying COVID-19 vaccine misinformation, which has been identified as a key impediment to the nation’s recovery from the pandemic.

On Friday, when asked directly about vaccine misinformation on Facebook, US President Joe Biden responded that “they’re killing people” by allowing vaccine conspiracy theories to spread.

Biden’s comment came a day after the White House also noted that it’s been in regular contact with social media platforms to ensure that they remain aware of the latest narratives that pose a danger to public health.

As per White House press secretary Jen Psaki:

“We work to engage with them to better understand the enforcement of social media platform policy.”

In response to Biden’s remarks, Facebook immediately went on the offensive, with a Facebook spokesperson telling ABC News that it “will not be distracted by accusations which aren’t supported by the facts”.

Facebook followed that up with an official response today, in a post titled ‘Moving Past the Finger Pointing’.

“At a time when COVID-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies. While social media plays an important role in society, it is clear that we need a whole of society approach to end this pandemic. And facts – not allegations – should help inform that effort. The fact is that vaccine acceptance among Facebook users in the US has increased. These and other facts tell a very different story to the one promoted by the administration in recent days.”

The post goes on to highlight various studies which show that Facebook’s efforts to address vaccine hesitancy are working, and that, if anything, Facebook users are less resistant to the vaccine effort, contrary to Biden’s remarks.

Which is largely in line with Facebook’s broader stance of late – that, based on academic research, there’s currently no definitive link between Facebook sharing and increased vaccine hesitancy, nor, similarly, any direct connection between Facebook usage and political polarization, despite ongoing claims.

In recent months, Facebook has taken a more proactive approach to dismissing these ideas, explaining that polarizing and extremist content is actually bad for its business, despite suggestions that it benefits from the engagement such posts generate.

As per Facebook:

“All social media platforms, including but not limited to ours, reflect what is happening in society and what’s on people’s minds at any given moment. This includes the good, the bad, and the ugly. For example, in the weeks leading up to the World Cup, posts about soccer will naturally increase – not because we have programmed our algorithms to show people content about soccer but because that’s what people are thinking about. And just like politics, soccer strikes a deep emotional chord with people. How they react – the good, the bad, and the ugly – will be reflected on social media.”

Facebook’s Vice President of Global Affairs, Nick Clegg, took a similar angle back in March, in a post framing the News Feed as an interplay between people and platform – which, the argument goes, means the platform itself cannot be fully to blame:

“The goal is to make sure you see what you find most meaningful – not to keep you glued to your smartphone for hours on end. You can think about this sort of like a spam filter in your inbox: it helps filter out content you won’t find meaningful or relevant, and prioritizes content you will.”

Clegg further notes that Facebook actively reduces the distribution of sensational and misleading content, as well as posts that are found to be false by its independent fact-checking partners.

“For example, Facebook demotes clickbait (headlines that are misleading or exaggerated), highly sensational health claims (like those promoting “miracle cures”), and engagement bait (posts that explicitly seek to get users to engage with them).”

Clegg also says that Facebook made a particularly significant commitment on this front, in conflict with its own business interests, by implementing a change to the News Feed algorithm back in 2018 that gives more weight to updates from your friends, family and the groups you’re a part of than to content from the Pages you follow.
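To make the mechanics of that claim concrete, here’s a minimal, purely illustrative sketch of how demotion and friend-and-family re-weighting could work in a feed-ranking step. This is not Facebook’s actual ranking code – the weights, labels and field names are all hypothetical:

```python
# Illustrative sketch only - not Facebook's actual ranking code.
# All weights, labels and field names are hypothetical, intended to show
# how demotion and friend/family re-weighting could work in principle.

from dataclasses import dataclass, field

@dataclass
class Post:
    source: str                 # "friend", "family", "group" or "page"
    engagement_score: float     # baseline predicted engagement
    labels: set = field(default_factory=set)  # e.g. {"clickbait"}

# Hypothetical multipliers: personal connections are boosted,
# flagged content classes are demoted rather than removed.
SOURCE_WEIGHTS = {"friend": 1.5, "family": 1.5, "group": 1.3, "page": 1.0}
DEMOTIONS = {"clickbait": 0.5, "sensational_health_claim": 0.4,
             "engagement_bait": 0.5, "fact_check_false": 0.2}

def rank(posts):
    def score(post: Post) -> float:
        s = post.engagement_score * SOURCE_WEIGHTS.get(post.source, 1.0)
        for label in post.labels:
            s *= DEMOTIONS.get(label, 1.0)  # demoted, but still eligible to appear
        return s
    return sorted(posts, key=score, reverse=True)
```

The point of the sketch is that demoted content is down-ranked rather than removed, which is why highly engaging posts can still surface despite these measures.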

So, according to Facebook, it doesn’t benefit from sensationalized content and left-field conspiracy theories – in fact, it actually goes out of its way to penalize such material.

Yet, despite these claims, and the references to inconclusive academic papers and internal studies, the broader evidence doesn’t support Facebook’s stance.

Earlier this week, The New York Times reported that Facebook has been working to change the way that its own data analytics platform works, in order to restrict public access to insights which show that far-right posts and misinformation perform better on the platform than more balanced coverage and reports.

The controversy stems from a Twitter profile created by Times reporter Kevin Roose, which displays a daily listing of the ten most engaging posts across Facebook, based on CrowdTangle data.

Far-right Pages consistently dominate the chart, which is why Facebook has previously sought to explain that the metrics used in creating the listing are the wrong measure, and are therefore not indicative of actual post engagement and popularity.
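For context, the listing itself is straightforward to reproduce, because CrowdTangle offered a public API at the time, and a query sorted by total interactions returns exactly the kind of engagement-ranked top ten that Roose publishes. The sketch below is a rough approximation based on CrowdTangle’s public API documentation – the endpoint, parameter names and response fields should be treated as assumptions rather than a definitive reference:

```python
# Rough sketch of pulling an engagement-ranked "top 10" from CrowdTangle.
# Endpoint and parameter names are based on CrowdTangle's public API docs
# at the time; treat them as assumptions and check current documentation.

import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # issued via the CrowdTangle dashboard

def top_posts_by_engagement(count: int = 10):
    resp = requests.get(
        "https://api.crowdtangle.com/posts",
        params={
            "token": API_TOKEN,
            "sortBy": "total_interactions",  # engagement, not reach
            "timeframe": "1 DAY",
            "count": count,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("result", {}).get("posts", [])

if __name__ == "__main__":
    for i, post in enumerate(top_posts_by_engagement(), start=1):
        stats = post.get("statistics", {}).get("actual", {})
        print(i, post.get("account", {}).get("name"), sum(stats.values()))
```

Note that what this returns is interaction data (likes, comments, shares), not reach (how many people actually saw each post) – which is precisely the distinction Facebook leans on when it disputes the chart.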

According to the NYT report, Facebook had actually gone further than this internally, with staffers looking for a way to alter the data displayed within CrowdTangle to avoid such comparison.

Which didn’t go as planned:

“Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad. But [Brandon] Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists.”

So, no matter how Facebook was looking to spin it, these types of posts were still gaining traction, which shows that, even with the aforementioned updates and processes to limit such sharing, this remains the type of content that sees the most engagement, and thus the most reach, on The Social Network.

Which, you could argue, is a human problem, rather than a Facebook one. But with 2.8 billion users, giving it more potential for content amplification than any platform in history, Facebook does need to take responsibility for the role it plays in that process – and for the impact it can have in the case of, say, a pandemic, where vaccine fear-mongering could end up exacting an immeasurable toll.

It seems fairly clear that Facebook does play a significant part in this. And when you also consider that some 70% of Americans now get at least some news content from Facebook, it’s clear that the app has become a source of truth for many, informing what they do – their political stances, their civic understanding, and yes, their view of public health advice.

Heck, even flat earthers have been able to gain traction in the modern age, underlining the power of anti-science movements. And again, while you can’t definitively say that Facebook is responsible for such, if somebody posts a video of flat earthers trying to prove their theory, it’s probably going to gain traction due to the divisive, sensational nature of that content.

Such videos attract believers and skeptics alike, and while many of the comments are critical, that’s all, in Facebook’s algorithmic judgment, engagement.

Thus, even your mocking remarks will help such material gain traction – and the more people who comment, the more momentum such posts get.

Eight out of 10 people might dismiss such theories as total rubbish, but two might take the opportunity to dig deeper. Multiply that by the view counts these videos see, and that’s a lot of potential influence that Facebook is facilitating.
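A minimal, made-up sketch of that dynamic – not any platform’s real scoring formula – shows how a heavily mocked post can still out-rank a sober debunking:

```python
# Minimal sketch of why critical comments still boost a post.
# The weights and numbers are made up for illustration; this is not any
# platform's real scoring formula.

def engagement_score(likes: int, shares: int, comments: int) -> float:
    # A typical engagement proxy weights comments and shares heavily,
    # with no notion of whether a reaction is approving or mocking.
    return likes + 2 * shares + 3 * comments

debunk_post = engagement_score(likes=40, shares=5, comments=12)
flat_earth_clip = engagement_score(likes=300, shares=150, comments=900)  # mostly mocking comments

print(debunk_post, flat_earth_clip)  # 86 vs 3300 - the mocked post wins the ranking
```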

And these types of posts definitely do gain traction. A study by MIT researchers, published in 2018, found that false news stories on Twitter are 70% more likely to be retweeted than true ones, while further research into the motivations behind such activity has found that a need for belonging and community can also solidify groups around lies and misinformation as a psychological response.

There’s also another key element within this – the changing nature of media distribution itself.

As Yale University social psychologist William J. Brady recently explained:

“When you post things [on social media], you’re highly aware of the feedback that you get, the social feedback in terms of likes and shares. So when misinformation appeals to social impulses more than the truth does, it gets more attention online, which means people feel rewarded and encouraged for spreading it.”

That shift, in giving each person their own personal motivation for sharing certain content, has changed the paradigm for content reach, diluting the influence of publications themselves in favor of algorithms – which, again, are fueled by people and their need for validation and response.

Share a post saying ‘vaccines are safe’ and probably no one will care, but share one that says ‘vaccines are dangerous’ and people will pay attention – you’ll get notifications for all the likes, shares and comments, which will trigger your dopamine receptors and make you feel part of something bigger, something more; that your voice matters in the broader landscape.

As such, Facebook is somewhat right in pointing to human nature as the culprit, rather than its own systems. But it, and other platforms, have given people the medium; they provide the means to share, and they devise the incentives that keep people posting.

And the more time that people spend on Facebook, the better it is for Facebook’s business.

You can’t argue that Facebook doesn’t benefit in this respect – and as such, it’s in the company’s interests to turn a blind eye, and to pretend there’s no problem with its systems, or with the role they play in amplifying such movements.

But Facebook does benefit, there is a problem, and the US Government is right to take a closer look at this element.
