The 'Facebook Files' Investigation Highlights Key Concerns in the Platform's Approach

The biggest social media news story of the week has been ‘The Facebook Files’, a selection of internal documents revealing various investigations into the societal impacts of The Social Network, as reported by The Wall Street Journal.

The full Facebook Files series is available here, and is worth reading for anyone interested in the impacts of social media more broadly, but in summary, the key discoveries of the reports are:

  • Facebook has a system in place which subjects high profile users to a different review process than regular users
  • Facebook-commissioned studies have repeatedly found that Instagram can have harmful mental health impacts on users
  • Facebook’s ‘Friends and Family’ algorithm update in 2018, designed to reduce angst on the platform, actually increased division
  • Facebook is not doing enough to address potential harms it’s causing in developing nations
  • Anti-vaccine activists have used Facebook to sow doubt and spread fear about the COVID-19 vaccine deployment

None of these revelations is anything new in itself – anyone who’s done any research into Facebook and its algorithms will be aware of the harms that it can, and has, caused over time. And Facebook itself has said that it’s addressing all of these elements, and evolving its tools in line with its internal findings.

But what’s interesting about the Facebook Files is the revelation of what Facebook itself actually knows, and what its own data has shown in regards to these impacts – which also suggests that it could be doing more to address them.

Is it hesitating because of concerns over business impacts? That’s the bottom line of the WSJ investigation: Facebook knows that it’s causing widespread societal harm, and amplifying negative elements, but it’s been slow to act because doing so could hurt usage.

For example, according to the leaked documents, Facebook implemented its ‘Friends and Family’ News Feed algorithm update in 2018 in order to amplify engagement between users, and to reduce political discussion, which had become an increasingly divisive element in the app. Facebook did this by allocating points for different types of engagement with posts.

According to the leaked overview, Likes were allocated 1 point each, with other reaction types and re-shares garnering 5 points, while comments drove much higher value, with ‘significant’ comments earning 30 points (non-significant comments were worth 15 points). The higher a post’s total score, the more reach it was likely to see, as Facebook used this score to determine relevance between connections.
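To make the mechanics concrete, here’s a minimal sketch, in Python, of how a weighted scoring system along these lines might work. The point values come from the leaked overview above; the dictionary, function name and example figures are illustrative assumptions, not Facebook’s actual code.

```python
# A minimal sketch of a weighted engagement score along the lines
# described in the leaked documents. The point values come from the
# reported overview; the names, structure and examples here are
# illustrative assumptions, not Facebook's actual implementation.

ENGAGEMENT_POINTS = {
    "like": 1,
    "reaction": 5,             # 'Love', 'Angry', etc.
    "reshare": 5,
    "comment": 15,             # non-significant comment
    "significant_comment": 30,
}

def score_post(engagements):
    """Sum weighted engagement counts into a single relevance score."""
    return sum(
        count * ENGAGEMENT_POINTS.get(kind, 0)
        for kind, count in engagements.items()
    )

# A quiet but well-liked post vs. a divisive one that sparks comments:
quiet_post = {"like": 500}                                   # 500 * 1 = 500
divisive_post = {"reaction": 80, "significant_comment": 20}  # 400 + 600 = 1000
print(score_post(quiet_post), score_post(divisive_post))     # 500 1000
```

Under that kind of weighting, a post that provokes arguments in the comments can comfortably outscore one that people merely like – which is exactly the dynamic that played out.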

The idea was that this would incentivize more discussion, but as you can imagine, the update instead prompted more publishers and media outlets to share increasingly divisive, emotionally-charged posts, in order to incite more comments and reactions, and earn higher scores for their content. Likes were no longer the key driver – Facebook’s change made comments and Reactions (like ‘Angry’) far more valuable – so divisive political content became more prominent, exposing more users to such material in their feeds.

Which highlights another of Facebook’s core issues: it amplifies exposure to political views that you might otherwise never have encountered. You might not, for example, have any idea that your former colleague is a flat-earth conspiracy theorist, but Facebook shows you – which then, inevitably, pushes people further for or against each issue, essentially prompting more of them to take sides.

Facebook knew that this was happening – its internal research showed that the change was causing increased division and argument. But did it reverse course on its decision?

According to WSJ, Facebook CEO Mark Zuckerberg resisted calls to change the algorithm yet again, because the update had led to more comments, addressing a longer-term decline in in-app engagement.


Given that Facebook is used by some 2.9 billion people, and has arguably the largest influence of any platform in history, insights like this are a major concern, as they suggest that Facebook has actively made business-based decisions on issues relating to societal harm. Which, again, is no major surprise – Facebook is, after all, a money-making business. But the influence and power the platform has to guide real-world trends is too significant to ignore such impacts – and that’s only one of the examples highlighted in WSJ’s reporting.

Other revelations relate to Instagram’s impact on young users:

“32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse […] Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.”

Instagram is adding more protections and support options over time, but again, the real-world effect here is significant.

Then there’s the way the platform influences people’s responses to key news events, like, say, the COVID-19 vaccine rollout.

“41% of comments on English-language vaccine-related posts risked discouraging vaccinations. Users were seeing comments on vaccine-related posts 775 million times a day, and Facebook researchers worried that the large proportion of negative comments could influence perceptions of the vaccine’s safety.”

Unlike most other businesses, Facebook’s decisions can significantly shift public perception, and lead to real-world harms, on a massive scale.

Again, we know this, but now we also know that Facebook does too.

The concern, moving forward, is how it will move to address these issues, and whether the approach it’s taken thus far – working to keep such revelations from the public, and even leaving harmful changes in place to further its business interests – will remain how it operates.

We don’t have any real insight into how Facebook operates, as it’s not a public utility. But at the same time, in many ways, it really is one. Some 70% of Americans now rely on the platform for news content, and as these insights show, it has become a key source of influence in many respects.

But at the same time, Facebook is a business. Its intention is to make money, and that will always play a key role in its thinking.

Is that a sustainable path forward for such a massive platform, especially as it continues to expand into new, developing regions, and more immersive technologies?

The Facebook Files raise some key questions, for which we don’t have any real answers as yet.

You can read The Wall Street Journal’s full ‘Facebook Files’ series here.

Source: www.socialmediatoday.com, originally published on 2021-09-17 17:34:39