Critics Slam Facebook's 'Filter Bubble' Study

Facebook says a study published in Science magazine proves that personal choice matters more than algorithms in determining what users see, but critics say it doesn't prove that at all.

By Mathew Ingram

This story originally appeared in Fortune magazine.


Facebook has been criticized for some time for its role in creating a "filter bubble" among its users, a term that comes from a book of the same name by Eli Pariser (who went on to help create viral-content site Upworthy). Critics say the social network does this by shaping our perception of the world with its algorithmically filtered newsfeed. Facebook, however, has come out with a study that it says proves this isn't true: if there is a filter bubble, the company says, it exists because users choose to see certain things, not because of Facebook's algorithmic filters.

But is this really what the study proves? There's considerable debate about that among social scientists knowledgeable in the field, who note that the conclusions Facebook wants us to draw—by saying, for example, that the study "establishes that … individual choices matter more than algorithms"—aren't necessarily supported by the evidence actually provided in the paper.

For one thing, these researchers point out that the study looked at only a tiny fraction of the total Facebook user population: less than 4% of the overall user base, in fact (a number that doesn't appear in the study itself but is mentioned only in an appendix). That's because the study group was selected only from those users who specifically stated their political affiliation. Needless to say, extrapolating from that to the entire 1.2 billion-user Facebook universe is a huge leap.

Sociologist Nathan Jurgenson points out that while the study claims to prove conclusively that individual choices have more effect on what users see than algorithms do, it doesn't actually back this up. That appears to be the case for conservative users, but for users who identified themselves as liberals, Facebook's own data shows that exposure to different ideological views is reduced more by the algorithm (8%) than by a user's personal choice.

But even that's not the biggest problem, Jurgenson and others say. The biggest issue is that the Facebook study pretends that individuals choosing to limit their exposure to certain topics is completely separate from the Facebook algorithm doing the same. The study makes it seem as though the two are disconnected and can be compared on some kind of equal basis. But in reality, says Jurgenson, the latter amplifies the former, because personal choices are what the algorithmic filtering is ultimately based on:

"Individual users choosing news they agree with and Facebook's algorithm providing what those individuals already agree with is not either-or but additive. That people seek that which they agree with is a pretty well-established social-psychological trend… what's important is the finding that [the newsfeed] algorithm exacerbates and furthers this filter bubble."

Sociologist and social-media expert Zeynep Tufekci points out in a post on Medium that trying to separate and compare these two things is the worst "apples to oranges comparison I've seen recently," since the two things Facebook is pretending are unrelated have significant cumulative effects and are in fact tied directly to each other. In other words, Facebook's algorithmic filter magnifies the existing human tendency to avoid news or opinions that we don't agree with.

"Comparing the individual choice to algorithmic suppression is like asking about the amount of trans fatty acids in french fries, a newly-added ingredient to the menu, and being told that hamburgers, which have long been on the menu, also have trans-fatty acids."

Christian Sandvig, an associate professor at the University of Michigan, calls the Facebook research the "not our fault" study, since it is clearly designed to absolve the social network of blame for people not being exposed to contrary news and opinion. In addition to the framing of the research — which tries to claim that being exposed to differing opinions isn't necessarily a positive thing for society — the conclusion that user choice is the big problem just doesn't ring true, says Sandvig (who has written a paper about the biased nature of Facebook's algorithm).

"The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking…. there is no scenario in which user choices vs. the algorithm can be traded off, because they happen together. Users select from what the algorithm already filtered for them. It is a sequence."

Jurgenson also talks about this, and about how Facebook's attempt to argue that its algorithm is somehow unbiased or neutral, and that the big problem is what users decide to click on and share, is disingenuous. The whole reason why some (including Tufekci, who has written about this before) are so concerned about algorithmic filtering is that users' behavior is ultimately determined by that filtering. The two processes are symbiotic, so arguing that one is worse than the other makes no sense.

In other words, not only does the study fail to prove what it claims to prove, but the argument Facebook is making in defense of its algorithm isn't supported by the facts either, and can't actually be proven by the study as it currently exists. And as Eli Pariser points out in his piece on Medium about the research, the study also can't be reproduced (a crucial element of any scientific research), because the only people allowed access to the necessary data are researchers who work for Facebook.

Mathew Ingram is a senior writer at Fortune with a focus on media and technology. He is based in Toronto.
