
Critics Slam Facebook's 'Filter Bubble' Study

Facebook says a study published in Science magazine proves that personal choice matters more than algorithms in determining what users see, but critics say it doesn't prove that at all.

By Mathew Ingram

This story originally appeared in Fortune Magazine


Facebook has been criticized for some time for its role in creating a "filter bubble" among its users, a term that comes from a book of the same name by Eli Pariser (who went on to help create viral-content site Upworthy). Critics say the social network does this by shaping our perception of the world with its algorithmically filtered newsfeed. Facebook, however, has come out with a study that it says proves this isn't true—if there is a filter bubble, the company says, it exists because users choose to see certain things, not because of Facebook's algorithmic filters.

But is this really what the study proves? There's considerable debate about that among social scientists knowledgeable in the field, who note that the conclusions Facebook wants us to draw—by saying, for example, that the study "establishes that … individual choices matter more than algorithms"—aren't necessarily supported by the evidence actually provided in the paper.

For one thing, these researchers point out that the study only looked at a tiny fraction of the total Facebook user population: less than 4% of the overall user base, in fact (a number which doesn't appear in the study itself but is only mentioned in an appendix). That's because the study group was selected only from those users who specifically mention their political affiliation. Needless to say, extrapolating from that to the entire 1.2 billion-user Facebook universe is a huge leap.

Sociologist Nathan Jurgenson points out that while the study claims it conclusively proves individual choices have more effect on what users see than algorithms, it doesn't actually back this up. In fact, while that appears to be the case for conservative users, in the case of users who identified themselves as liberals, Facebook's own data shows that exposure to different ideological views is reduced more by the algorithm (8%) than it is by a user's personal choice.

But even that's not the biggest problem, Jurgenson and others say. The biggest issue is that the Facebook study pretends that individuals choosing to limit their exposure to different topics is a completely separate thing from the Facebook algorithm doing so. The study makes it seem like the two are disconnected and can be compared to each other on some kind of equal basis. But in reality, says Jurgenson, the latter exaggerates the former, because personal choices are what the algorithmic filtering is ultimately based on:

"Individual users choosing news they agree with and Facebook's algorithm providing what those individuals already agree with is not either-or but additive. That people seek that which they agree with is a pretty well-established social-psychological trend… what's important is the finding that [the newsfeed] algorithm exacerbates and furthers this filter bubble."

Sociologist and social-media expert Zeynep Tufekci points out in a post on Medium that trying to separate and compare these two things represents the worst "apples to oranges comparison I've seen recently," since the two things that Facebook is pretending are unrelated have significant cumulative effects, and in fact are tied directly to each other. In other words, Facebook's algorithmic filter magnifies the already human tendency to avoid news or opinions that we don't agree with.

"Comparing the individual choice to algorithmic suppression is like asking about the amount of trans fatty acids in french fries, a newly-added ingredient to the menu, and being told that hamburgers, which have long been on the menu, also have trans-fatty acids."

Christian Sandvig, an associate professor at the University of Michigan, calls the Facebook research the "not our fault" study, since it is clearly designed to absolve the social network of blame for people not being exposed to contrary news and opinion. In addition to the framing of the research — which tries to claim that being exposed to differing opinions isn't necessarily a positive thing for society — the conclusion that user choice is the big problem just doesn't ring true, says Sandvig (who has written a paper about the biased nature of Facebook's algorithm).

"The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking…. there is no scenario in which user choices vs. the algorithm can be traded off, because they happen together. Users select from what the algorithm already filtered for them. It is a sequence."

Jurgenson also talks about this, and about how Facebook's attempt to argue that its algorithm is somehow unbiased or neutral — and that the big problem is what users decide to click on and share — is disingenuous. The whole reason why some (including Tufekci, who has written about this before) are so concerned about algorithmic filtering is that users' behavior is ultimately determined by that filtering. The two processes are symbiotic, so arguing that one is worse than the other makes no sense.

In other words, not only does the study not actually prove what it claims to prove, but the argument that the site is making in defense of its algorithm also isn't supported by the facts—and in fact, can't actually be proven by the study as it currently exists. And as Eli Pariser points out in his piece on Medium about the research, the study also can't be reproduced (a crucial element of any scientific research) because the only people who are allowed access to the necessary data are researchers who work for Facebook.

Mathew Ingram is a senior writer at Fortune with a focus on media and technology. He is based in Toronto.
