House Intelligence Committee Grills Facebook Over Algorithms
Facebook didn't have many answers.
By Karissa Bell
This story originally appeared on Engadget
The House Select Committee on Intelligence met with some of the top policy officials from Facebook, Twitter and YouTube Thursday. The topic of the virtual hearing was election security and foreign interference ahead of the 2020 election. But though lawmakers pressed all the company officials in attendance on their efforts to combat election interference, Facebook, predictably, drew much of the scrutiny.
One of the most contentious issues was not Facebook's efforts to combat security threats, but its algorithms. A recent Wall Street Journal report found that Facebook executives resisted efforts to make the social network less divisive, despite internal research showing its algorithms drive polarization. Following that report, and amid a renewed debate around content moderation, members of Congress pushed Facebook to explain exactly how its algorithms work. The company's head of cybersecurity policy, Nathaniel Gleicher, didn't have many answers.
"I am concerned because of an issue that I raised back in 2017, and repeatedly since," Rep. Adam Schiff said in his opening remarks. "I am concerned about whether social media platforms like YouTube, Facebook, Instagram, and others, wittingly or otherwise, optimize for extreme content. These technologies are designed to engage users and keep them coming back, which is pushing us further apart and isolating Americans into information silos."
Later, Rep. Jim Himes asked Gleicher what Facebook is doing to combat polarization given potential security implications, noting that the more polarized Facebook is, the easier it is for other countries to exploit those divisions for their own gain.
"If every single American household is full of toxic, explosive gas, as I think it is today, all it takes is a match from Russia or from Iran or from North Korea or from China to set off a conflagration," Himes said. "I was very troubled by the apparent unwillingness of Facebook to in a very public and specific way come to terms with the notion that its algorithm — which is really what worries me, in terms of the security of this country — that its algorithm promotes polarization, division and anger."
Gleicher responded that Facebook has found its users don't want to see clickbait and other types of "divisive" posts. "They don't want to see clickbait, they don't want to see the type of divisive content you're describing," Gleicher said, pointing to Facebook's efforts to weed out clickbait and "refocus" its News Feed around posts from friends and family rather than pages.
The explanation didn't go over well with Himes. "You're just not resonating with me," he said.
Gleicher later attempted to clarify. "Certainly people are drawn to clickbait, they're drawn to explosive content," he said. "People don't want a platform or an experience that is just clickbait. They will click on it if they see it, but they don't want to prioritize it. They don't want their time to be drawn into that."
Schiff and Rep. Eric Swalwell also pressed Facebook to better explain how its algorithms prioritize and rank different types of content, and the impact those decisions have on users. Gleicher agreed that "transparency is important," but didn't provide specifics.
"The algorithms we're talking about, the decision-making process we're talking about, is incredibly complex," Gleicher said. "Showing that information in a way that is consumable and meaningful is extremely important because it's very easy to jump to conclusions."
The renewed interest in Facebook's algorithms comes as the company faces more scrutiny over its content moderation policies. A day earlier, the Justice Department proposed a series of changes that would scale back legal protections provided to Facebook and other platforms under Section 230 of the Communications Decency Act.