
It's Election Day: Is Facebook Influencing Your Voting Decision?

Because Facebook's newsfeed algorithm gives you the news you 'want' to see, it's an echo chamber of your political biases.



The 2016 presidential election is today. Did you vote? And, more important, do you know why you chose the candidate you did?

Related: Facebook Employees to Undergo Political Bias Training

Maybe you've voted the same party slate all your life, or maybe you made up your mind months ago. But are you absolutely certain your political leanings haven't been influenced by an external force? An external force like . . . Facebook?

Facebook, home to more than a billion active users across the globe, has more control over what you see, how you interact and even how you think and feel than you might imagine. Is it possible that the social media giant, now worth more than $350 billion, is, intentionally or unintentionally, influencing your vote?

The power of an algorithm

According to a recent report from the Pew Research Center, 62 percent of Americans today get their news from social media. Millions of users log in to Facebook daily and find news stories, articles and posts from their friends and the pages they follow organized into a newsfeed -- for many, their primary source of information. Hypothetically, if those stories are slanted one way or the other, they could eventually sway your decision in favor of one candidate.

Sure, it's true that the number of voters who might truly be persuaded this way is smaller than most people realize -- around 800,000 even in major swing states. But that's still enough to have a significant impact on the results of this or any election.

If Facebook, say, automatically filtered out posts about one candidate, or artificially boosted slanted content that leaned one way or the other, the platform would have the power to influence, if not outright decide, the results of a particular election. But would Facebook do this? Could it?

The 'echo chamber' effect

On the surface, Facebook's newsfeed algorithm functions intuitively -- some would say brilliantly -- to provide you with the news you actually want to see. It gauges your likes, interests and history of interaction to hand-pick the stories that will appear atop your newsfeed.
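To make that idea concrete, here is a deliberately simplified sketch of engagement-based ranking in Python. This is not Facebook's actual algorithm -- the Story class, the topic labels and the interaction counts are all invented for illustration -- but it captures the basic mechanic the article describes: stories resembling what you've interacted with before rise to the top.

```python
# Toy sketch of engagement-based feed ranking -- NOT Facebook's real
# algorithm. Assumption: each story is scored by how often the user has
# interacted with its topic in the past; highest scores surface first.

from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    topic: str  # e.g., "candidate_a" or "candidate_b" (hypothetical labels)

def rank_feed(stories, interaction_counts):
    """Sort stories so topics the user engages with most appear on top.

    interaction_counts: hypothetical map of topic -> number of past
    likes/clicks/comments. All names here are illustrative.
    """
    return sorted(
        stories,
        key=lambda s: interaction_counts.get(s.topic, 0),
        reverse=True,
    )

feed = rank_feed(
    [Story("Candidate A surges in polls", "candidate_a"),
     Story("Candidate B issues rebuttal", "candidate_b")],
    {"candidate_a": 42, "candidate_b": 3},
)
print([s.headline for s in feed])  # stories about candidate_a come first
```

Ranking purely by past engagement is exactly what makes the feed feel "brilliant" -- and it is also the seed of the problem described next.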

However, there's a downside to this approach, one that leads to what some experts call an echo chamber. Partially in response to accusations of political bias (more on that in the next section), data scientists at Facebook analyzed anonymized data from 10.1 million user accounts to determine whether any political bias was at play.

The data scientists found that users did tend to encounter stories that reinforced their own beliefs more than stories that opposed or challenged them. However, that occurred largely because those users had (and have) a tendency to "like" and interact with exactly those kinds of articles.

Related: Facebook's Zuckerberg to Meet Conservatives on Political Bias Scandal

In effect, users create their own echo chambers and end up being fed content biased toward whatever direction they were leaning in the first place -- which can only solidify and strengthen the beliefs they already hold. A toy simulation below makes the feedback loop visible.
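The probabilities and step sizes in this sketch are assumptions, not measurements from Facebook's research; the point is only that when engagement drives what you're shown next, a small initial lean compounds over time.

```python
# Minimal simulation of the echo-chamber feedback loop described above.
# Assumptions (not from the article): the user clicks a story with
# probability matching their lean, and every click nudges the feed to
# show a bit more of that topic. All numbers are illustrative.

import random

lean = 0.6           # user starts slightly favoring topic A (scale 0..1)
share_of_a = 0.5     # feed starts balanced between topics A and B

for _ in range(200):
    topic_is_a = random.random() < share_of_a      # feed serves one story
    click_prob = lean if topic_is_a else 1 - lean  # engagement matches lean
    if random.random() < click_prob:               # user interacts...
        # ...so the algorithm shows more of that topic next time
        share_of_a += 0.02 if topic_is_a else -0.02
        share_of_a = min(max(share_of_a, 0.0), 1.0)

# On most runs the share drifts well above the balanced 0.5 it started
# at -- a self-made echo chamber, built from the user's own clicks.
print(f"share of topic A after 200 rounds: {share_of_a:.2f}")
```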

Historical precedent

Facebook has been known to influence user behavior before. In a May 2016 report from Gizmodo, former Facebook news curators said they routinely suppressed conservative news stories, as instructed by their managers. Facebook has repeatedly denied the accusation; the platform says it strives for neutrality on all issues, and has even adjusted its newsfeed algorithm in response to such claims, in an effort to demonstrate and further improve that neutrality.

We do know for a fact that Facebook intentionally manipulated the emotions of nearly 700,000 users back in 2014, in an effort to determine what effects its newsfeed could have. In the study, Facebook artificially filled newsfeeds with negative or positive posts, then measured how the emotional tone of those users' own posts changed in response.

The result was clear: The mere presence of more positive or negative posts in a newsfeed had a significant impact on the tone of the status updates those users went on to post. This suggests not only that it is possible for Facebook to manipulate user viewpoints, but that the platform has had a hand in doing so in the past.
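For a sense of how a study like this might quantify "tone," here is a minimal word-counting scorer in Python. The 2014 study reportedly relied on a standard word-count tool; the word lists and function below are invented for illustration, not taken from that research.

```python
# Hedged sketch: score a post's emotional tone by counting positive and
# negative words. The word lists are made up for this example.

POSITIVE = {"great", "happy", "love", "win"}
NEGATIVE = {"sad", "angry", "hate", "lose"}

def tone_score(post: str) -> int:
    """Return (# positive words) - (# negative words) in a post."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(tone_score("What a great day I love this"))    # prints  2
print(tone_score("So sad and angry about the loss")) # prints -2
```

Averaged over millions of posts, even a crude score like this can reveal whether a feed full of negative content makes users' own posts trend negative.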

Zuckerberg and personal views

Earlier this year, in an internal company poll, Facebook employees asked Mark Zuckerberg directly: "What responsibility does Facebook have to help prevent President Trump in 2017?" It's not known how, or whether, Zuckerberg responded, but the company reiterated its stance of neutrality when the media questioned it about the poll. Hypothetically, Facebook could deal a massive blow to the Trump campaign by hiding or suppressing Trump-related stories. So is Facebook's word enough to assure the world that it wouldn't step in?

Voter turnout

It's also worth noting that Facebook has had a positive -- and, more important, neutral -- impact on elections in the past. According to research published in the Journal of Communication, voters who tagged friends in voting reminders on the platform increased turnout by 15 to 24 percent. Links encouraging voter registration appear capable of driving hundreds of thousands of signups as well. In this instance, Facebook influences elections simply by encouraging more people to vote.

Other platforms

It's also important to keep in mind that Facebook isn't the only platform or company capable of generating this level of influence. Other social media platforms may hold similar, if lesser, abilities; Instagram, also owned by Facebook, is home to more than 500 million active users. Google sees even more daily traffic than Facebook, and could hold a similar position of influence by suppressing stories from search results.

The truth is, no matter where you look, there are platforms and mediums that have the power to influence your opinions.

Should you be suspicious?

There are a few key takeaways here. The first is that Facebook has the power to manipulate your voting decision by changing its algorithm; however, it has vocally insisted that it remains as neutral as possible, and has even updated its algorithms to maintain this stance.

The second is that Facebook's echo chamber exists because of your own actions -- and that it will accordingly strengthen and reinforce whatever beliefs you already hold.

It's hard to say to what extent Facebook is or should be responsible for influencing voter decisions, but if nothing else, the amount of public scrutiny over the social network's influence should keep the platform as neutral as possible throughout election season.

Related: Facebook Launches Investigation Into Report of Political Bias

Despite that fact, it's still your responsibility as a voter to escape the echo chamber and to get as much information as possible -- from a number of different sources -- to inform your decision.

