
Facebook's Content Moderation Rules Are Both Careful and Shocking

The rules for dealing with violent and disturbing images often require moderators to ask whether they are 'newsworthy' or 'raise awareness.'

By Nina Zipkin


There is no doubt that it takes a huge effort to moderate all the content that gets uploaded to Facebook. But over the past few months, the social giant has shown signs of strain.

Back in August, shortly after the company fired a team of human editors overseeing the Trending section of the site in favor of an algorithm, a false news story found its way to the top of the queue.

In February, CEO Mark Zuckerberg published a wide-ranging open letter on his Facebook page about the direction he hopes to take the company, touching on the need for more vigilance in the face of "fake news" and also a stronger infrastructure to handle the raft of content that is posted by users on a daily basis.


"There are billions of posts, comments and messages across our services each day, and since it's impossible to review all of them, we review content once it is reported to us," Zuckerberg wrote. "There have been terribly tragic events -- like suicides, some live streamed -- that perhaps could have been prevented if someone had realized what was happening and reported them sooner. There are cases of bullying and harassment every day, that our team must be alerted to before we can help out. These stories show we must find a way to do more."

This spring, after a murder in Cleveland was livestreamed on the platform, Zuckerberg announced that the company would hire 3,000 people over the course of the year to improve that review process.

But now, an investigation by the Guardian has revealed some of the standards Facebook uses to moderate content, and they are perhaps more confusing than you might expect.


Videos of violent deaths or suicides are designated as disturbing content, but according to the Guardian's findings, Facebook does not necessarily take them down because they can build awareness of mental illness.

Specifically in cases of suicide, documents seen by the Guardian explain that the current company directive is "to remove them once there's no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up."

When it comes to violent language, a call to action to harm the president would be taken down because he is a head of state, but directions on how to snap a woman's neck would be allowed to remain on the site because they are not "regarded as credible threats."


Images and videos depicting animal abuse and graphic violence are also designated as disturbing. They are allowed if they are being used to educate and raise awareness, but not if there is an element of "sadism and celebration." The same rule applies to images of child abuse.

According to the Guardian, moderators often have just seconds to decide how to characterize a piece of content or whether to remove it.

It's clear that Zuckerberg and his team have a daunting task in front of them, so Facebook's rules will need to constantly evolve to meet the challenge.


Nina Zipkin is a staff writer at Entrepreneur.com. She frequently covers leadership, media, tech, startups, culture and workplace trends.

