
Facebook Releases Document to Explain Why It's Removing Posts

Meanwhile, the social network is also, for the first time, giving users the right to appeal its decisions on individual posts.

By Angela Moscaritolo

This story originally appeared on PCMag


Ever wonder how Facebook decides what -- and who -- to remove from its platform?

Wonder no more because the social network just published the lengthy "Community Standards" its reviewers use to determine what is and isn't allowed on Facebook.

The standards are broken down into six categories: Violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property and content-related requests. They outline how Facebook deals with everything from threats of violence to suicide, self-injury, child porn and sexual exploitation, nudity, bullying, harassment, hate speech and more.

The move to publish these once-internal guidelines comes after The Guardian last year obtained and posted snippets of the company's exhaustive and sometimes contradictory rules.

Facebook's VP of Global Policy Management Monika Bickert said the company is now going public with this information to "help people understand where we draw the line on nuanced issues" and as a way to solicit feedback on how it can improve its guidelines. Next month, the company plans to launch a series of public events in the U.S., U.K., Germany, France, India and Singapore called "Facebook Forums: Community Standards" to get people's feedback in person.

Facebook relies on artificial intelligence technology and reports from users to identify posts, photos and other content that may violate its standards. Upon receiving a report, a member of the company's 24/7 Community Operations team reviews the content in question to determine whether it should be taken down. Facebook currently employs more than 7,500 content reviewers.

Bickert acknowledged that Facebook's reviewers sometimes make the wrong decision.

"In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that's the case, we work to fill those gaps," she wrote. "More often than not, however, we make mistakes because our processes involve people, and people are fallible."

Meanwhile, Facebook is now, for the first time, giving users the right to appeal its decisions on individual posts. This way, if the company removes your post and you think it made a mistake in doing so, you can ask for a second opinion.

At this point, you will only be able to appeal posts removed for nudity/sexual activity, hate speech or graphic violence. If Facebook removes something you posted for one of those reasons, it will notify you of the action and give you the option to request an additional review. Within 24 hours of initiating an appeal, you should know whether Facebook plans to restore your content or keep it off the platform for good.

Angela Moscaritolo has been a PCMag reporter since January 2012. 

