Facebook's Suicide Prevention Tools: Invasive or Essential? The social network now allows all users to flag friends' posts as potentially suicidal and solicit Facebook's help or intervention.
Opinions expressed by Entrepreneur contributors are their own.
If you have ever scrolled through your News Feed and stopped on a troubling, borderline suicidal post from a friend, you may have been unsure how to help or wondered if reaching out would be appropriate. Facebook understands that people share these types of negative personal thoughts on the platform and has developed tools to help you help your friends.
Facebook now offers resources for users who perceive a friend's posts as suicidal, allowing them to flag a post for review by a team at the company. Users can open a drop-down menu on the post in question and specify their concerns to Facebook's global community operations team, whose members are trained to evaluate suicidal content. The team may then send the reporting user information about suicide prevention and advice for communicating with the friend. In some cases, Facebook may intervene by contacting local law enforcement where the friend resides, according to The New York Times.
Previously, suicide prevention assistance was limited to some English-speaking Facebook users, but now it is available to everyone.
Among the tools is a page with a form for reporting suicidal content to the team. The page also offers advice for assisting friends who may be considering self-injury or struggling with an eating disorder, as well as guidance for helping members of the military, LGBT individuals and law enforcement officers whose posts indicate they may be contemplating suicide. It also offers direct support to at-risk users seeking help for themselves. Each tool warns users that if a post explicitly states suicidal intent, they should act immediately by calling law enforcement or a suicide hotline, and it provides that contact information.
Facebook relies on humans on both sides of the process: users report, and team members review. None of the content is detected or evaluated by artificial intelligence or algorithms.
We asked Entrepreneur's Facebook and Twitter followers whether Facebook should allow users to solicit its employees' help in preventing suicide, or whether the company should refrain from intervening in people's personal lives. Many who responded embraced Facebook's efforts, while others argued that responsibility should rest with the users who spot the posts rather than with the company. Some considered the question in terms of Facebook's public image, and others asked how reporting someone would affect the way Facebook targets that user in the future. Read some of their comments below.
@Entrepreneur No, never. Way too Orwellian.
— Dennis Carrington (@denguy2) June 15, 2016
@Entrepreneur I'm not sure if they could prevent it? If your friends are posting suicidal posts YOU should be responsible yourself, not FB.
— Kate Chute (@katechute) June 15, 2016
@Entrepreneur "Facebook accurately predicts suicide, chooses inaction instead" doesn't go down too well for corporate social responsibility
— Blair Hudson (@blairhudson) June 15, 2016