
Facebook Updates Its Suicide Prevention Tools

The social-media platform has made it easier for users to reach out to their connections whose activity on Facebook indicates they are at risk for self-harm or suicide.

By Laura Entis

Opinions expressed by Entrepreneur contributors are their own.

Image credit: Reuters | Robert Galbraith

Yesterday, Facebook announced that it is launching a new tool that will make it easier for users to intervene if they are worried about a friend's risk of suicide.

While the social network has allowed users to report potentially suicidal posts since 2011 (by submitting a screenshot or a link to the concerning post), the updated feature lets users flag troubling posts directly on the platform.

After flagging a post, users will receive a message with links to three available options: directly message the potentially suicidal person, reach out to another Facebook friend for support or connect with a professional at a suicide hotline.

Image credit: Facebook Safety

According to the blog post announcing the changes, trained teams at Facebook will review each flagged post and, if they decide it's necessary, send the person who posted it a note encouraging them to speak with a mental-health expert at the National Suicide Prevention Lifeline.

Related: Facebook Basically Shrugs Off User Outrage Over 'Emotional' Experiment

"We have teams working around the world, 24/7, who review any report that comes in. They prioritize the most serious reports, like self-injury, and send help and resources to those in distress," the Facebook post reads.

This is certainly an admirable update; one could even argue that it is a necessary one. As Facebook continues to consume more of our time and account for an increasing percentage of our interactions with friends, family and acquaintances, it's inevitable that expressions of real pain and distress will be shared over the platform. For better or worse, Facebook is a big part of how we communicate. By making it easier for users to reach out to connections on the platform who appear to be at risk for self-harm or suicide, Facebook will undoubtedly save lives.

But that doesn't make the update an unequivocal success.

The feature raises some prickly ethical questions for the company, namely how Facebook's teams will go about determining which reports are valid and require a response and which do not. Even with the most extensively trained experts on hand, oversights will be made.

Additionally, with the announcement of this new feature, Facebook risks wading back into some very uncomfortable territory. This summer, as you probably remember, the company faced near-universal backlash after the revelation that it had manipulated content seen by more than 600,000 people to find out whether the changes would affect their emotional state. And this kind of tweaking is by no means an isolated incident.

Related: Facebook Updates Its Privacy Policy, But Does That Mean Anything?

As a recent episode of NPR's Radiolab revealed, the company constantly tinkers with the way it words calls to action on its platform. For example, in an attempt to sort through the onslaught of photo-removal requests it receives every day, Facebook asked users to select, from a list of responses ('embarrassing,' 'makes me sad' and 'bad photo of me,' to name a few), the reason they wanted a particular photo removed. Typically, only 50 percent of respondents selected one of the available options.

Interestingly, all it took was the addition of a single word -- "it's" -- for the response rate to jump 28 percentage points, to 78 percent. Instead of the option reading "embarrassing," it was tweaked to "it's embarrassing."

Arturo Bejar, the director of engineering for Facebook's Protect and Care team, told Radiolab that he and his team make these subtle adjustments, which often result in significant changes in response rates or follow-up actions, all the time.

This brings us back to Facebook's updated suicide prevention feature, which reaches out to at-risk users in the form of a carefully worded message:

Image credit: Facebook Safety

Will Facebook experiment with tweaking the wording in order to make it more effective?

Rolling out this new initiative has the potential to save lives, but when the stakes are this high, the prospect of any kind of experimentation -- even when the intentions are so pure -- still feels a little chilling.

Related: OkCupid Founder: 'If You Use the Internet, You're the Subject of Hundreds of Experiments'

Laura Entis is a reporter for Fortune.com's Venture section.
