Facebook Uses Artificial Intelligence To Spot Potentially Suicidal Users

It is clear that Facebook founder Mark Zuckerberg feels a social responsibility to make the world a better place and improve people's lives. That is evident in Facebook's philanthropy efforts, including the social network's goal of bringing Internet access to remote regions of the world, but there are other ways Facebook wants to leave a positive mark as well. For example, Facebook is introducing new tools today that provide support to people who might be suicidal.

Facebook already has some tools and services in place, but it is adding new ones, along with additional resources available through Facebook Live and Messenger for friends of those who might be feeling suicidal. One example involves watching someone's Facebook Live video. If the person says or does something that makes viewers think he or she might be having thoughts of suicide, they can reach out to the person directly and report the video to Facebook.


Reporting the video is not intended to get the person in trouble, of course. Instead, it alerts Facebook that someone might be at risk of hurting himself or herself, and Facebook can then provide a set of resources to that person while they are streaming live. Those resources include the ability to reach out to a friend, contact a helpline, and see tips, among other tools.

The alternative would be to cut off the stream so that viewers do not see someone harm themselves. However, Facebook has learned that this is not the best approach.

"Some might say we should cut off the live stream, but what we've learned is cutting off the stream too early could remove the opportunity for that person to receive help," Facebook Researcher Jennifer Guadagno told TechCrunch.

Facebook isn't going about this alone. It worked with organizations such as the Crisis Text Line, the National Eating Disorders Association, and the National Suicide Prevention Lifeline to make these new tools and resources available to users. And of course Facebook tapped into its technical prowess: the social network is using artificial intelligence and pattern recognition to determine whether a post may reflect suicidal thoughts.
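Facebook has not published details of how that detection works. Purely as an illustration of the general pattern-recognition idea, the sketch below trains a simple text classifier on a handful of hypothetical labeled posts; the training examples, labels, and model choice are all assumptions and do not reflect Facebook's actual implementation.

```python
# Minimal sketch of classifying posts that may warrant a human review.
# This is NOT Facebook's system; it only illustrates the general idea of
# training a pattern-recognition model on labeled example posts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = route to reviewers, 0 = no action.
posts = [
    "I feel like I can't go on anymore",
    "Nobody would miss me if I were gone",
    "Had a great time at the beach today",
    "Excited for the concert this weekend",
]
labels = [1, 1, 0, 0]

# TF-IDF text features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Score a new post; a high score would surface support resources or route
# the post to a trained reviewer rather than trigger any automatic action.
new_post = "I don't see the point of anything lately"
score = model.predict_proba([new_post])[0][1]
print(f"Review score: {score:.2f}")
```

In practice, a system like this would rely on far larger datasets and, as Facebook describes, would surface support resources or hand flagged content to trained reviewers rather than act on its own.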