Facebook launched live-video streaming widely last year as a way to let people share personal moments and provide a media platform for major events like the Oscars. But the social network has had to grapple with the horrifying unintended consequence that some users have broadcast suicides and killings via Facebook Live.
The company on Wednesday said it is adding suicide-prevention tools directly into Facebook Live that will give users watching a live video the option to reach out to the person directly as well as report the video to Facebook. In addition, Facebook will provide resources to the person reporting the live video including the ability to contact a help line immediately.
Facebook also is running a test of live chat support from crisis-support organizations through Messenger, with participating organizations including Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline. It also says it's streamlining suicide reporting using artificial intelligence.
The steps come after at least three Facebook users live-streamed suicides since the start of 2017, including a 14-year-old Miami girl who hanged herself after a two-hour Facebook Live session in January.
As part of its suicide-prevention measures, Facebook also is launching a video campaign with partner organizations to raise awareness about ways to help people in crisis.
Facebook says it has provided suicide-prevention tools for more than a decade, working with organizations such as Save.org, the National Suicide Prevention Lifeline, Forefront and Crisis Text Line.
“We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide,” the company said in a blog post. “We work to address posts expressing thoughts of suicide as quickly and accurately as possible.”
Facebook, based on feedback from mental-health experts, is testing a "streamlined" reporting process that applies pattern recognition to posts previously reported for suicide. The AI technology will make the option to report a post for "suicide or self injury" more prominent on posts by users identified as potentially at risk. In addition, Facebook has begun a limited test in the U.S. that uses pattern recognition to flag posts as very likely to include thoughts of suicide; those will be reviewed by Facebook's Community Operations team so it can proactively provide help.